Remove an accidental semicolon after a function definition; the
semicolon was ignored by the compiler but generated a warning.
Rename a variable inside a code block that was generating a warning
about shadowing a local variable.
The system gestures on the screen edges took precedence over the tap
gestures again after screen rotations.
Call setNeedsUpdateOfScreenEdgesDeferringSystemGestures on orientation
changes to notify the system that it needs to re-read the view
controller's preferredScreenEdgesDeferringSystemGestures property.
Defer system gestures on all edges since the top edge changes when
rotating the device.
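A minimal sketch of the fix in the view controller, assuming the
preferredScreenEdgesDeferringSystemGestures override described in the
next change:

    - (void)viewWillTransitionToSize:(CGSize)size
           withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
        [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
        // Ask the system to read preferredScreenEdgesDeferringSystemGestures
        // again after the rotation.
        [self setNeedsUpdateOfScreenEdgesDeferringSystemGestures];
    }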
The tap gestures were cancelled due to system gestures taking over
when tapping on the screen edges. This made it hard to click on
verbs in games or buttons in the launcher when in "Direct mouse mode".
There is an option to defer the system gestures on selected edges.
This makes tap gestures recognised on screen edges as well.
It also doesn't interfere with the gesture to close the application
on devices lacking a home button.
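A minimal sketch of the deferral, assuming it is done in the view
controller (which edges to defer is a setting; all edges shown here):

    // Defer the system edge gestures so the first swipe or tap on an
    // edge reaches the application instead of the system.
    - (UIRectEdge)preferredScreenEdgesDeferringSystemGestures {
        return UIRectEdgeAll;
    }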
The change in commit 1b3c783b9e assumed
that the orientation had already been updated when the system called
viewWillTransitionToSize. This seems to be true on iOS 16, while on
iOS 15 the orientation is updated a bit later.
On iOS 16, update the current orientation when viewWillTransitionToSize
is called so that it is up to date when adjustViewFrameForSafeArea is
called. This makes sure that the screen size is updated correctly when
forcing the orientation based on the backend user setting.
On iOS 15 (and below), set the current orientation when the transition
animation finishes, once the interface orientation has actually been
updated, so that the virtual controller is connected/disconnected
properly based on the orientation.
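A minimal sketch of the version split, assuming a hypothetical
updateCurrentOrientation helper that reads and stores the interface
orientation:

    - (void)viewWillTransitionToSize:(CGSize)size
           withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
        [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
        if (@available(iOS 16.0, *)) {
            // iOS 16: the interface orientation is already updated here.
            [self updateCurrentOrientation];
        }
        [coordinator animateAlongsideTransition:nil
                                     completion:^(id<UIViewControllerTransitionCoordinatorContext> context) {
            if (@available(iOS 16.0, *)) {
                // Already handled above.
            } else {
                // iOS 15 and below: the interface orientation is updated
                // later, so read it once the transition has finished.
                [self updateCurrentOrientation];
            }
        }];
    }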
The overridden viewWillTransitionToSize instance method updated the
current orientation when the animation to the new orientation completed.
However, that made the handling of safe areas racy on devices that have
them, e.g. iPhones with the sensor bar on top: the correct orientation
was set only after the adjustViewFrameForSafeArea function had been
called.
Set the new orientation before the animation completes.
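One way to do this, sketched here with the same hypothetical
updateCurrentOrientation helper, is to run the update alongside the
transition rather than in its completion block:

    [coordinator animateAlongsideTransition:^(id<UIViewControllerTransitionCoordinatorContext> context) {
        // Runs before the transition completes, so the safe-area handling
        // sees the new orientation.
        [self updateCurrentOrientation];
    } completion:nil];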
Implement a function to trigger the device to set the screen
orientation according to the configuration in the backend
options tab.
The implementation to trigger the setting is different for
devices running iOS versions prior to 16 and for iOS 16+.
The screen orientation update is triggered when the user has
applied the setting in the backend options tab, when the GUI
launcher is loaded and when starting a game.
The orientation is only changed when going from any portrait
mode to any landscape mode or vice versa.
Based on the setting, a UIInterfaceOrientationMask property
is set to hold the allowed interface orientations. If the
setting is "Portrait" the property is set to
UIInterfaceOrientationMaskPortrait, allowing the orientation to
change only to normal or upside-down portrait mode. If the
setting is "Landscape" the property is set to
UIInterfaceOrientationMaskLandscape, allowing the device to
rotate the screen only to left or right landscape mode. If set
to "Auto", all orientations are allowed.
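A minimal sketch of the mapping and of asking the system to apply it,
assuming the method lives in the view controller; the ScreenOrientation
type, its value names and the _supportedOrientations ivar are
assumptions:

    - (void)setSupportedScreenOrientation:(ScreenOrientation)orientation {
        // Map the backend setting to the allowed interface orientations.
        switch (orientation) {
        case kScreenOrientationPortrait:
            _supportedOrientations = UIInterfaceOrientationMaskPortrait;
            break;
        case kScreenOrientationLandscape:
            _supportedOrientations = UIInterfaceOrientationMaskLandscape;
            break;
        default:
            _supportedOrientations = UIInterfaceOrientationMaskAll;
            break;
        }
        if (@available(iOS 16.0, *)) {
            // iOS 16+: ask the system to re-read supportedInterfaceOrientations.
            [self setNeedsUpdateOfSupportedInterfaceOrientations];
        } else {
            // Earlier versions: request a rotation to a supported orientation.
            [UIViewController attemptRotationToDeviceOrientation];
        }
    }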
When the device orientation changes, the system calls
supportedInterfaceOrientations on the root view controller or the
topmost modal view controller that fills the window. If the view
controller supports the new orientation, the system rotates the window
and the view controller. The system only calls this method if the view
controller's shouldAutorotate method returns YES, which is the default.
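A minimal sketch of exposing the stored mask to the system (the ivar
name is an assumption):

    - (UIInterfaceOrientationMask)supportedInterfaceOrientations {
        return _supportedOrientations;
    }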
The event handler is refactored to receive the internal screen
orientation value instead of the value of the UIKit enum
UIInterfaceOrientation. The conversion from UIInterfaceOrientation
to ScreenOrientation is instead done in the iPhoneView class when
sending the new orientation value to the event handler.
This also aligns the conversion with the screen orientation
settings in the setSupportedScreenOrientation function.
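A minimal sketch of the conversion in iPhoneView, where the
ScreenOrientation value names are assumptions:

    ScreenOrientation screenOrientation;
    switch (interfaceOrientation) {
    case UIInterfaceOrientationPortrait:
    case UIInterfaceOrientationPortraitUpsideDown:
        screenOrientation = kScreenOrientationPortrait;
        break;
    case UIInterfaceOrientationLandscapeLeft:
    case UIInterfaceOrientationLandscapeRight:
        screenOrientation = kScreenOrientationLandscape;
        break;
    default:
        screenOrientation = kScreenOrientationAuto;
        break;
    }
    // The event handler now receives screenOrientation rather than the
    // raw UIInterfaceOrientation value.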
The main change is to use the interface orientation and not the device
orientation. These may differ, for example when the orientation is locked.
This also changes the way orientation changes are detected to use the
documented method. However, this means dropping support for iOS 7, as
this method is only available since iOS 8 and the alternative methods
available in iOS 7 have been deprecated in iOS 13.
Another change is to properly detect the interface orientation instead
of inferring it from the view bounds, which was incorrect on some
devices.
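A minimal sketch of reading the interface orientation rather than the
device orientation; this is one common way to do it and not necessarily
the exact code of this change:

    UIInterfaceOrientation orientation;
    if (@available(iOS 13.0, *)) {
        orientation = self.view.window.windowScene.interfaceOrientation;
    } else {
        // Deprecated in iOS 13 but available on earlier versions.
        orientation = [UIApplication sharedApplication].statusBarOrientation;
    }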
Setting this property to true indicates the view controller’s preference
to lock the pointer, although the system may not honor the request.
For the system to consider locking the pointer:
- The scene must be full screen, not in Split View or Slide Over, with
  no other apps in Slide Over.
- The scene must be in the UISceneActivationStateForegroundActive state.
The ScummVM iOS7 client fulfills the above, so the pointer is locked.
Locking the pointer hides the OS cursor (the dot), but that is wanted
since the ScummVM engine draws its own pointer.
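A minimal sketch of expressing the preference in the view controller
(the property is available from iOS 14):

    - (BOOL)prefersPointerLocked {
        return YES;
    }

If the preference ever changes at runtime, the system can be asked to
re-evaluate it with setNeedsUpdateOfPrefersPointerLocked.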