- Mouse left click and long click gestures are shared between
the Pencil and regular Touch inputs.
- Mouse right click and long click have Pencil-specific gestures.
Add a convenience function to check if the application is running
on macOS. Use this method to change the default visibility of the
on-screen control buttons and the function bar.
Neither of these is needed when running on macOS since both mouse
and keyboard are available. Also, the mouse pointer in ScummVM
cannot access the on-screen control buttons.
However, this requires function keys and key combinations such as
Alt+X to work, to be able to save and quit games.
These will be added in future commits.
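A minimal sketch of such a check, assuming it is based on
NSProcessInfo (the helper name is illustrative):

    static bool isRunningOnMac() {
        if (@available(iOS 14.0, *)) {
            // YES when an unmodified iOS binary runs on an Apple silicon Mac
            return [[NSProcessInfo processInfo] isiOSAppOnMac];
        }
        return false;
    }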
Add two UI buttons placed in the top right corner over the main
view. The left button controls the current touch mode. When pressed,
the button changes its image to represent the new touch mode, using
the mouse and touchpad assets added in the previous commit.
The right button triggers the call to the main menu.
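A sketch of how one of the overlay buttons could be set up; the
image name and selector are assumptions, not the actual backend
code:

    UIButton *menuButton = [UIButton buttonWithType:UIButtonTypeCustom];
    [menuButton setImage:[UIImage imageNamed:@"menu"]
                forState:UIControlStateNormal];
    menuButton.frame = CGRectMake(self.bounds.size.width - 48, 8, 40, 40);
    // Keep the button anchored to the top right corner on resize
    menuButton.autoresizingMask = UIViewAutoresizingFlexibleLeftMargin |
                                  UIViewAutoresizingFlexibleBottomMargin;
    [menuButton addTarget:self action:@selector(openMainMenu)
         forControlEvents:UIControlEventTouchUpInside];
    [self addSubview:menuButton];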
Instead of registering input based on whether hardware is connected
or not, register input based on backend capabilities.
Mouse support is provided by default through touch events on the
screen (iOS) or touches on the controller (Apple TV), and through
connected mouse hardware. Gamepad controllers are supported on
iOS 14 and later.
Register mouse and gamepad input based on the above capabilities
to be able to map actions to buttons on these input devices.
Keyboard support is to be added but not in this commit.
Remove the "isConnected" methods for each input and change the same
method for game controllers to a "isSupported" function to deal
with the iOS version support.
Remove the sending of the EVENT_INPUT_CHANGED event due to the
above reasons. The overide of the isConnected property function is
also removed due to this reason. It caused problems that key
mappings were reset on connections/disconnections.
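The capability check could look something like this, following the
iOS 14 bound mentioned above (the class context is illustrative):

    // Gamepads are mappable from iOS 14, so report support based on
    // the OS version instead of on connected hardware:
    + (BOOL)isSupported {
        if (@available(iOS 14.0, *))
            return YES;
        return NO;
    }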
Implement a function to trigger the device to set the screen
orientation according to the configuration in the backend
options tab.
The implementation to trigger the setting is different for
devices running iOS versions prior to 16 and for version 16+.
The screen orientation update is triggered when the user has
applied the setting in the backend options tab, when the GUI
launcher is loaded and when starting a game.
The orientation is only changed when going from any portrait
mode to any landscape mode or vice versa.
Based on the setting, a UIInterfaceOrientationMask property
is set to hold the allowed interface orientations. If the
setting is "Portrait" the property is set to
UIInterfaceOrientationMaskPortrait, allowing the orientation
to change only to normal or upside-down portrait mode.
If the setting is "Landscape" the property is set to
UIInterfaceOrientationMaskLandscape, allowing the device to
rotate the screen only to left or right landscape mode. If set
to "Auto" all orientations are allowed.
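A sketch of the mapping (the setting strings are assumptions based
on the description above):

    UIInterfaceOrientationMask mask;
    if ([setting isEqualToString:@"Portrait"])
        mask = UIInterfaceOrientationMaskPortrait;
    else if ([setting isEqualToString:@"Landscape"])
        mask = UIInterfaceOrientationMaskLandscape;
    else // "Auto"
        mask = UIInterfaceOrientationMaskAll;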
When the device orientation changes, the system queries the
supportedInterfaceOrientations property of the root view
controller or the topmost modal view controller that
fills the window. If the view controller supports the new
orientation, the system rotates the window and the view
controller. The system only calls this method if the view
controller's shouldAutorotate method returns YES, which is the
default value.
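A sketch of the version-dependent trigger in the view controller;
_allowedOrientations is an assumed ivar holding the mask described
above:

    - (UIInterfaceOrientationMask)supportedInterfaceOrientations {
        return _allowedOrientations;
    }

    - (void)updateScreenOrientation {
        if (@available(iOS 16.0, *)) {
            // iOS 16+: ask the system to re-read the supported orientations
            [self setNeedsUpdateOfSupportedInterfaceOrientations];
        } else {
            // Pre-16 path (deprecated in iOS 16)
            [UIViewController attemptRotationToDeviceOrientation];
        }
    }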
The event handler is refactored to receive the internal screen
orientation value instead of the value of the UIKit enum
UIInterfaceOrientation. The conversion from UIInterfaceOrientation
to ScreenOrientation is instead done in the iPhoneView class when
sending the new orientation value to the event handler.
This also makes the conversion aligned with the screen orientation
settings in the function setSupportedScreenOrientation.
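The conversion could look roughly like this; the ScreenOrientation
enum values are assumptions:

    ScreenOrientation orientation;
    switch (interfaceOrientation) {
    case UIInterfaceOrientationPortrait:
    case UIInterfaceOrientationPortraitUpsideDown:
        orientation = kScreenOrientationPortrait;
        break;
    case UIInterfaceOrientationLandscapeLeft:
    case UIInterfaceOrientationLandscapeRight:
        orientation = kScreenOrientationLandscape;
        break;
    default:
        orientation = kScreenOrientationAuto;
        break;
    }
    [self notifyOrientationChanged:orientation]; // hypothetical call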
It's important that the main frame, displaying the OpenGL rendered
graphics, has the proper dimensions depending on the device
orientation. It's also important that the frame is not covered by
the iOS keyboard.
This commit calculates the frame size depending on the orientation
and the keyboard status. The keyboard knows its parent view and
can resize it when the keyboard becomes visible or hidden.
There are multiple scenarios where the frame size is changed.
- When the keyboard is hidden/shown, which can change automatically
depending on the device orientation
- When the system forces the keyboard to become visible or hidden
- When rotating the device
- When suspending/resuming the application
There can also be combinations of the scenarios above, e.g.
suspending the application in landscape mode and resuming it in
portrait mode.
A lot of effort has been put into testing different scenarios to
verify that the screen size becomes correct. However, there might
be some scenarios that have not been covered.
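A sketch of how the keyboard can resize its parent view when its
frame changes; the notification names are standard UIKit, while the
observer method and parent-view reference are assumptions:

    // Registered earlier with:
    // [[NSNotificationCenter defaultCenter] addObserver:self
    //     selector:@selector(keyboardWillChangeFrame:)
    //     name:UIKeyboardWillChangeFrameNotification object:nil];
    - (void)keyboardWillChangeFrame:(NSNotification *)notification {
        CGRect kbFrame = [notification.userInfo[UIKeyboardFrameEndUserInfoKey]
                             CGRectValue];
        CGRect frame = _parentView.frame;
        // Everything above the keyboard's top edge stays visible
        frame.size.height = CGRectGetMinY(kbFrame);
        _parentView.frame = frame;
    }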
Delete the old graphics handling in the iOS7 backend, which is no
longer used after implementing iOSGraphicsManager.
The Accelerate framework is not used anymore; the
OpenGLGraphicsManager handles the different color formats.
Previously the mouse position in the view was tracked using the
pointerPosition property. Scaling and relative mouse movements
were calculated in the view using screen properties stored in the
videoContext structure. Now, when moving to iOSGraphicsManager, all
of that is handled by the WindowedGraphicsManager, from which the
iOSGraphicsManager inherits.
Rework the input code to send down pure x and y position values,
scaled according to the view content scale factor.
Remove code related to mouse movement that is no longer needed.
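The reworked position reporting could look like this (the handler
name is hypothetical):

    CGPoint point = [touch locationInView:self];
    // Pure x/y values, scaled by the content scale factor; all
    // window/game scaling is left to the graphics manager.
    int x = (int)(point.x * self.contentScaleFactor);
    int y = (int)(point.y * self.contentScaleFactor);
    [self handlePointerMoveToX:x y:y]; // hypothetical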
Implement callbacks to set up the OpenGL context, destroy the
context, and get the scale factor and screen sizes. Implement
rendering of graphics drawn by the iOSGraphicsManager.
This commit will enable graphics to be shown again. Screen rotation
and mouse movements are still to be adapted.
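Minimal sketches of the context callbacks; error handling is
omitted and the method names are illustrative:

    - (void)createOpenGLContext {
        _context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
        [EAGLContext setCurrentContext:_context];
    }

    - (void)destroyOpenGLContext {
        [EAGLContext setCurrentContext:nil];
        _context = nil;
    }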
The delta values are in numbers of pixels at the native screen
resolution. They need to be scaled down based on the game
resolution. Store the remainders and add them to the next deltas
to mitigate "dead zones" when doing small movements.
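The scaling with carried remainders could look like this for the
x axis (names illustrative; y is analogous):

    float scaled = deltaX * (float)gameWidth / nativeWidth + _remainderX;
    int mouseDelta = (int)scaled;      // whole pixels sent to the game
    _remainderX = scaled - mouseDelta; // fraction carried to the next event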
Apple introduced the GCVirtualController in iOS 15, which is a
software emulation of a real controller. The virtual controllers
can be configured with different inputs. See more info at:
https://developer.apple.com/documentation/gamecontroller/gcvirtualcontroller
A simple gamepad configuration with a dPad and A and B buttons
is added. The user can enable/disable the virtual game controller
by swiping two fingers from right to left, or through the
port-specific options dialog.
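A sketch of such a configuration (GameController framework,
iOS 15+); the ivar name is illustrative:

    #import <GameController/GameController.h>

    if (@available(iOS 15.0, *)) {
        GCVirtualControllerConfiguration *config =
            [[GCVirtualControllerConfiguration alloc] init];
        config.elements = [NSSet setWithObjects:GCInputDirectionPad,
                           GCInputButtonA, GCInputButtonB, nil];
        _virtualController =
            [[GCVirtualController alloc] initWithConfiguration:config];
        [_virtualController connectWithReplyHandler:nil]; // show
        // [_virtualController disconnect];               // hide
    }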
The main change is to use the interface orientation and not the device
orientation. These may differ, for example when the orientation is locked.
This also changes the way orientation changes are detected, using the
documented method. However, this means dropping support for iOS 7 as this
method is only available since iOS 8, and alternative methods available in
iOS 7 have been deprecated in iOS 13.
Another change is to properly detect the interface orientation instead of
inferring it from the view bounds, which was incorrect on some devices.
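The documented method is presumably
viewWillTransitionToSize:withTransitionCoordinator: (available since
iOS 8); a sketch:

    - (void)viewWillTransitionToSize:(CGSize)size
           withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator {
        [super viewWillTransitionToSize:size withTransitionCoordinator:coordinator];
        [coordinator animateAlongsideTransition:nil
            completion:^(id<UIViewControllerTransitionCoordinatorContext> ctx) {
                // Read the interface orientation here rather than the
                // device orientation (handler name is hypothetical):
                [self interfaceOrientationChanged];
            }];
    }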
This enables some game engines that only support 32-bit color
formats, e.g. The 11th Hour and Broken Sword 2.5.
Add support for the pixel formats RGBA8888 and ABGR8888. The pixel
formats are defined in big endian while iOS uses little endian,
so some formats have to be converted to get a correct color
representation.
Use the Apple Accelerate framework for the formats requiring
conversion, to minimize the CPU load since this is done every frame.
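The conversion can be done with vImage; a sketch with an example
permute map (the actual map depends on the source format):

    #import <Accelerate/Accelerate.h>

    vImage_Buffer src = { srcPixels, height, width, width * 4 };
    vImage_Buffer dst = { dstPixels, height, width, width * 4 };
    // Reverse the byte order of each pixel, e.g. ABGR -> RGBA
    const uint8_t permuteMap[4] = { 3, 2, 1, 0 };
    vImagePermuteChannels_ARGB8888(&src, &dst, permuteMap, kvImageNoFlags);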
Add an isInGame property to track if the launcher is shown or if a
game is running. Handle presses on the menu key differently
depending on whether the launcher is shown or not. If the launcher
is shown, suspend the application to return to the Apple TV Home
Screen, since that is the parent view of the launcher. If in game,
pause the game and show the menu. This is according to the Apple
guidelines, which can be read here:
https://developer.apple.com/design/human-interface-guidelines/inputs/remotes
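A sketch of the press handling; isInGame is the property described
above, while the menu call is hypothetical:

    - (void)pressesBegan:(NSSet<UIPress *> *)presses
               withEvent:(UIPressesEvent *)event {
        for (UIPress *press in presses) {
            if (press.type == UIPressTypeMenu && _isInGame) {
                [self showGameMenu]; // pause the game and open the menu
                return;
            }
        }
        // Unhandled menu presses fall through to the system, which
        // suspends the app and returns to the Home Screen.
        [super pressesBegan:presses withEvent:event];
    }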
The keyboard can be presented and dismissed without being triggered by
the showKeyboard/hideKeyboard functions, e.g. by pressing the menu button
on the Apple TV remote while the keyboard is shown.
If the keyboard visibility is not set entirely by the showKeyboard/
hideKeyboard functions, the _keyboardVisible state variable can get
out of sync.
Check if the keyboard is shown based on whether the inputView is the
first responder or not. The check has to be made on the main thread.
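A sketch of the check, synchronizing to the main thread when
needed:

    - (BOOL)isKeyboardShown {
        __block BOOL shown = NO;
        void (^check)(void) = ^{
            shown = [self->_inputView isFirstResponder];
        };
        if ([NSThread isMainThread])
            check();
        else
            dispatch_sync(dispatch_get_main_queue(), check);
        return shown;
    }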
iOS and tvOS share a lot of code. However, there are some parts
that are specific to iOS, for instance the handling of UI device
orientation and certain types of gestures.
Currently there are also some limitations on the Apple TV that need
to be flagged to the engine. There is no support for a virtual
keyboard, no clipboard support and no possibility to open URLs.
Put code specific to iOS within the ObjC platform macro
TARGET_OS_IOS. Code specific to tvOS is put within the macro
TARGET_OS_TV.
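The separation uses the standard TargetConditionals macros:

    #include <TargetConditionals.h>

    #if TARGET_OS_IOS
    // iOS only: UI device orientation handling, certain gestures
    #elif TARGET_OS_TV
    // tvOS only: flag missing virtual keyboard, clipboard and URL support
    #endif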
Move touch input to a TouchController class to move some logic out
of the iPhoneView class. Only do this for touches on the screen,
since connected trackpads can generate touches as well. The latter
are of type UITouchTypeIndirectPointer while touches on the screen
are of type UITouchTypeDirect. They can be separated thanks to the
preference key UIApplicationSupportsIndirectInputEvents set to YES
in Info.plist.
Without that preference, there is no way to distinguish touches on
the screen from those coming from a trackpad.
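A sketch of the separation in the touch handler (the
TouchController API is illustrative):

    - (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
        for (UITouch *touch in touches) {
            // Only direct touches on the screen go to the TouchController;
            // UITouchTypeIndirectPointer touches come from a trackpad.
            if (touch.type == UITouchTypeDirect)
                [_touchController touchBegan:touch inView:self]; // hypothetical
        }
    }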
This is better than using a hardcoded delay for two main reasons.
The first one is that the application can terminate as soon as it
has finished saving the state, and the second one is that it will
still work if saving the state takes longer than the delay that
was hardcoded.
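One pattern matching this description, with an assumed asynchronous
save hook:

    - (void)applicationDidEnterBackground:(UIApplication *)application {
        __block UIBackgroundTaskIdentifier task =
            [application beginBackgroundTaskWithExpirationHandler:^{
                [application endBackgroundTask:task];
            }];
        [self requestSaveState:^{ // hypothetical async save hook
            // End the task as soon as saving is done; the system can
            // then suspend or terminate the app without a fixed delay.
            [application endBackgroundTask:task];
        }];
    }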