The ability to change touch mode between touchpad emulation and
"direct" mouse mode is kept. Inform the user about the mode change,
which is useful when the on-screen controls are disabled, and update
the on-screen control button to reflect the new mode.
On a touch-based device the touches made must be translated to
different mouse events. For example, when a touch starts and is
moved, that should translate to a mouse pointer movement. A quick
tap should be translated to a mouse button click, while holding a
touch for a longer time without movement should keep the mouse
button pressed.
Add UIGestureRecognizers to replace the current mouse button
handling, which is all done in the callback functions touchesBegan,
touchesMoved and touchesEnded.
A tap gesture with one finger is a left mouse click
A tap gesture with two fingers is a right mouse click
A long press gesture with one finger is holding the left mouse
button down until the press is released. This is accomplished by
sending a button down event when the gesture is recognized and
changes state to UIGestureRecognizerStateBegan.
A button up event is sent when the same gesture changes state to
UIGestureRecognizerStateEnded.
A long press with two fingers is as above but for the right mouse
button.
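A minimal sketch of the recognizer setup (the selector names and the
event-sending helpers are placeholders, not the actual implementation):

    // View setup; the two-finger variants are created the same way.
    UITapGestureRecognizer *oneFingerTap = [[UITapGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleTapOneFinger:)];
    oneFingerTap.numberOfTouchesRequired = 1;
    [self addGestureRecognizer:oneFingerTap];

    UILongPressGestureRecognizer *oneFingerPress = [[UILongPressGestureRecognizer alloc]
        initWithTarget:self action:@selector(handleLongPressOneFinger:)];
    oneFingerPress.numberOfTouchesRequired = 1;
    [self addGestureRecognizer:oneFingerPress];

    // Long press action: hold the button while the gesture is active.
    - (void)handleLongPressOneFinger:(UILongPressGestureRecognizer *)recognizer {
        if (recognizer.state == UIGestureRecognizerStateBegan) {
            // send left mouse button down
        } else if (recognizer.state == UIGestureRecognizerStateEnded) {
            // send left mouse button up
        }
    }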
This commit adds the gestures and their actions. The current
mouse button handling is removed in an upcoming commit.
Add two UI buttons placed in the top right corner over the main
view. The left button controls the current touch mode. When pressed
the button changes its image to represent the new touch mode, using
the mouse and touchpad assets added in the previous commit.
The right button opens the main menu.
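A rough sketch of how such a button could be created (the asset name,
frame values and action selector are illustrative):

    UIButton *touchModeButton = [UIButton buttonWithType:UIButtonTypeCustom];
    [touchModeButton setImage:[UIImage imageNamed:@"touchpad"] forState:UIControlStateNormal];
    touchModeButton.frame = CGRectMake(self.bounds.size.width - 96, 8, 40, 40);
    // Keep the button pinned to the top right corner on resize/rotation.
    touchModeButton.autoresizingMask =
        UIViewAutoresizingFlexibleLeftMargin | UIViewAutoresizingFlexibleBottomMargin;
    [touchModeButton addTarget:self action:@selector(touchModeButtonPressed)
              forControlEvents:UIControlEventTouchUpInside];
    [self addSubview:touchModeButton];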
The onscreen_controls option should refer to the touch mode and
main menu buttons, as is done in the Android backend. The virtual
gamepad controller used the onscreen_controls string to identify
its option. Change that identifier string to gamepad_controller so
onscreen_controller can be used for the buttons to be added to the
iOS backend.
Instead of registering input based on whether hardware is connected
or not, register input based on backend capabilities.
Mouse support is provided by default through touch events on the
screen (iOS) or touches on the controller (Apple TV), and through
connected mouse hardware. Gamepad controllers are supported from
iOS 14 and later.
Register mouse and gamepad input based on the above capabilities to
be able to map actions to buttons on these input devices.
Keyboard support is to be added, but not in this commit.
Remove the "isConnected" methods for each input and change the same
method for game controllers to a "isSupported" function to deal
with the iOS version support.
Remove the sending of the EVENT_INPUT_CHANGED event due to the
above reasons. The overide of the isConnected property function is
also removed due to this reason. It caused problems that key
mappings were reset on connections/disconnections.
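A minimal sketch of what the gamepad "isSupported" check could look
like (the method placement is illustrative):

    + (BOOL)isSupported {
        // Gamepad controllers are only registered on iOS/tvOS 14 and later.
        if (@available(iOS 14.0, tvOS 14.0, *))
            return YES;
        return NO;
    }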
The UITapGestureRecognizer is used to identify taps with two fingers,
sending an ESC key event. The UITapGestureRecognizer will not stop
touchesBegan from being called when touching with two fingers.
Touching with two fingers when _mouseClickAndDragEnabled is enabled
will send an EVENT_RBUTTONDOWN.
However, on a double tap the two-finger UITapGestureRecognizer will
cancel the call to touchesEnded, meaning that no EVENT_RBUTTONUP
will be sent to the engine. In some engines, e.g. lure, this causes
problems since the engine might wait for mouse clicks to be released
before processing other events.
Fix the scenario above by delaying the sending of EVENT_RBUTTONDOWN.
If the UITapGestureRecognizer is triggered before this timeout, it
cancels the EVENT_RBUTTONDOWN by overwriting the _queuedInputEvent
with the ESC key event.
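A simplified sketch of the queuing, assuming a timeout value kept next
to _queuedInputEvent (the _queuedEventTime and delay constant names are
illustrative, and the mouse position handling is omitted):

    // touchesBegan with two fingers in click-and-drag mode:
    // queue the event instead of sending it directly.
    _queuedInputEvent.type = Common::EVENT_RBUTTONDOWN;
    _queuedEventTime = g_system->getMillis() + kQueuedInputEventDelay;

    // Two-finger tap gesture action, fired before the timeout on a
    // double tap: replace the queued right button down with ESC.
    _queuedInputEvent.type = Common::EVENT_KEYDOWN;
    _queuedInputEvent.kbd.keycode = Common::KEYCODE_ESCAPE;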
This commit also deletes the legacy implementation that handled
double taps with two fingers to trigger an ESC event.
Implement a function to trigger the device to set the screen
orientation according to the configuration in the backend
options tab.
The implementation to trigger the setting differs between devices
running iOS versions prior to 16 and version 16+.
The screen orientation update is triggered when the user has
applied the setting in the backend options tab, when the GUI
launcher is loaded and when starting a game.
The orientation is only changed when going from any portrait
mode to any landscape mode or vice versa.
Based on the setting, a UIInterfaceOrientationMask property
is set to hold the allowed interface orientations. If the
setting is "Portrait" the property will be set to
UIInterfaceOrientationMaskPortrait, allowing the orientation to
change only to normal or upside-down portrait modes.
If the setting is "Landscape" the property will be set to
UIInterfaceOrientationMaskLandscape, allowing the device to rotate
the screen only to right or left landscape mode. If set to "Auto"
all orientations will be allowed.
When the device orientation changes, the system calls the
instance property method supportedInterfaceOrientations on the
root view controller or the topmost modal view controller that
fills the window. If the view controller supports the new
orientation, the system rotates the window and the view
controller. The system only calls this method if the view
controller's shouldAutorotate method returns YES, which is the
default value.
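A condensed sketch of the two paths (the _supportedOrientations ivar
and the updateScreenOrientation name are illustrative):

    - (UIInterfaceOrientationMask)supportedInterfaceOrientations {
        // Set from the backend option: MaskPortrait, MaskLandscape or MaskAll.
        return _supportedOrientations;
    }

    - (void)updateScreenOrientation {
        if (@available(iOS 16.0, *)) {
            [self setNeedsUpdateOfSupportedInterfaceOrientations];
        } else {
            [UIViewController attemptRotationToDeviceOrientation];
        }
    }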
The event handler is refactored to receive the internal screen
orientation value instead of the value of the UIKit enum
UIInterfaceOrientation. The conversion from UIInterfaceOrientation
to ScreenOrientation is instead done in the iPhoneView class when
sending the new orientation value to the event handler.
This also makes the conversion consistent with the screen
orientation settings in the function setSupportedScreenOrientation.
If not in "click-and-drag" mode, left mouse button down and up
events are sent on touches ended if the touch lasted less than
250 ms. If the touch lasted longer it was considered as a move
and no button events are sent.
This commit mimic that behaviour in touchpad mode when "click-
and-drag" mode is enabled. The left mouse button down event is
queued for 250 ms. If the touch is dragged within 250 ms it is
considered as a move and the queued mouse button down event is
cacelled. If no movement is made withing 250 ms the queued
mouse button event is processed.
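In simplified form (the queue field names and the pollEvent handling
are illustrative), the flow becomes:

    // touchesBegan: queue the left button down instead of sending it.
    _queuedInputEvent.type = Common::EVENT_LBUTTONDOWN;
    _queuedEventTime = g_system->getMillis() + 250;

    // touchesMoved within 250 ms: it is a move, drop the queued click.
    _queuedInputEvent.type = Common::EVENT_INVALID;

    // pollEvent: once the timeout has passed, deliver the queued event.
    if (_queuedInputEvent.type != Common::EVENT_INVALID &&
        g_system->getMillis() >= _queuedEventTime) {
        event = _queuedInputEvent;
        _queuedInputEvent.type = Common::EVENT_INVALID;
        return true;
    }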
Delete the old graphics handling in the IOS7 backend which is not
used anymore after implementing iOSGraphicsManager.
The Accelerate framework is not used anymore. The
OpenGLGraphicsManager handles the different color formats.
When the screen dimensions change, e.g. on rotation of the device,
the graphics manager has to be informed of the new dimensions to be
able to resize the surfaces.
To quickly redraw the entire screen, a Common::EVENT_SCREEN_CHANGED
event is passed to the event handler.
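A sketch of the notification, assuming a resize entry point on the
graphics manager and an addEvent helper on the view (both names
illustrative):

    // Called when the view size changes, e.g. on device rotation.
    CGSize size = self.bounds.size;
    CGFloat scale = self.contentScaleFactor;
    iOSGraphicsManager *gfx = (iOSGraphicsManager *)g_system->getGraphicsManager();
    gfx->handleResize(size.width * scale, size.height * scale);

    Common::Event event;
    event.type = Common::EVENT_SCREEN_CHANGED;
    [self addEvent:event];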
Previously the mouse position in the view was tracked using the
pointerPosition property. Scaling and relative mouse movements
were calculated in the view using screen properties stored in the
videoContext structure. Now, when moving to iOSGraphicsManager, all
of that is handled by the WindowedGraphicsManager, which
iOSGraphicsManager inherits from.
Rework the input code to send down pure x and y position values,
scaled according to the view content scale factor.
Remove code related to mouse movement that is no longer needed.
Implement support for setting the mouse pointer speed in the
settings. The pointer speed is applied to both mouse input and
touch input when in touchpad mode.
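A sketch of how the speed could be applied to touchpad deltas (the
config key and scaling formula are illustrative):

    #include "common/config-manager.h"

    // Map the pointer speed setting to a multiplier for the raw delta.
    int speedSetting = ConfMan.getInt("pointer_speed"); // key name illustrative
    float speedFactor = 0.25f * (speedSetting + 1);
    int deltaX = (int)(rawDeltaX * speedFactor);
    int deltaY = (int)(rawDeltaY * speedFactor);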
Add input events that can be used by mouse devices, e.g. mice and
touchpads. These events send the raw input actions and don't care
about different controller modes such as click-and-drag.
Make the mouse controller utilize the new mouse input events.
The current mouse events handle events created from both touch and
mouse input. They contain a lot of logic to deal with gestures and
different modes (touchpad mode, click-and-drag etc.) which is not
applicable to hardware input.
Rename the current "mouse events" to "touch events" to clarify which
input triggered an event. As this is the first commit in a multi-
commit change, the mouse input needs to use the "touch events" until
a new "mouse event" is implemented.
Apple introduced the GCVirtualController in iOS 15, which is a
software emulation of a real controller. The virtual controller
can be configured with different inputs. See more info at:
https://developer.apple.com/documentation/gamecontroller/gcvirtualcontroller
A simple gamepad configuration with a dPad and A and B buttons
is added. The user can enable/disable the virtual game controller
by swiping two fingers from right to left, or through the
port-specific options dialog.
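A minimal sketch of the configuration (the _virtualController ivar
name is illustrative):

    #import <GameController/GameController.h>

    if (@available(iOS 15.0, *)) {
        GCVirtualControllerConfiguration *config = [[GCVirtualControllerConfiguration alloc] init];
        config.elements = [NSSet setWithArray:@[GCInputDirectionPad, GCInputButtonA, GCInputButtonB]];
        _virtualController = [[GCVirtualController alloc] initWithConfiguration:config];
        [_virtualController connectWithReplyHandler:nil];
        // [_virtualController disconnect] hides it again.
    }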
The "touchpad" mode and "click-and-drag" mode was mutual exclusive
when enabling "click-and-drag" using swipe gesture.
The difference between "touchpad" mode and "click-and-drag" mode
is how the button down/up events are sent. In touchpad mode the
button down and button up events are sent on touches ended, while
in click-and-drag the button down event is sent on touches began
and button up on touches ended.
The timerHandler was driven by calls to the pollEvent callback
function. Each time pollEvent was called, the timerHandler called
the TimerManager handle function to advance in time and make sure
scheduled tasks were triggered.
This worked well for most game engines, but some, e.g. the Hypno
engine, used the TimerManager to schedule tasks without calling
pollEvent since they were neither expecting nor handling events at
that specific point in time.
Since iOS has threads, the timerHandler can be called from a
separate thread and does not need to rely on pollEvent.
Implement the timerHandler using a timer dispatch source and make
it operate on a background thread rather than the main thread.
Read more on Dispatch Sources here:
https://developer.apple.com/library/archive/documentation/General/
Conceptual/ConcurrencyProgrammingGuide/GCDWorkQueues/GCDWorkQueues.html
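A sketch of the timer setup, assuming the default TimerManager
implementation is used (the queue label and the 10 ms interval are
illustrative):

    dispatch_queue_t timerQueue = dispatch_queue_create("timer", DISPATCH_QUEUE_SERIAL);
    _timerSource = dispatch_source_create(DISPATCH_SOURCE_TYPE_TIMER, 0, 0, timerQueue);
    dispatch_source_set_timer(_timerSource, DISPATCH_TIME_NOW, 10 * NSEC_PER_MSEC, NSEC_PER_MSEC);
    dispatch_source_set_event_handler(_timerSource, ^{
        // Advance the TimerManager on the background queue, independent of pollEvent.
        ((DefaultTimerManager *)g_system->getTimerManager())->handler();
    });
    dispatch_resume(_timerSource);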
Joystick button up events were not sent to the EventManager due to a
missing break in the kInputJoystickButtonUp case. This caused button
presses in the launcher and dialogs not to be triggered.
Adding the break enables use of joystick buttons in ScummVM GUIs.
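In simplified form (the surrounding event translation is
illustrative), the fix amounts to:

    case kInputJoystickButtonUp:
        event.type = Common::EVENT_JOYBUTTON_UP;
        event.joystick.button = internalEvent.value;
        // Previously missing; without it the case fell through and the
        // button up event never reached the EventManager.
        break;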
Joystick actions are suitable for joysticks and gamepads where the
movements are updated by a controller stick. On gamepads that is
usually a thumbstick.
Add joystick events which can be triggered by each implemented
controller that should utilize the ScummVM joystick events.
This is better than using a hardcoded delay for two main reasons.
The first one is that the application can terminate as soon as it
has finished saving the state, and the second one is that it will
still work if saving the state takes longer than the delay that
was hardcoded.
This is necessary for properly identifying the Return key pressed
on the software or a hardware keyboard, and this was erroneously
removed in commit e5709ed.