On a touch-based device, touches must be translated into mouse
events. For example, when a touch starts and is moved, that should
translate into mouse pointer movement. A quick tap should be
translated into a mouse button click, while holding a touch for a
longer time without movement should translate into keeping the
mouse button pressed.
Add UIGestureRecognizers to replace the current mouse button
handling, which is currently implemented in the callback functions
touchesBegan, touchesMoved and touchesEnded.
A tap gesture with one finger is a left mouse click.
A tap gesture with two fingers is a right mouse click.
A long press gesture with one finger holds the left mouse button
down until the press is released. This is accomplished by sending a
button down event when the gesture is recognized and changes state
to UIGestureRecognizerStateBegan. A button up event is sent when
the same gesture changes state to UIGestureRecognizerStateEnded.
A long press with two fingers works the same way but for the right
mouse button.
This commit adds the gestures and their actions. The current mouse
button handling is removed in an upcoming commit.
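A minimal sketch of how the gestures could be wired up in the
backend view; the class, selector names and the comments standing
in for the actual event sending are illustrative, not the real
backend code:

```objc
#import <UIKit/UIKit.h>

@interface TouchInputView : UIView
@end

@implementation TouchInputView

- (void)setupGestures {
	UITapGestureRecognizer *oneFingerTap = [[UITapGestureRecognizer alloc]
		initWithTarget:self action:@selector(handleOneFingerTap:)];
	oneFingerTap.numberOfTouchesRequired = 1;
	[self addGestureRecognizer:oneFingerTap];

	UITapGestureRecognizer *twoFingerTap = [[UITapGestureRecognizer alloc]
		initWithTarget:self action:@selector(handleTwoFingerTap:)];
	twoFingerTap.numberOfTouchesRequired = 2;
	[self addGestureRecognizer:twoFingerTap];

	// The two-finger long press follows the same pattern for the right button.
	UILongPressGestureRecognizer *oneFingerPress = [[UILongPressGestureRecognizer alloc]
		initWithTarget:self action:@selector(handleOneFingerLongPress:)];
	oneFingerPress.numberOfTouchesRequired = 1;
	[self addGestureRecognizer:oneFingerPress];
}

- (void)handleOneFingerTap:(UITapGestureRecognizer *)recognizer {
	// Quick tap -> left mouse click (button down followed by button up).
}

- (void)handleTwoFingerTap:(UITapGestureRecognizer *)recognizer {
	// Two-finger tap -> right mouse click.
}

- (void)handleOneFingerLongPress:(UILongPressGestureRecognizer *)recognizer {
	if (recognizer.state == UIGestureRecognizerStateBegan) {
		// Long press recognized -> left mouse button down.
	} else if (recognizer.state == UIGestureRecognizerStateEnded) {
		// Press released -> left mouse button up.
	}
}

@end
```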
Add two UI buttons placed in the top right corner over the main
view. The left button controls the current touch mode. When
pressed, the button changes its image to represent the new touch
mode, using the mouse and touchpad assets added in the previous
commit. The right button opens the main menu.
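A sketch of how the two buttons could be created and pinned to the
top right corner; the helper class, image names ("mouse",
"touchpad", "menu") and selectors are assumptions for illustration:

```objc
#import <UIKit/UIKit.h>

@interface CornerButtons : NSObject
@property (nonatomic) BOOL touchpadMode;
@property (nonatomic, strong) UIButton *touchModeButton;
@property (nonatomic, strong) UIButton *menuButton;
- (void)attachToView:(UIView *)mainView;
@end

@implementation CornerButtons

- (void)attachToView:(UIView *)mainView {
	self.touchModeButton = [UIButton buttonWithType:UIButtonTypeCustom];
	[self.touchModeButton setImage:[UIImage imageNamed:@"mouse"] forState:UIControlStateNormal];
	[self.touchModeButton addTarget:self action:@selector(touchModePressed:)
	               forControlEvents:UIControlEventTouchUpInside];

	self.menuButton = [UIButton buttonWithType:UIButtonTypeCustom];
	[self.menuButton setImage:[UIImage imageNamed:@"menu"] forState:UIControlStateNormal];
	[self.menuButton addTarget:self action:@selector(menuPressed:)
	          forControlEvents:UIControlEventTouchUpInside];

	for (UIButton *button in @[ self.touchModeButton, self.menuButton ]) {
		button.translatesAutoresizingMaskIntoConstraints = NO;
		[mainView addSubview:button];
	}
	// Pin both buttons to the top right corner, touch mode button to the left.
	UILayoutGuide *safeArea = mainView.safeAreaLayoutGuide;
	[NSLayoutConstraint activateConstraints:@[
		[self.menuButton.topAnchor constraintEqualToAnchor:safeArea.topAnchor constant:8],
		[self.menuButton.trailingAnchor constraintEqualToAnchor:safeArea.trailingAnchor constant:-8],
		[self.touchModeButton.topAnchor constraintEqualToAnchor:safeArea.topAnchor constant:8],
		[self.touchModeButton.trailingAnchor constraintEqualToAnchor:self.menuButton.leadingAnchor constant:-8]
	]];
}

- (void)touchModePressed:(UIButton *)sender {
	// Toggle the touch mode and swap the button image accordingly.
	self.touchpadMode = !self.touchpadMode;
	NSString *imageName = self.touchpadMode ? @"touchpad" : @"mouse";
	[sender setImage:[UIImage imageNamed:imageName] forState:UIControlStateNormal];
}

- (void)menuPressed:(UIButton *)sender {
	// Trigger the call that opens the main menu here.
}

@end
```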
The virtual controller can be configured with different directional
elements: either a thumbstick or D-pad buttons (left, right, up,
down).
A user might want to choose which directional element to use on the
virtual controller. Create two different virtual controllers which
can be switched between depending on the setting in the
backend-specific options dialog.
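A sketch of the two configurations using the GameController
framework's GCVirtualController (iOS 15+); the exact element sets
and the switching code are assumptions:

```objc
#import <GameController/GameController.h>

static GCVirtualController *createVirtualController(BOOL useDpad) API_AVAILABLE(ios(15.0)) {
	GCVirtualControllerConfiguration *config = [[GCVirtualControllerConfiguration alloc] init];
	// Pick the directional element based on the backend-specific setting.
	NSString *directionalElement = useDpad ? GCInputDirectionPad : GCInputLeftThumbstick;
	config.elements = [NSSet setWithArray:@[ directionalElement, GCInputButtonA, GCInputButtonB ]];
	return [[GCVirtualController alloc] initWithConfiguration:config];
}

// Switching when the setting changes (illustrative usage):
// [currentController disconnect];
// currentController = createVirtualController(useDpadSetting);
// [currentController connectWithReplyHandler:nil];
```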
Instead of registering input based on whether hardware is connected
or not, register input based on backend capabilities.
Mouse support is provided by default through touch events on the
screen (iOS) or on the controller touch surface (Apple TV), and
through connected mouse hardware. Gamepad controllers are supported
on iOS 14 and later.
Register mouse and gamepad input based on the above capabilities to
be able to map actions to buttons on these input devices. Keyboard
support will be added, but not in this commit.
Remove the "isConnected" methods for each input and change the same
method for game controllers to a "isSupported" function to deal
with the iOS version support.
Remove the sending of the EVENT_INPUT_CHANGED event for the reasons
above. The override of the isConnected property function is also
removed for the same reason. It caused problems where key mappings
were reset on connection/disconnection.
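A sketch of the capability-based registration, assuming ScummVM's
keymapper hardware input sets are used; the gamepad capability
helper is illustrative:

```cpp
#include "backends/keymapper/hardware-input.h"

static bool isGamepadSupported() {
	// Game controllers are handled through GameController framework
	// additions that require iOS 14 or later.
	if (@available(iOS 14.0, *))
		return true;
	return false;
}

Common::HardwareInputSet *OSystem_iOS7::getHardwareInputSet() {
	Common::CompositeHardwareInputSet *inputSet = new Common::CompositeHardwareInputSet();

	// Mouse input is always available: touches act as mouse input and
	// mouse hardware can be connected at any time.
	inputSet->addHardwareInputSet(new Common::MouseHardwareInputSet(Common::defaultMouseButtons));

	// Gamepad input is only registered when the OS version supports it.
	if (isGamepadSupported())
		inputSet->addHardwareInputSet(new Common::JoystickHardwareInputSet(
			Common::defaultJoystickButtons, Common::defaultJoystickAxes));

	return inputSet;
}
```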
Add options to let the user configure the screen orientation in
games and in the launcher. The option can also be set per game to
allow some games to start in landscape mode while others start in
portrait mode.
This commit implements the options in the backend menu tab. The
code is very similar to that of the Android backend.
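A sketch of how the setting might be read; the ConfMan keys
("orientation_games", "orientation_launcher") and the "auto"
fallback value are assumptions for illustration:

```cpp
#include "common/config-manager.h"

static Common::String getRequestedOrientation(bool inGame) {
	// ConfMan resolves the active (per-game) domain first, so a
	// game-specific value overrides the global default automatically.
	const char *key = inGame ? "orientation_games" : "orientation_launcher";
	if (ConfMan.hasKey(key))
		return ConfMan.get(key);
	return "auto";
}
```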
Delete the old graphics handling in the iOS7 backend, which is no
longer used after implementing iOSGraphicsManager. The Accelerate
framework is no longer used; the OpenGLGraphicsManager handles the
different color formats.
When the screen dimensions change, e.g. on rotation of the device,
the graphics manager has to be informed of the new dimensions to be
able to resize the surfaces. To quickly redraw the entire screen, a
Common::EVENT_SCREEN_CHANGED event is passed to the event handler.
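A sketch of the reaction to a dimension change; the surrounding
function and the commented resize call are illustrative, while
EVENT_SCREEN_CHANGED and the event manager call are the real
ScummVM API:

```cpp
#include "common/events.h"
#include "common/system.h"

static void screenDimensionChanged(int width, int height) {
	// Let the graphics manager resize its surfaces to the new dimensions
	// (through whatever resize entry point iOSGraphicsManager exposes):
	// _graphicsManager->notifyResize(width, height);

	// Trigger a full redraw of the screen.
	Common::Event event;
	event.type = Common::EVENT_SCREEN_CHANGED;
	g_system->getEventManager()->pushEvent(event);
}
```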
Add input events that can be used by mouse devices, e.g. mice and
touchpads. These events send the raw input actions and do not care
about different controller modes such as click-and-drag.
Make the mouse controller utilize the new mouse input events.
The current mouse events handle events created from both touch and
mouse input. They contain a lot of logic to deal with gestures and
different modes (touchpad mode, click-and-drag, etc.) which are not
applicable to hardware input.
Rename the current "mouse events" to "touch events" to clarify
which input triggered an event. As this is the first commit in a
multi-commit change, the mouse input needs to use the "touch
events" until a new "mouse event" is implemented.
Joystick actions are suitable for joysticks and gamepads where the
movements are updated by a controller stick. On gamepads that is
usually a thumbstick.
Add joystick events which can be triggered by each implemented
controller that should use the ScummVM joystick events.
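A sketch of forwarding an extended gamepad's left thumbstick as
ScummVM joystick axis events; the handler wiring, scaling and axis
inversion are illustrative, while EVENT_JOYAXIS_MOTION and the axis
constants are the real ScummVM API:

```objc
#include "common/events.h"
#include "common/system.h"
#import <GameController/GameController.h>

static void hookLeftThumbstick(GCExtendedGamepad *gamepad) {
	gamepad.leftThumbstick.valueChangedHandler =
	    ^(GCControllerDirectionPad *dpad, float xValue, float yValue) {
		Common::Event event;
		event.type = Common::EVENT_JOYAXIS_MOTION;

		// xValue/yValue are in [-1, 1]; ScummVM expects a signed 16-bit range.
		event.joystick.axis = Common::JOYSTICK_AXIS_LEFT_STICK_X;
		event.joystick.position = (int16)(xValue * 32767);
		g_system->getEventManager()->pushEvent(event);

		event.joystick.axis = Common::JOYSTICK_AXIS_LEFT_STICK_Y;
		event.joystick.position = (int16)(-yValue * 32767);
		g_system->getEventManager()->pushEvent(event);
	};
}
```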
This was disabled during the merge of the initial pull request,
PR 630, as it "limits the port to 16bit color precision, i.e.
preventing it from coping with engines like Wintermute, and
Sword25."
The intent was to replace this with GLSL based scalers, and to
avoid switching around and confusing users it was disabled in the
meantime.
However, since the GLSL solution has not been implemented after 2
years, and users are asking for this feature on iOS7, enable this
for now. This can always be replaced in the future with a GLSL
based solution if a motivated developer provides a patch.