This setting comes from the original mobile port of AGS, but it's not clear whether it was
ever useful. More importantly, there are several issues with it:
1. It was never implemented in the other hardware-accelerated driver (Direct3D), remaining
exclusive to OpenGL.
2. It was never added to the standard setup program, so it was not very visible.
3. It's not clear whether it has been working properly all this time. There have been multiple changes to the graphics renderers, and some of them may assume that rendering at "native resolution" is done at exactly that: the native resolution.
This is meant to skip certain groups of sprites when taking a "screenshot",
for instance when preparing a frozen game screen for a room transition
(fade-out etc.).
Think of hiding the mouse cursor and engine overlays (FPS counter etc.).
Partially from upstream b0cd961483dcee8ffff7543c429fcd3272b413d5
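A minimal sketch of the approach; the flag values and function names below are illustrative assumptions, not the engine's actual API:

    #include <cstdint>

    enum SpriteSkipFlags : uint32_t
    {
        kSkipNone       = 0,
        kSkipGameUI     = 0x01, // in-game GUI
        kSkipCursor     = 0x02, // mouse cursor
        kSkipEngineOver = 0x04, // engine overlays (fps counter etc.)
    };

    // Stubs standing in for the real draw calls.
    void DrawGameUI() { /* ... */ }
    void DrawMouseCursor() { /* ... */ }
    void DrawEngineOverlays() { /* ... */ }

    // Render one frame; a "screenshot" for a room transition passes a
    // mask of sprite groups to leave out of the frozen frame.
    void RenderFrame(uint32_t skip)
    {
        if (!(skip & kSkipGameUI))
            DrawGameUI();
        if (!(skip & kSkipCursor))
            DrawMouseCursor();
        if (!(skip & kSkipEngineOver))
            DrawEngineOverlays();
    }

    // Freezing the screen for a fade-out:
    //   RenderFrame(kSkipCursor | kSkipEngineOver);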
Avoid querying the gfxDriver's properties all the time; instead, save them once in init_draw_method() and use the saved values.
From upstream db535512af65119d50ae3d6fb06c221d56f42d85
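A sketch of the pattern, assuming a simplified driver interface (the property names here are illustrative, not necessarily the real ones):

    // Simplified stand-in for the graphics driver interface.
    struct IGfxDriver
    {
        virtual bool HasAcceleratedTransform() const = 0;
        virtual bool RequiresFullRedrawEachFrame() const = 0;
        virtual ~IGfxDriver() = default;
    };

    // Values queried once and reused on the hot path.
    struct DrawState
    {
        bool AccelTransform = false;
        bool FullFrameRedraw = false;
    };

    static DrawState drawstate;

    void init_draw_method(const IGfxDriver *gfxDriver)
    {
        drawstate.AccelTransform = gfxDriver->HasAcceleratedTransform();
        drawstate.FullFrameRedraw = gfxDriver->RequiresFullRedrawEachFrame();
    }

    // Later code reads drawstate.* instead of making a virtual call
    // into the driver on every frame.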
When a plugin calls SetVirtualScreen (for the software renderer), don't
replace the whole virtual screen; instead, replace only the current
render stage buffer.
This is complementary to the older changes in the software renderer,
made during the development of AGS 3.5.0, which introduced advanced
room viewports and cameras. Since that change, separate render
stages could draw on sub-bitmaps of smaller size.
While IAGSEngine::GetVirtualScreen() was adjusted to follow that
change, and now returns not the whole virtual screen but only a "stage
buffer" (which may be a sub-bitmap of the virtual screen, or an
intermediate bitmap created specifically for that render stage),
SetVirtualScreen() was NOT adjusted accordingly and kept replacing
the whole virtual screen. This discrepancy could cause logical errors,
as well as crashes with plugins that use this API (e.g. the original
SnowRain plugin).
The immediate cause of the error is that a plugin would remember the
pointer returned by GetVirtualScreen() (which is a stage buffer)
and then try to set it back with SetVirtualScreen(). As a result,
the engine's own stage buffer would be assigned as the full virtual screen.
From upstream 09143ea7fbf8474f78932116ae1c81ceff4de95a
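Roughly, the change amounts to the following (a simplified sketch; Bitmap and the globals stand in for the engine's real types and state):

    struct Bitmap {}; // stand-in for the engine's bitmap class

    static Bitmap *virtual_screen = nullptr; // the full virtual screen
    static Bitmap *stage_buffer = nullptr;   // buffer of the current render stage

    Bitmap *GetVirtualScreen()
    {
        // Since AGS 3.5.0 this returns the stage buffer, which may be a
        // sub-bitmap of the virtual screen or an intermediate bitmap.
        return stage_buffer;
    }

    // Old behavior: whatever the plugin passes replaces the whole
    // virtual screen -- even if it is merely a stage buffer.
    void SetVirtualScreen_Old(Bitmap *bmp)
    {
        virtual_screen = bmp;
    }

    // Fixed behavior: only the current render stage buffer is replaced,
    // mirroring what GetVirtualScreen() actually handed out.
    void SetVirtualScreen_Fixed(Bitmap *bmp)
    {
        stage_buffer = bmp;
    }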
This is primarily for backwards compatibility with older plugins
that rely solely on software drawing.
Previously we added "stage screens" to the hardware-accelerated
graphics drivers (Direct3D/OpenGL), which provide a surface for
plugins to draw upon. If used, these surfaces would then be rendered
as plain sprites in the 3D scene at a certain place in the sprite sequence.
The remaining problem (which had gone unnoticed) was that the surfaces which
correspond to the two room render callbacks (right after the background,
and right after all objects) did not correctly follow the room camera's position
and scaling (at least not in all cases). This commit addresses these problems by
letting the engine request a certain size and an optional position for these
"stage screens" for particular sprite batches.
Partially from upstream 2d43bffc2cb07935ae72d7c5677ad622c8b4d37e
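As an illustration, the per-batch request could look something like this (the struct and function below are hypothetical, not the actual engine code):

    // Hypothetical per-batch description of a "stage screen".
    struct StageScreenDesc
    {
        int Width = 0, Height = 0; // requested surface size
        int X = 0, Y = 0;          // optional position in the final frame
        bool HasPosition = false;
    };

    // For the room render callbacks, request a surface matching the room
    // camera, so plugin drawing is positioned and scaled with the room.
    StageScreenDesc MakeRoomStageScreen(int cam_x, int cam_y, int cam_w, int cam_h)
    {
        StageScreenDesc desc;
        desc.Width = cam_w;
        desc.Height = cam_h;
        desc.X = cam_x;
        desc.Y = cam_y;
        desc.HasPosition = true;
        return desc;
    }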
- caches textures while they are in immediate use;
- this allows sharing the same texture data among multiple sprites on screen;
- texture entries are identified by an arbitrary uint32 number.
From upstream 6f3f84e049df2f800aa4b06d221a393d904c2826
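A minimal sketch of such a cache, assuming shared ownership is how "immediate use" is tracked (the real implementation may differ):

    #include <cstdint>
    #include <memory>
    #include <unordered_map>

    struct Texture {}; // stand-in for the driver's texture data

    class TextureCache
    {
    public:
        // Returns a shared texture for the given id, creating it if it
        // is not cached (or if its last user has released it).
        std::shared_ptr<Texture> Get(uint32_t id)
        {
            auto it = _cache.find(id);
            if (it != _cache.end())
            {
                if (auto txdata = it->second.lock())
                    return txdata; // still in use: share among sprites
                _cache.erase(it);  // expired: last user is gone
            }
            auto txdata = std::make_shared<Texture>(); // create/upload here
            _cache[id] = txdata;
            return txdata;
        }

    private:
        // weak_ptr keeps an entry alive only while some sprite on
        // screen still holds the shared_ptr.
        std::unordered_map<uint32_t, std::weak_ptr<Texture>> _cache;
    };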