Alawar Engine. Part Two: Features of a Cross-Platform Game Engine

Good afternoon! The previous article covered the general process of creating games in the HOPA genre (Hidden Object Puzzle Adventure, or "hidden object" games). This article looks at the principle of two-level software abstraction, the paradigm behind the main platform-dependent components of our engine, and at the overall structure of the engine's lower level. This approach gave us flexibility both in porting the engine to new platforms and in porting the games themselves from one platform to another. It also allowed us to create:

  • a three-level game structure;
  • a unified 2D graphics subsystem;
  • a universal source code organization.

The software part of the Alawar Engine is built on two libraries: SF (the Stargaze Framework library) and QE (the Stargaze Quest Engine library). SF is the core of the entire system and contains almost all of the game's platform-specific implementation, yet the library has a single common source branch for all platforms. SF currently runs on six platforms: Windows (XP, Vista, Windows 7), Mac OS X, iOS, Android, PS3, and Windows 8 (in development).

The Quest Engine is an add-on to the Stargaze Framework that executes the game logic created in the Quest Editor and contains no platform-specific code. QE was originally aimed at HOPA games with a static set of static objects, but it is actively evolving and now supports games with a dynamic set of dynamic objects, for example in the Time Management and Tower Defense genres. Most platform-specific software modules within SF have two levels of abstraction. Thanks to these features, we achieved fairly high portability for our engine.

In general, a game project has a three-level structure and consists of user code, the Quest Engine library, the Stargaze Framework library, and additional libraries, such as an in-game shop library. With the right implementation approach, user code can run unchanged, without extra effort, on all the platforms listed above. Our libraries can be linked into a game project either as source or as precompiled libraries. Using the architecture of the Stargaze Framework as an example, let's look at how a cross-platform game implementation can be organized and how this portability was achieved.

At the lowest level of the Stargaze Framework sit the system integration modules (platform-dependent code), kept in separate directories (win, android, etc.). These platform-named directories contain only source code unique to a particular platform and nothing that could be shared between platforms: for example, calls into the graphics or audio subsystem of the current OS, or color-space conversion algorithms for video decoding optimized for a particular architecture. As a last resort, platform-dependent code is embedded directly in the main code through conditional compilation directives. This is done where introducing a separate program abstraction would be disproportionate to the amount of code wrapped in the directives, for example:

#if defined(__SF_WINDOWS)
	// Windows-specific code
#elif defined(__SF_MAC) || defined(__SF_IPHONE)
	// Mac OS X / iOS-specific code
#endif

Thanks to this, the Stargaze Framework has a single development branch and uniform operating principles across all platforms simultaneously. Porting to a new platform begins in a separate branch, and the changes are then merged back into the current active branch.

The following modules are located at a higher level:

  • the resources and settings subsystem loads and saves configuration files, loads resources from files, and gives other components a convenient interface to these settings and resources; all access to resources and settings goes through string identifiers;
  • the logging and diagnostics subsystem writes logs, displays diagnostic and error messages, stops the game on critical errors, collects performance information, and so on;
  • the time management subsystem includes a time manager and timers used to track various times in the game: time since the game started, time between frames, etc.;
  • a set of auxiliary components (misc) includes decoders for images of various formats (png, jpg, etc.), tools for various data compression algorithms, math utilities, our own data structures for stacks and arrays, and so on;
  • the multitasking subsystem (MT) is a relatively new SF subsystem that wraps the operating system's multitasking facilities.

The next level holds the engine's own 2D graphics, audio, and video subsystems, which actively use both the subsystems of the previous level and the system integration modules:

  • the 2D graphics subsystem draws geometric primitives (lines, rectangles, polygons, etc.), renders textures (bitmaps), applies transformation matrices (affine transformations), and clips output regions;
  • the audio subsystem is responsible for playing sounds and music in games; when playing a sound you can set its volume, balance, and playback speed. The subsystem lets you organize sounds into groups, so that operations can be applied to a whole set of sounds at once;
  • the video subsystem is responsible for displaying video both inside game scenes and in full-screen mode.

In addition to the subsystems above, a number of intermediate subsystems can be singled out:

  • the font subsystem is responsible for font management and text output and is part of the 2D graphics subsystem;
  • the particle subsystem animates images (mathematical and physical animation), producing various visual effects;
  • the animation subsystem also creates animations (clips), but can use many objects to do so (frame-by-frame and logical animation);
  • the effects mechanism is not a full-fledged subsystem, but it gives the developer the freedom to program various visual effects applied to the GUI.

At the very top level is the GUI subsystem: our own widget library containing a set of ready-made classic primitives such as windows, buttons, input fields, checkboxes, radio buttons, and so on. Its integrating element is the widget manager, whose main tasks are dispatching user input messages to widgets, updating widget state, and drawing the widgets. The widget manager also contains a gesture emulation module with two main purposes: handling gestures that are not implemented at the system level, and processing user gestures.

Let's look in more detail at the mechanics of the application itself, 2D graphics, audio, and video. Our framework defines a base application class, CApplication, which contains practically no platform-specific code. This class describes the basic logic of the application; the game developer derives a subclass from it and fills it with the required functionality. The platform-dependent implementation of the application's operation (initializing the application, creating the main window, processing events, etc.) is hidden in classes inherited from CSystemIntegration:

class CSystemIntegration
{
public:
	virtual ~CSystemIntegration();
	virtual bool Init() = 0;
	virtual void Run() = 0;
	virtual void Stop() = 0;
	virtual void Shutdown() = 0;
	virtual bool EnsureSingleInstance() = 0;
	virtual bool ChangeScreenMode(bool _fullscreen, bool _32bpp, size_t _width, size_t _height) = 0;
	virtual bool GetOriginalDesktopDimentions(size_t &_width, size_t &_height) = 0;
	virtual EventInformation &GetCurrentEvent() = 0;
	virtual void DefaultWindowProc() = 0;
	virtual void GetWindowClientRect(misc::IntRect &_rc) = 0;
	virtual void AdjustClientRectToWindow(misc::IntRect &_rc) = 0;
	virtual void GetDesktopWindowedSpace(misc::IntRect &_rc) = 0;
	virtual void ScreenCoordsIntoClient(misc::IntVector& _pos) = 0;
	virtual void ClientCoordsIntoScreen(misc::IntVector& _pos) = 0;
	virtual void EnableSystemGestureRecognizer(int _recognizerType, bool _enable) {}
	virtual void SetMouseCursorPos(const misc::IntVector& _pos) = 0;
	virtual void GetMouseCursorPos(misc::IntVector& _pos) = 0;
	virtual void SetSysCursor(gui::SysCursor _cursor, bool _show_now = true) = 0;
	virtual gui::SysCursor GetSysCursor() = 0;
	virtual void ShowSysCursor(bool _show = true) = 0;
	virtual bool IsSysCursorShown() = 0;

	void AppUpdate();
	void AppDraw();
	void ActivateApp(bool _activate = true);
	void MinimizeApp(bool _minimized = true);
};

In most cases a game programmer deals only with the high-level abstraction, CApplication, which in turn uses CSystemIntegration. The interface of this class describes a general model of interaction with the system side of the application. The model assumes that the application has an output area (a window), a queue of system messages (keyboard events, gestures, etc.), and a main work loop. Although the concrete implementation of the methods inherited from CSystemIntegration is not standardized, there are several conventions; for example, the main application loop on any platform must call AppUpdate() and AppDraw() in sequence. For the Windows and PS3 platforms, the main loop implementations look like this:

void CStandaloneApplicationWindows::MessageCycle()
{
	MSG msg;
	while (!m_EndModal)
	{
		// drain the system message queue
		while (!m_Stop && PeekMessage(&msg, 0, 0, 0, PM_REMOVE))
		{
			TranslateMessage(&msg);
			DispatchMessage(&msg);
		}
		if (m_Stop)
			break;
		AppUpdate();  // the convention: update, then draw, every iteration
		AppDraw();
	}
	m_EndModal = false;
}

void CStandaloneApplicationPS3::MessageCycle()
{
	while (!m_EndModal)
	{
		if (g_quitRequested == false)
		{
			if (cellPadUtilUpdate())  // poll gamepad input
			{
				// handle pad events
			}
			if (m_Stop)
				break;
			AppUpdate();
			AppDraw();
		}
		else
			break;
	}
	m_EndModal = false;
}

Thus, to launch the application, the game programmer only needs to provide an entry point (on some platforms it lives inside SF) and write code like the following (an iOS example):

bool SFIPhoneMain()
{
	static game::CGameApplication app;
	if (!app.Init())
		return false;
	return true;
}

void StartLoadGame()
{
	sf::core::CApplication* app = sf::core::g_Application;
	app->SetMainWindow(new game::CMainMenuWindow());
}

Here CGameApplication is a subclass of CApplication, and CMainMenuWindow is a subclass of CWindow defined by the programmer in the game's own code. As the code illustrates, this two-level model of software abstraction minimizes the cost of bootstrapping the application, and with it the headaches of moving the game to another platform. And when several applications are involved, it allows the application entry point to be factored out into a separate, shared library.

The unified 2D graphics subsystem lets the Stargaze Framework be ported painlessly to platforms with different rendering back ends, such as D3D9, D3D11, OpenGL ES (including 2.0), and GCM. This versatility is achieved through the CRenderer and CRenderDevice classes. CRenderer implements the top-level API: a single set of methods for working with 2D graphics that fully covers the requirements of casual 2D games. For instance:

  • void RenderString(const CFont* _font, const SF_WSTRING& _string, float _x, float _y, int _justify_h = -1, int _justify_v = -1, float _scale=1.f, const Color& _color=0xffffffff, const Color& _bk_color=0) – renders text;
  • void RenderTexture(const CTexture* _texture, const misc::FloatRect& _dest, size_t _frame = -1, const Color& _color = 0xffffffffu) – renders a texture;
  • void PushState() – saves the renderer state;
  • void PopState() – restores the renderer state;
  • void ApplyMatrix(const misc::FloatMatrix& _matrix) – applies an affine transformation.

CRenderer also maintains the state stack of the rendering machine: blend color, transformation matrices, current texture, and blend mode. The entire platform-specific implementation is hidden in the CRenderDevice classes (the lower-level API), which share the same interface:

class CRenderDevice
{
public:
	bool Init();
	void Reset();
	bool BeginScene();
	bool EndScene();
	void Render(RenderPrimitives _primitive, const RENDERVERTEX* const _verts, size_t _verts_count);
	void Render(RenderPrimitives _primitive, const void* const _verts, size_t _verts_count, DWORD _verts_fvf, DWORD _vertex_size);
	void Flush();
	void SetTexture(DWORD _stage, IDirect3DTexture9* _texture);
	void SetTextureStageState(DWORD _stage, DWORD _state, DWORD _val);
	DWORD GetTextureStageState(DWORD _stage, DWORD _state) const;
	void SetBlendMode(BlendModes _blend_mode);
	void SetPixelShader(IDirect3DPixelShader9* _shader);
	void SetRenderTarget(IDirect3DTexture9* _texture);
	bool GetAvailableResolutions(std::list &_container);
	bool ClearRenderTarget(const Color& _color = 0);
	void ToggleHeavyRenderProfile();
};

When the top-level API (CRenderer) is called, for example to render a texture, CRenderer takes the current state of the rendering machine, recalculates the texture's vertex array itself, and calls CRenderDevice::Render. Whenever the state of the rendering machine changes, Flush() is called inside CRenderDevice. For convenience there are also facilities for using shaders and rendering into textures, though these are rarely needed in HOPA games.

Sound output goes through the CAudioManager class, which in turn uses some library with an implementation on the specific platform (BASS, OpenAL, MultiStream, XAudio2, etc.). On different platforms this class may have either a two-level implementation (e.g., Windows and Mac OS X) or a single-level one (e.g., iOS); the latter is possible where the platform offers a convenient API for playing sounds. The class completely hides the details of sound playback, providing access to sounds only through their identifiers and sound-group identifiers. This minimizes the time spent wiring sounds into the game. For example, starting a specific track looks like this: sf::core::g_AudioManager::Instance().Play("some_music").

Perhaps the most problematic subsystem (but at the same time the most universal in terms of cross-platform implementation) is the video subsystem. This is due to how video is used in games: for example, four different video objects, including ones with an alpha channel, may be present in a game scene at once, which lowers the game's fps and increases memory consumption. At the moment, two cross-platform implementations, based on the Theora and WebM decoders, are used for Windows, Android, Mac OS X, and iOS; the latter is preferred. The top-level API of this subsystem is integrated by the CVideo class, whose interface, like CAudioManager's, is quite simple.
The only notable additions are the Update and Draw methods, which must be called to advance the decoder and to render the decoded texture, respectively. This lets us display several different videos in layers within the same scene. The lower-level API is implemented by classes inheriting from CVideo; these hide the details of working with a specific decoder, as well as the mixing of the regular video and its alpha channel. This approach minimized the cost of moving video from one platform to another.

While porting the Stargaze Framework to various platforms, we concluded that the two-level abstraction model for platform-dependent subsystems is the more flexible one. It smooths over the specifics of different platforms, providing a single principle for developing and porting games.
