Full-screen animation in an iOS game, or what to do when the graphics just won't fit into memory

I should say right away that we wasted a lot of time on this project, but in return we gained some useful experience that I think many will find interesting to read about, if only to avoid stepping on the same rake. If you work with large animations on iOS, read on.


So, everyone has probably seen the game with a talking cat that repeats everything you say to it in a funny voice. Our customer wanted to make a similar game, but with a different character.

Here I want to talk only about handling the graphics. I won't dwell on the rest, although along the way we also wrote an SDK for voice distortion that, however crudely, can drive the opening of the character's mouth from the spoken phrase, and can easily be integrated into games of this kind.

So, during negotiations with the customer, it turned out that the graphics for the animations were already done: in JPEG format, full screen, for every kind of iOS device, including the iPad3 with its wild screen resolution (2048x1536, as a reminder). Moreover, the animations had been rendered at 60 frames per second. Trying to build the game themselves, they had run into a pile of difficulties that they preferred to dump on someone else's head, which is why they came to us.

At first I was horrified by this approach to preparing the graphics, considering it far from optimal.
My first thought was that the background and the character should have been separated.
On the other hand, the character's animation would then have to be stored with an alpha channel so it could be composited over the background, i.e. as PNGs, and that means several times the weight in graphics, and hence in the application.
Then I set those doubts aside, concluding that their variant was not so bad: JPEG-compressed graphics don't weigh much, and we could simply crop the character so that only that cut-out region of the screen gets animated, overlaid on a background taken from one of the full animation frames.

Another thing that occurred to me immediately: with frames this size, texture atlases were out of the question, because the maximum texture size supported by the devices (2048x2048 or 4096x4096, depending on the device and iOS version) wouldn't fit even a couple of frames. So when loading an animation, we would be loading each frame from the file system separately.
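By the way, that maximum doesn't have to be hardcoded per device; it can be queried at runtime. A minimal sketch (assuming an OpenGL ES context is already current):

```swift
import OpenGLES

// Query the largest texture dimension the current device supports.
// Requires an EAGLContext to be set as current beforehand.
func maxTextureSize() -> GLint {
    var size: GLint = 0
    glGetIntegerv(GLenum(GL_MAX_TEXTURE_SIZE), &size)
    return size // e.g. 2048 or 4096, depending on the GPU
}
```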

It was also probably worth persuading them to cut the built-in fps of the animations to, say, 15 frames per second, which in my experience is quite enough for animating character movement. That turned out to be easy. As a result, every fourth frame was kept.
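Keeping every fourth frame is trivial to script; a sketch of such a pass (the file naming and folder conventions here are our assumptions, not the project's actual script):

```swift
import Foundation

// Keep every n-th frame to thin a 60 fps sequence down to 15 fps (n = 4).
// Assumes frame files sort lexicographically in playback order.
func decimateFrames(from src: URL, to dst: URL, keepingEvery n: Int = 4) throws {
    let fm = FileManager.default
    let frames = try fm.contentsOfDirectory(at: src, includingPropertiesForKeys: nil)
        .filter { $0.pathExtension.lowercased() == "jpg" }
        .sorted { $0.lastPathComponent < $1.lastPathComponent }
    for (i, frame) in frames.enumerated() where i % n == 0 {
        try fm.copyItem(at: frame, to: dst.appendingPathComponent(frame.lastPathComponent))
    }
}
```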

It also turned out that the customers had no clue about the Cocos2d engine, and they liked how smoothly and quickly their animation ran under OpenGL in the simple Cocos2d example I threw together as a demo.
We shook hands and set to work.

Attempt number one


First of all, we trimmed all their animations so as to animate (read: keep in memory) only the part of the frame where something actually changes. For simplicity, though, we decided to make all frames the same size: we determined a zone that frames the changing part across the whole animation and cropped only that zone out of every frame.



For this we used a Photoshop action, which we edited for each animation to set the size of the crop zone, and applied it to the entire animation folder.
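For those who prefer not to eyeball the crop zone in Photoshop, it can also be computed: diff each frame against the first and take the union of the changed regions. A rough sketch of the idea (decoding via Core Graphics; helper names are hypothetical):

```swift
import Foundation
import CoreGraphics
import ImageIO

// Decode an image file into a raw RGBA buffer of the given size.
func rgbaPixels(of url: URL, width: Int, height: Int) -> [UInt8]? {
    guard let src = CGImageSourceCreateWithURL(url as CFURL, nil),
          let img = CGImageSourceCreateImageAtIndex(src, 0, nil) else { return nil }
    var buf = [UInt8](repeating: 0, count: width * height * 4)
    buf.withUnsafeMutableBytes { raw in
        guard let ctx = CGContext(data: raw.baseAddress, width: width, height: height,
                                  bitsPerComponent: 8, bytesPerRow: width * 4,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return }
        ctx.draw(img, in: CGRect(x: 0, y: 0, width: width, height: height))
    }
    return buf
}

// Bounding box of all pixels that differ between two equally sized frames.
func changedRect(_ a: [UInt8], _ b: [UInt8], width: Int, height: Int) -> CGRect? {
    var minX = width, minY = height, maxX = -1, maxY = -1
    for y in 0..<height {
        for x in 0..<width {
            let i = (y * width + x) * 4
            if a[i] != b[i] || a[i + 1] != b[i + 1] || a[i + 2] != b[i + 2] {
                minX = min(minX, x); minY = min(minY, y)
                maxX = max(maxX, x); maxY = max(maxY, y)
            }
        }
    }
    guard maxX >= 0 else { return nil } // frames are identical
    return CGRect(x: minX, y: minY, width: maxX - minX + 1, height: maxY - minY + 1)
}

// The crop zone for a whole animation is then the union over all frames:
// rects.reduce(CGRect.null) { $0.union($1) }
```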

Fully confident we were on the right track, we processed all their animations this way and only then began assembling test cases.

Not that our next problem came as a surprise; I simply hadn't wanted to bother solving it before appreciating its scale.
It was clear that we would not be able to preload all the animations into memory, yet according to the spec quite a few animations are triggered by the user and therefore have to be ready for immediate playback at any moment. So they, at least, need to be kept in memory.
A few tests soon showed that this was unrealistic: we exceeded the limit on usable memory.

Fine, we decided, then the only option is to load them right before playback. That could work for short animations, but anything longer caused unacceptable loading delays of up to 10-12 seconds. Which is understandable: iOS has to unpack the JPEG into a raw bitmap somewhere and then hand it to OpenGL as a texture, and in that form a frame is already several megabytes, even though the source image weighs under 100KB.
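To put numbers on it: a full-screen iPad3 frame is 2048 × 1536 pixels, and a decoded RGBA bitmap takes 4 bytes per pixel, so 2048 × 1536 × 4 = 12 MB per frame, a roughly hundredfold blow-up relative to the sub-100KB JPEG on disk.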

Another problem that surfaced here was the huge amount of memory these textures occupy in unpacked form. Some animations simply didn't fit into memory in their entirety. So we came to the realization that we couldn't do without the good old PVRTC format.

Attempt number two


PVRTC (we used the 4 bpp variant) has several advantages over JPEG.
First, it requires no conversion on the device: the image can be used as a texture right away, which saves serious time when graphics are first loaded. Another important point is that PVRTC images weigh less than the raw bitmaps that any JPEG ultimately turns into inside OpenGL, which means we can hold several animations in memory at once, even the longest ones.
But there are downsides too. The main one is application size: although PVRTC files weigh less than raw bitmaps, they are many times larger than JPEGs.
In other words, what we gain in graphics preload speed we pay for in application weight.
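To illustrate the "no conversion" point: the PVRTC payload goes to OpenGL verbatim via glCompressedTexImage2D. A minimal sketch, assuming the file holds a bare 4 bpp payload with any container header already stripped (real .pvr files carry a header that must be skipped):

```swift
import Foundation
import OpenGLES

// Hand a raw 4 bpp PVRTC payload to OpenGL as-is; no decoding on the CPU.
// `side` is the square texture's dimension, a power of two.
func loadPVRTC4(_ data: Data, side: GLsizei) -> GLuint {
    var tex: GLuint = 0
    glGenTextures(1, &tex)
    glBindTexture(GLenum(GL_TEXTURE_2D), tex)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    data.withUnsafeBytes { raw in
        glCompressedTexImage2D(GLenum(GL_TEXTURE_2D), 0,
                               GLenum(GL_COMPRESSED_RGBA_PVRTC_4BPPV1_IMG),
                               side, side, 0,
                               GLsizei(data.count), raw.baseAddress)
    }
    return tex
}
```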

Another minus is the restrictions on geometric size and proportions: PVRTC files must be square, with a side that is a power of two.
After estimating the total volume of files if we brought every frame up to 2048x2048 for the iPad3, we immediately abandoned that idea and settled on two sizes: 512x512 frames for the iPhone, and 1024x1024 textures for all iPads. During playback the difference in picture quality between the iPad3 and iPad2 is not noticeable anyway: the physical screen size is the same, and the pictures change quickly.
So we somehow had to bring all our graphics, with its chaotic frame sizes (say, 1355x1278), down to 1024x1024 (and 512x512 for the iPhone).

Without further ado, we decided to simply scale the already-cropped graphics so that, preserving proportions, each frame comes to a width of 1024, with the remaining vertical space filled with transparency, and then in code stretch the loaded images back to their original size.
In cases where the original height exceeded the width, we would attach a second square flush underneath, containing the missing part of the image.
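The arithmetic of that layout is simple; a sketch (the names are ours, 1024 is the iPad case):

```swift
import CoreGraphics

// How a cropped frame maps onto 1024x1024 PVRTC squares: scale to width 1024
// preserving aspect ratio; if the scaled height still exceeds 1024, the
// overflow goes into a second square attached underneath.
func squaresNeeded(for frame: CGSize, side: CGFloat = 1024) -> (scaled: CGSize, count: Int) {
    let scale = side / frame.width
    let h = (frame.height * scale).rounded()
    return (CGSize(width: side, height: h), h > side ? 2 : 1)
}

// Example: the 1355x1278 frame mentioned above scales to 1024x966 -> one square.
```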



It took two days to scale and reconvert our images to PVRTC. For the conversion we used the texturetool utility that ships with Xcode, plus a simple shell script of ours to process all the files in a folder; for the scaling, Photoshop and its actions, of course.
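Our script itself is not shown here, but the batch conversion looked roughly like this; a sketch that shells out to texturetool (the exact flags and the xcrun invocation are our best recollection of the tool, so treat them as assumptions):

```swift
import Foundation

// Convert every PNG in a folder to 4 bpp PVRTC with Xcode's texturetool.
func convertFolderToPVRTC(_ folder: URL) throws {
    let pngs = try FileManager.default
        .contentsOfDirectory(at: folder, includingPropertiesForKeys: nil)
        .filter { $0.pathExtension.lowercased() == "png" }
    for png in pngs {
        let out = png.deletingPathExtension().appendingPathExtension("pvr")
        let p = Process()
        p.executableURL = URL(fileURLWithPath: "/usr/bin/xcrun")
        p.arguments = ["-sdk", "iphoneos", "texturetool",
                       "-e", "PVRTC", "--bits-per-pixel-4",
                       "-o", out.path, png.path]
        try p.run()
        p.waitUntilExit()
    }
}
```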
An interesting detail: the weight of each PVRTC file is determined solely by its geometric dimensions. Each iPhone frame took 128KB, and each iPad square weighed 512KB, regardless of content.
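That is exactly what 4 bits per pixel predicts: 512 × 512 × 0.5 bytes = 128 KB and 1024 × 1024 × 0.5 bytes = 512 KB. Unlike JPEG, PVRTC has no entropy coding, so simple content buys you nothing.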
Of course, the application's size ballooned to an indecent few hundred megabytes, but animations loaded almost instantly, and for most of them we could do exactly that: load them the moment they are needed, play them, and immediately unload them from memory. Still, the long animations required a second or two of preloading, which, of course, jarred the customer's eye.

Then we decided to perform an "autopsy" and see how the cat game was actually put together. Several important points emerged:
First, all of its animation graphics are full-screen JPGs, and they are monstrously compressed; apparently, like us, its developers were counting on picture quality being hard to judge during playback.
Second, judging by the number of frames and the actual playing time of the animations in the game, they had limited themselves to 5-6 frames per second.
We also found that they used the same graphics for both iPads.

After that, we decided to check one assumption.

Attempt number three


We suspected that these guys simply drive all the animations through UIKit without using OpenGL at all. Indeed, why would you need OpenGL if you are content with 5 frames per second?

For fun, we decided to try playing our original graphics through UIKit: simply loading each full-size frame into a UIImageView in sequence.
It turned out that in this mode the iPad2 manages 30 frames per second, and the iPad3 (despite a picture four times the size) delivers a quite decent 20 frames per second!
And that is without preloading the animation at all: we load one frame, display it, immediately unload it from memory and replace it with the next, with no delay set between frames.
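The playback loop really is that trivial; a minimal sketch (the CADisplayLink-driven timing and the class shape are our assumptions; the important detail is UIImage(contentsOfFile:), which, unlike imageNamed, does not cache):

```swift
import UIKit

// Plays a frame sequence by swapping images in a UIImageView, one per tick.
final class FramePlayer {
    private let imageView: UIImageView
    private let framePaths: [String]
    private var index = 0
    private var link: CADisplayLink?

    init(imageView: UIImageView, framePaths: [String]) {
        self.imageView = imageView
        self.framePaths = framePaths
    }

    func play() {
        link = CADisplayLink(target: self, selector: #selector(step))
        link?.add(to: .main, forMode: .common)
    }

    @objc private func step() {
        guard index < framePaths.count else { link?.invalidate(); return }
        // contentsOfFile does not cache (unlike imageNamed), so the previous
        // frame's bitmap is freed as soon as the new one replaces it.
        imageView.image = UIImage(contentsOfFile: framePaths[index])
        index += 1
    }
}
```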
What a discovery! We could have done nothing at all and everything would have worked just fine: JPG graphics make the application much smaller, and nothing has to be kept in memory. With this approach an animation can be of any length, limited only by the application size you are willing to accept.
In the end, we rewrote the game using this approach.

My conclusion from all this: if you have to deal with large animations and a not-too-high frame rate suits you, it may be worth looking in the direction of UIImageView (perhaps even in a separate layer on top of your Cocos2d scene). It may well turn out to be the best solution.

In fairness, I'll note that I have omitted other important considerations that our decisions had to account for. For example, the game had to be able to record what happens on screen as video, for which we initially wanted to use a popular framework that works only with Cocos2d, since it captures video from OpenGL. That was one of the reasons we clung to Cocos2d. But in the end, since all the animations were implemented through UIKit, we had to implement video recording ourselves.
