Visual Effects in Skyforge: Art and Technology
All illustrations in this article are examples on test objects and do not reflect the final quality of the game. We would love to show off, but, sorry, we can't just yet.
My name is Dmitry Nikiforov, I am an effects artist at Allods Team. Work on the Skyforge project began for me in May 2011, before that I was making effects for the Allods Online MMORPG.
Effects have long played a major role in games. They support the game's overall atmosphere and style and make the gameplay more colorful and spectacular. For example, can you imagine a fireball without a flame and a spectacular explosion at the end?
Today, visual effects are no longer just informative markers for in-game events. Effects have evolved from animated graphics to an independent art discipline with its own rules, technical requirements and approach.
Technologically, effects push the entire gaming industry forward. Achieving a particular visual effect takes extensive research and a huge number of tests.
How do we make effects? It depends on what exactly needs to be produced and what type of effect is required. If we are working on a waterfall or a fire-breathing dragon, the artists draw concept art and hand it over to the FX artists. If the effect accompanies a particular character ability, like the aforementioned fireball, the process is somewhat more complicated. You cannot simply sit down and draw it. To realize the designer's intent, the whole team has to gather and brainstorm. The visual design is developed to show particular game-mechanics events as effectively and spectacularly as possible. Next, you need to select samples of the "materials" the desired effect will consist of: stills from films and photographs are collected, and rough sketches are even made on scraps of paper. If necessary, concept art for the future effect is compiled into a storyboard. Then the game-mechanics design is tested with placeholder effects: does it look good? Only then does the effect go into production.
Any successful MMO project must develop continuously, including technologically. A prime example is Blizzard's WoW: just compare its set of gameplay and graphics features at launch with what it offers today.
For Skyforge, we said no to the middleware that Maya had long been for our effects and wrote our own editor from scratch, as all advanced game engines do. There were several reasons for this. First, back in Allods there was already a tradition of building only small "parts" of future effects in Maya and assembling the finished effect in the game editor. Second, effects look different in the Maya viewport and in the game editor. Third, Maya's particle engine was simply not enough for the new project.
Having our own effects editor saves a lot of time on export and on subsequent tuning of the visualization in the game engine, letting you see the result of your work immediately. The editor's capabilities are constantly expanding to meet the demands of modern technology: any artist can "order" the tools he needs. The editor has no name yet. (Any ideas? Suggestions are welcome! :)
And what comes to mind when you hear about effects in games? Surely dynamic simulations like smoke, fire or explosions. In reality, however, most games contain no dynamic effects computed in real time.
An example of an effect with PhysX technology and without it.
It turns out that effects are easier and more convenient to compute in advance, storing the simulation result in small binary files.
First, we always get a predictable result. Handing the computation of an effect to a real-time physics engine, you cannot be sure the output will be exactly what the artist intended. In combined effects, where the main part is precomputed and the remainder is simulated in real time, the real-time part stands out from the overall picture despite the general increase in detail and particle count. The biggest problem is collision handling.
Second, this places an additional load on the dynamics engine, which still cannot deliver the full variety of effects in real time.
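The idea of baked simulation playback can be sketched in a few lines. This is an illustrative Python snippet, not our editor's actual code or file format: positions recorded one per simulation frame are linearly interpolated at playback time, exactly like ordinary keyframed animation.

```python
def sample_baked(frames, frame_rate, t):
    """Linearly interpolate a particle position from pre-recorded frames.

    frames: list of (x, y, z) tuples, one per baked simulation frame.
    frame_rate: frames per second the simulation was baked at.
    t: playback time in seconds.
    """
    f = t * frame_rate
    i = int(f)
    if i >= len(frames) - 1:           # clamp at the last baked frame
        return frames[-1]
    a, b = frames[i], frames[i + 1]
    w = f - i                          # blend weight between the two frames
    return tuple(pa + (pb - pa) * w for pa, pb in zip(a, b))

# A tiny two-frame "baked file": the particle moves from the origin to (1, 0, 0).
baked = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
print(sample_baked(baked, frame_rate=30, t=1.0 / 60.0))  # (0.5, 0.0, 0.0)
```

The point is that playback cost does not depend on how expensive the original simulation was.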
In Skyforge, all effects are precomputed (for now). Full dynamic simulation exists only in the editor: its result is recorded and then played back at the right moment, like ordinary animation. As I already said, the capabilities of our editor are much broader than Maya's (in terms of game effects). Besides the usual billboarded sprites (rectangular planes that always face the camera), there are also directed sprites (planes with one axis pointing in a given direction, usually the direction of motion).
Particles of the type “billboarded sprites”
Particles of the "directed sprites" type and an example of an effect where they are used.
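The math behind a billboarded sprite is simple: each frame, a right/up basis is built from the direction to the camera so the quad always faces it. A minimal Python sketch (function names are mine, not the engine's):

```python
import math

def normalize(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def billboard_axes(particle_pos, camera_pos, world_up=(0.0, 1.0, 0.0)):
    """Right/up axes for a quad that always faces the camera."""
    view = normalize(tuple(c - p for c, p in zip(camera_pos, particle_pos)))
    right = normalize(cross(world_up, view))
    up = cross(view, right)
    return right, up

# Camera looking down +Z at the particle: the quad lies in the XY plane.
right, up = billboard_axes((0, 0, 0), (0, 0, 5))
print(right, up)  # (1.0, 0.0, 0.0) (0.0, 1.0, 0.0)
```

A directed sprite would simply replace one of these axes with the particle's velocity direction instead of deriving both from the camera.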
Instead of sprites, particles can be rendered as software trails: geometric ribbons built procedurally every frame over the lifetime of a particle.
Particles of the “trails” type and an example of the effect where they are used.
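The procedural construction of such ribbons can be illustrated with a toy 2D example: each recorded point of the trail is extruded perpendicular to the direction of motion into a left/right vertex pair. A Python sketch (simplified; a real implementation works in 3D and also handles texture coordinates):

```python
import math

def build_ribbon_2d(points, half_width):
    """Extrude a polyline into ribbon vertices (left/right pairs), in 2D.

    For each point we take the direction to the next point and offset
    perpendicular to it; the last point reuses the previous segment.
    """
    verts = []
    for i, (x, y) in enumerate(points):
        j = min(i, len(points) - 2)          # segment used for direction
        (x0, y0), (x1, y1) = points[j], points[j + 1]
        dx, dy = x1 - x0, y1 - y0
        l = math.hypot(dx, dy)
        nx, ny = -dy / l, dx / l             # left-hand normal of the segment
        verts.append(((x + nx * half_width, y + ny * half_width),
                      (x - nx * half_width, y - ny * half_width)))
    return verts

# A trail moving along +X produces vertex pairs above and below the path.
print(build_ribbon_2d([(0, 0), (1, 0), (2, 0)], half_width=0.5))
```

Joining consecutive pairs into quads gives the ribbon geometry that is rebuilt every frame as the particle moves.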
We can also replace the particle's render with full geometry modeled beforehand in Maya, so-called instances.
Particles of the “instanced geometry” type and an example of the effect where they are used.
In addition, we can render light sources instead of particles.
Particles of the “lights” type and an example of the effect where they are used.
The game engine uses deferred shading for rendering, where the final picture is composed from many precomputed layers of the G-buffer. Since this approach cannot correctly display translucent objects (which, above all, include effects), they are rendered separately. Usually a lightweight shader, independent of lighting, is used for effects, and so that the effects do not stand apart from the surrounding world, color grading that accounts for the local lighting is applied to them.
Blending options depend on the type of particles and the material settings, which can have several shader code variants. For particle sprites, these are additive and alpha blending. In addition, there is a special shader for liquids, which gains color depending on the "thickness" of the layer, changes its refraction parameters, and varies the intensity of highlights and reflections on the surface depending on the viewing angle.
A particle with the "water shader" material and an example of an effect where it is used.
The geometry generated by the particle emitter can use any shader code in the game, but most often effects use a lightweight shader with additive or alpha blending. Trails use a separate shader with four blending options: alpha, additive, multiply, and even inverse.
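The blending modes mentioned above boil down to simple per-pixel equations. A Python sketch of the three classic ones, per color channel (the exact formulas in our shaders may differ):

```python
def blend(mode, src, dst, alpha):
    """Per-channel result of compositing src over dst (all values in 0..1)."""
    if mode == "alpha":      # classic transparency
        return src * alpha + dst * (1.0 - alpha)
    if mode == "additive":   # fire, glows: brightens, never darkens
        return min(1.0, dst + src * alpha)
    if mode == "multiply":   # smoke, dark trails: darkens the background
        return dst * (src * alpha + (1.0 - alpha))
    raise ValueError(mode)

bg = 0.5  # mid-gray background pixel
print(blend("alpha", 1.0, bg, 0.5))     # 0.75
print(blend("additive", 1.0, bg, 0.5))  # 1.0
print(blend("multiply", 0.0, bg, 0.5))  # 0.25
```

This is why additive particles work so well for fire (they can only brighten the scene) while multiply suits smoke and shadowy effects.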
The effects editor allows you to control some shading parameters, such as color, transparency and self-illumination, separately for each particle. That is why most of our textures are desaturated. The textures themselves are collected into texture atlases: one or more images packed into the same texture space (sequences of explosions, fire, smoke, etc.).
Textures can have several mask channels to simulate various material properties. The main texture contains color and a transparency mask. Depending on the material, the texture atlas may also contain a normal map and masks for self-illumination, glossiness, highlight intensity, and light transmission.
An example of a texture atlas.
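Playing a sequence from such an atlas amounts to computing the UV rectangle of the current frame. A small Python sketch, assuming a regular row-major grid (our actual atlas layout may differ):

```python
def atlas_uv(frame, cols, rows):
    """UV rectangle (u0, v0, u1, v1) of one frame in a cols x rows atlas.

    Frames are numbered row-major starting from the top-left corner.
    """
    col = frame % cols
    row = frame // cols
    w, h = 1.0 / cols, 1.0 / rows
    return (col * w, row * h, (col + 1) * w, (row + 1) * h)

# Frame 5 of a 4x4 explosion sequence: second row, second column.
print(atlas_uv(5, 4, 4))  # (0.25, 0.25, 0.5, 0.5)
```

Advancing `frame` with the particle's age plays the baked explosion, fire or smoke sequence on a single quad.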
Particles can be lit by global light sources and also respond to IBL (image-based lighting). The figure shows how smoke is lit depending on the position of the "sun", and also demonstrates the self-illumination mask at work.
Particles and lighting.
The effects editor allows you to generate particles in several ways.
You can specify a specific number of particles and set the position and time of appearance for each, or you can generate them at a constant interval per second.
There are point and volumetric particle generators, and various "force fields" to produce the desired behavior: gravity, turbulence, pushing and pulling fields. There are also auxiliary collision objects that limit particle movement with a plane or a volume. The figure below shows a "sphere" collision object and a gravity field at work.
A "collision" object.
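The pieces above (constant-rate emission, a gravity field and a collision object) can be combined into a toy simulation. A 2D Python sketch, illustrative only; it is not our editor's code, and all parameters are made up:

```python
import math

def simulate(rate, steps, dt, gravity=-9.8, sphere_center=(0.0, 0.0), sphere_r=1.0):
    """Emit `rate` particles per second and integrate them under gravity.

    Particles spawn at rest at (0, 5); a "sphere" collision object at the
    origin removes any particle that falls inside it.
    """
    particles = []       # each particle: [x, y, vx, vy]
    spawn_debt = 0.0     # accumulator for fractional spawns per step
    for _ in range(steps):
        spawn_debt += rate * dt
        while spawn_debt >= 1.0:          # constant-interval emission
            particles.append([0.0, 5.0, 0.0, 0.0])
            spawn_debt -= 1.0
        for p in particles:
            p[3] += gravity * dt          # gravity force field
            p[0] += p[2] * dt
            p[1] += p[3] * dt
        # sphere collision object: keep only particles outside the sphere
        particles = [p for p in particles
                     if math.hypot(p[0] - sphere_center[0],
                                   p[1] - sphere_center[1]) > sphere_r]
    return particles

# 10 particles/sec over one simulated second; the earliest two fall into the sphere.
alive = simulate(rate=10, steps=10, dt=0.1)
print(len(alive))  # 8
```

In the real editor the same simulation would run once, and the resulting positions would be baked to a file for playback.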
Each effect consists of one or more so-called particle systems. A similar principle can be seen in the Maya program.
A particle system includes one or more particle generators, and all particles inside the system obey the same rules, whether color, dynamic fields, or collisions. All particles in a system can be of only one type (sprites, instances, or trails) and share the same material. A typical effect contains 3-7 particle systems, while a complex effect can contain more than 10. Each system needs its own draw call to render, so the fewer particle systems in an effect, the easier it is for the video card to display.
In games, as in cinema, no one cares how plausibly and physically correctly the objects and particles in an effect interact. What matters is spectacle. But in games, spectacle must be balanced against frequency of use.
Let's say we have a classic mage character who can throw fireballs. If this is a rare ability that can be applied no more than once per minute, then you can make a big "epic" explosion with complex animation that will be displayed for a fairly long time - up to several seconds.
If it is a "spam ability" that can be used as often as every second, the explosion will be completely different: a small, non-irritating effect that also needs to disappear quickly. Such effects may live only 0.25 seconds.
A game is first and foremost gameplay. Effects are secondary; they support the mechanics. If you followed this rule to the letter, there would be no place in the game for prettiness at all: the visuals would be as dry as checkers or road signs, because no matter how beautiful a picture is, it inevitably draws attention away from the gameplay. Fortunately, the mass audience we are targeting loves and expects beautiful things. The effects artist's task is to devise and implement that beauty so that it complements the game mechanics rather than contradicting them.
The styling of effects in Skyforge also differs from Allods Online. The renderer uses global illumination with HDR adaptation and dynamic light sources, all of which imposes certain requirements on visual effects. Each of them should look like a specific physical phenomenon (even if it is magic) rather than some abstract something, otherwise the effect would stand out from the overall style. If in Allods we made a cartoon, in Skyforge we are making a Hollywood blockbuster.
A separate article could be written about the art of effects, so I will stop here. It is always better to see something once than to read about it ten times, even with pictures. You will get that opportunity in the game's closed beta test.
The profession of effects artist is a constant challenge: the need to achieve a plausible result under severe constraints. Most often it is a deception of the viewer, a mixture of technology and artistic skill. Our specialty demands the ability to observe and understand the dynamics of natural phenomena, technical and technological processes, and even the workings of simple household appliances like the gas burner in a stove. So if you see someone "stuck" staring at a waterfall, a fountain or a burning trash bin, don't jump to conclusions: perhaps he is just an effects artist.
Other materials can be found on the Skyforge developers' website and in our Vkontakte community.