MLAA: a morphological anti-aliasing algorithm for the CPU

    Intel has published a description of its morphological anti-aliasing algorithm (MLAA), which is designed to run in real time on the CPU (demo, source code).

    As with the depixelization algorithm for game graphics discussed on Habr a couple of months ago, Intel's algorithm does not scale the image: it works with pixels at the original resolution, modifying them according to a few simple rules shown in the diagram. In short, the MLAA filter looks for L-, Z- and U-shaped borders of pixel groups and then blends the surrounding pixels to produce smooth outlines.
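
    To make the idea concrete, below is a heavily simplified C++ sketch of the horizontal half of such a filter (the vertical pass is symmetric). This is not Intel's implementation: the grayscale Image type, the kThreshold edge threshold and the tent-shaped blending weight are all illustrative assumptions. Real MLAA classifies each run of separation lines as L-, Z- or U-shaped, revectorizes the border into straight lines, and derives exact per-pixel coverage weights from them.

// A minimal, illustrative sketch of the MLAA idea (not Intel's actual
// implementation): grayscale input, horizontal pass only; the vertical
// pass is symmetric.
#include <cstdio>
#include <cstdint>
#include <cstdlib>
#include <algorithm>
#include <vector>

struct Image {
    int w, h;
    std::vector<uint8_t> px;                      // row-major grayscale
    uint8_t at(int x, int y) const { return px[y * w + x]; }
    void set(int x, int y, uint8_t v) { px[y * w + x] = v; }
};

const int kThreshold = 16;                        // edge threshold (assumed)

// A "separation line" runs between (x,y) and (x,y+1) when the two
// pixels differ noticeably in value.
static bool hEdge(const Image& img, int x, int y) {
    return std::abs(img.at(x, y) - img.at(x, y + 1)) > kThreshold;
}

// Blend one horizontal run of separation lines [x0, x1) on row y.
// Real MLAA classifies the run as L-, Z- or U-shaped from the crossing
// edges at its ends and uses the exact area the revectorized border
// cuts from each pixel cell as the blending weight. Here a tent-shaped
// weight, strongest near the ends of the run, stands in for that area.
static void blendRun(Image& out, const Image& in, int x0, int x1, int y) {
    int len = x1 - x0;
    for (int x = x0; x < x1; ++x) {
        float t = (x - x0 + 0.5f) / len;          // position along the run
        float a = 0.5f - std::min(t, 1.0f - t);   // approximate coverage
        uint8_t top = in.at(x, y), bot = in.at(x, y + 1);
        out.set(x, y,     (uint8_t)(top + a * (bot - top) + 0.5f));
        out.set(x, y + 1, (uint8_t)(bot + a * (top - bot) + 0.5f));
    }
}

// One horizontal MLAA pass over an already rendered frame.
void mlaaHorizontal(Image& img) {
    Image out = img;
    for (int y = 0; y + 1 < img.h; ++y)
        for (int x = 0; x < img.w; ) {
            if (!hEdge(img, x, y)) { ++x; continue; }
            int x0 = x;
            while (x < img.w && hEdge(img, x, y)) ++x;  // extend the run
            blendRun(out, img, x0, x, y);
        }
    img = out;
}

int main() {
    // A jagged diagonal border between black and white: the classic
    // staircase that MLAA smooths out.
    Image img{8, 8, std::vector<uint8_t>(64, 0)};
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x)
            img.set(x, y, x / 4 > y / 4 ? 255 : 0);
    mlaaHorizontal(img);
    for (int y = 0; y < 8; ++y, std::puts(""))
        for (int x = 0; x < 8; ++x)
            std::printf("%4d", img.at(x, y));
}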

    Simplicity and operation at the source-pixel level are the main advantages of MLAA. No complex processing or alpha-channel work is required; there are no resource-intensive steps at all. The filter works frame by frame on an already rendered image, and its cost can be calculated precisely in advance. It does not depend on the complexity of the scene, so the absence of "slowdowns" can be guaranteed at any time. On the test machine (see the graph below), MLAA takes from 4 ms on a 1280×800 frame to 11 ms on a 1920×1200 frame, regardless of scene complexity.
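
    Since the cost depends on resolution alone, the per-frame budget can be estimated up front. Here is a minimal sketch, assuming the cost is roughly linear in pixel count and interpolating between the two figures quoted above (the function name mlaaMillis is hypothetical):

// Back-of-the-envelope estimate of MLAA frame cost at a given
// resolution, linearly interpolating between the two figures quoted
// above (4 ms at 1280x800, 11 ms at 1920x1200 on the test machine).
// Linearity in pixel count is an assumption, not a measurement.
#include <cstdio>

double mlaaMillis(int w, int h) {
    const double mp1 = 1280.0 * 800 / 1e6, t1 = 4.0;    // measured point 1
    const double mp2 = 1920.0 * 1200 / 1e6, t2 = 11.0;  // measured point 2
    double mp = (double)w * h / 1e6;                    // frame size, Mpx
    return t1 + (mp - mp1) * (t2 - t1) / (mp2 - mp1);
}

int main() {
    std::printf("1600x900:  ~%.1f ms\n", mlaaMillis(1600, 900));
    std::printf("2560x1440: ~%.1f ms\n", mlaaMillis(2560, 1440));  // extrapolated
}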

    This may well be the fastest anti-aliasing method that can run in near real time on large images without the help of a GPU. Intel positions MLAA as a competitor to MSAA (multisample anti-aliasing); the graph compares their performance.


    Intel Core i7-2820QM: Sandy Bridge, 4 cores / 8 threads, 2.3 GHz, GT2 graphics, 4 GB RAM, Windows 7 Ultimate 64-bit SP1

    The effect of MLAA is visible in the following images: a frame from the game Battlefield: Bad Company 2 without anti-aliasing (first image) and with MLAA enabled (second image).

    Here is an enlarged (500%) fragment of the first and second images.

    An even more illustrative example (thanks, oxitnik).

    via Hot Hardware, ExtremeTech, Intel
