How to develop your own photo editor for iOS. VKontakte Competition Report

    Hello to all Habr residents and everyone interested!
    Yesterday the (sudden) deadline arrived for submitting projects to the first stage of yet another glorious photo-editor contest for iOS from VKontakte. In this article I want to share my experience and talk about the pitfalls and problems I ran into while developing my version of this product.


    The contest requirement for shooting - "All filters should work in real time and not slow down the application" - raised the question of which engine to take. The answer is googled in 5 seconds and is called GPUImage. One of my friends (we will not name him) said that it is buggy, that it has 140 open issues, and that in general it is easier to write everything yourself. But I begrudged the time, and besides, I assess my abilities objectively, so I took it. I think almost all the participants used this particular library :)
    Of course, there were plenty of problems with GPUImage, but everything seems solvable.
    For example, a big problem with memory consumption: memory leaks off in an unknown direction, which makes the application get killed. Whether that is a problem in your code or in the library is not entirely clear. Although the library ships with a whole bunch of examples, some subtle points can become a headache.
    So, a bug crept into my code: I was doing something like

        [stillCamera addTarget:filter];
        [filter prepareForImageCapture];

    twice, because of which the application consumed a cosmic amount of memory and crashed while shooting (especially with the blur).
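    A simple way to guard against the double call (a sketch; the filterPrepared flag is my own, not from the original code):

        // Make sure the target is attached and prepared exactly once:
        // calling prepareForImageCapture twice leaks the capture framebuffer.
        if (!filterPrepared) {
            [stillCamera addTarget:filter];
            [filter prepareForImageCapture];
            filterPrepared = YES;
        }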
    Another problem is that all the filters internally generate full-resolution textures, so I decided that, much like 640 KB once was, 1024x1536 pixels is enough for everyone, and forced processing to a given size:

        CGSize forceSize = CGSizeMake(1024, 1024 * 1.5);
        if (!frontCameraSelected) {
            [processFilter forceProcessingAtSize:forceSize];
        }
        @autoreleasepool {
            [stillCamera capturePhotoAsJPEGProcessedUpToFilter:processFilter ...

    However, I must admit that over the past two weeks Brad Larson, the library's developer, has fixed several problems, and he generally keeps in touch. In a word, well done!


    Filters

    This is perhaps the hardest part from the programmer's point of view, because all the bundled filters had to be hand-picked. To pick them, I wrote an auxiliary iPad application that let me select filters with parameters that could be used directly with GPUImage.

    An approximate process of filter selection:

    I spent about 3 days (1-2 hours each) picking the bundled filters, and then started having fun with extra ones. For example, the eight-bit filter is my favorite:

    So I created a filter class configured by a group of settings. Omitting initialization and internal calls, it looks something like this:
         Contrast : 1.032491
         Gamma : 1.196751
         Sepia : 0.442238
         Saturation : 1.198556

        GPUImageContrastFilter *contrast = [[GPUImageContrastFilter alloc] init];
        contrast.contrast = 1.032491f;
        [self prepare:contrast];

        GPUImageGammaFilter *gamma = [[GPUImageGammaFilter alloc] init];
        gamma.gamma = 1.196751f;
        [self prepare:gamma];

        GPUImageSepiaFilter *sepia = [[GPUImageSepiaFilter alloc] init];
        sepia.intensity = 0.442238f;
        [self prepare:sepia];

        GPUImageSaturationFilter *saturation = [[GPUImageSaturationFilter alloc] init];
        saturation.saturation = 1.198556f;
        [self prepare:saturation];
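    For reference, GPUImage ships a GPUImageFilterGroup that can be wired up for the same purpose; the prepare: helper above is the author's own and not shown, so this is only my guess at an equivalent chain (contrast -> gamma -> sepia -> saturation):

        GPUImageFilterGroup *group = [[GPUImageFilterGroup alloc] init];
        [group addFilter:contrast];
        [group addFilter:gamma];
        [group addFilter:sepia];
        [group addFilter:saturation];
        // Chain the filters so each one feeds the next
        [contrast addTarget:gamma];
        [gamma addTarget:sepia];
        [sepia addTarget:saturation];
        // Tell the group where the chain starts and ends
        group.initialFilters = @[contrast];
        group.terminalFilter = saturation;

    After that, the group can be used anywhere a single filter is expected, e.g. as a target of the camera.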

    Then a new problem arises with the filters: they slow real-time rendering down :) Unfortunately, for many filters I could not defeat this completely, although there was an idea not to combine filters into groups but to merge them into a single shader right away. I tried it with one filter, but did not win much performance, so I decided to spare the time and drop this approach.
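    For completeness, here is roughly what merging two of the steps into one shader looks like (a sketch, not the author's code: the body combines GPUImage's stock contrast and gamma formulas with the constants from above hard-coded in):

        NSString *const kContrastGammaShader = SHADER_STRING
        (
            varying highp vec2 textureCoordinate;
            uniform sampler2D inputImageTexture;

            void main()
            {
                lowp vec4 color = texture2D(inputImageTexture, textureCoordinate);
                // Contrast, then gamma, in a single pass
                lowp vec3 rgb = (color.rgb - vec3(0.5)) * 1.032491 + vec3(0.5);
                rgb = pow(rgb, vec3(1.196751));
                gl_FragColor = vec4(rgb, color.a);
            }
        );

        GPUImageFilter *combined =
            [[GPUImageFilter alloc] initWithFragmentShaderFromString:kContrastGammaShader];

    Each filter merged this way saves one render pass and one intermediate texture, which is where the potential performance win would come from.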


    One of the conditions of the contest was image scaling and fitting. UIScrollView handles zooming very well, but we need to "take the result" in a GPUImageView. So I resorted to a trick and applied a GPUImageTransformFilter transform filter to the image. I computed the transform from the zoom and drag state of a UIScrollView lying on the top layer.
    The code for the transformation is as follows:

    - (void)scrollViewDidScroll:(UIScrollView *)scrollView
    {
        CGPoint offset = scrollView.contentOffset;
        CGSize size = scrollView.contentSize;
        CGFloat scrollViewWidth = scrollView.frame.size.width,
                scrollViewHeight = scrollView.frame.size.height;
        translationX = 0;
        float a = size.width - scrollViewWidth, b = size.height - scrollViewHeight;
        if ((int)a != 0) {
            translationX = (a / scrollViewWidth) * (0.5f - offset.x / a) * 2;
        }
        translationY = 0;
        if ((int)b != 0) {
            translationY = (b / scrollViewHeight) * (0.5f - offset.y / b) * 2;
        }
        if (size.height < size.width) {
            translationX *= aspectRatio;
            translationY *= aspectRatio;
        }
        CGFloat zoom = scrollView.zoomScale / scrollView.minimumZoomScale;
        CGAffineTransform resizeTransform = CGAffineTransformMakeScale(zoom, zoom);
        resizeTransform.tx = translationX;
        resizeTransform.ty = translationY;
        transformFilter.affineTransform = resizeTransform;
        [self fastRedraw]; // redraw the scene
    }

    Perhaps the strangest thing for me here was that for a landscape image the result has to be multiplied by the aspect ratio. Honestly, I arrived at that more by trial and error than by understanding.
    Besides, zooming and dragging with a filter applied is a bad idea, because it slows down brutally. So I disable the filter for the duration of these gestures and re-enable it afterwards. It works just fine.
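    One way to do this is with UIScrollViewDelegate callbacks (a sketch; picture, heavyFilter, and fastRedraw are placeholders for the actual source image, filter chain, and redraw call, not the author's exact code):

        - (void)scrollViewWillBeginDragging:(UIScrollView *)scrollView
        {
            // While the gesture is active, bypass the expensive filter:
            // picture -> transformFilter
            [picture removeAllTargets];
            [picture addTarget:transformFilter];
            [self fastRedraw];
        }

        - (void)scrollViewDidEndDecelerating:(UIScrollView *)scrollView
        {
            // Gesture finished: restore the full chain
            // picture -> heavyFilter -> transformFilter
            [picture removeAllTargets];
            [heavyFilter removeAllTargets];
            [picture addTarget:heavyFilter];
            [heavyFilter addTarget:transformFilter];
            [self fastRedraw];
        }

    The same pair of hooks exists for zooming (scrollViewWillBeginZooming:withView: and scrollViewDidEndZooming:withView:atScale:).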

    iPhone 5 support

    This is not a very difficult topic, but you need to keep it in mind. The application should not just stretch; it should also behave a little differently. Fortunately, autoresizing solves 80% of the problems; the remaining 20% are solved in code using one well-known method:
    - (BOOL)hasFourInchDisplay {
        return ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone &&
                [UIScreen mainScreen].bounds.size.height == 568.0);
    }
    Use this check in the important places to adjust animations and frames to the new size, and everything will be fine. You just need to remember to test on the new iPhone, or at least in the simulator.
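    A typical usage (the toolbar name and frame values here are purely illustrative):

        CGRect frame = toolbar.frame;
        if ([self hasFourInchDisplay]) {
            // The 4-inch screen is 88 points taller; push the toolbar down accordingly
            frame.origin.y += 88.0f;
        }
        toolbar.frame = frame;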

    Recent photos

    On the day when everything was practically ready, "recent photos" became a new headache. The problem with them is that they must be refreshed in a timely manner: took a photo - refresh; deleted something from the gallery - refresh.
    I don't know about the other participants, but I fetched recent photos with AssetsLibrary and the enumerateAssetsAtIndexes... method. And this method crashed if you loaded your assets, left the application, deleted something from the gallery, and then entered the application again, because the stored [NSIndexSet indexSetWithIndexesInRange:assetsRange] was no longer valid.
    In short, this problem tormented me until the very last hours before the deadline, but now it is fixed.
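    The kind of fix that helps here is to re-read the group size on every enumeration and clamp the range, instead of reusing a stored index set (a sketch with AssetsLibrary; group is an ALAssetsGroup, and the clamping logic is my reconstruction, not the author's exact code):

        // Never trust a cached range: the gallery may have shrunk since last time.
        NSUInteger assetCount = (NSUInteger)group.numberOfAssets;
        NSRange wanted = NSMakeRange(0, MIN((NSUInteger)20, assetCount));
        if (wanted.length > 0) {
            [group enumerateAssetsAtIndexes:[NSIndexSet indexSetWithIndexesInRange:wanted]
                                    options:NSEnumerationReverse
                                 usingBlock:^(ALAsset *asset, NSUInteger index, BOOL *stop) {
                if (asset != nil) {
                    // use the asset (the block is also called once with nil at the end)
                }
            }];
        }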

    Instead of an epilogue

    Over these two weeks I gained a lot of experience and figured out many interesting aspects of development, programming, image processing, and working with libraries like this in general.
    I would like to wish all the participants good luck and prizes! And myself - first place ;)

    P.S. I will publish the source code after the contest results are announced, to avoid any incidents.

    P.P.S. Link to the source code for the third round:
    The sources are pretty dirty; it was all written very quickly and often thoughtlessly. Sorry :)
    Requires Xcode 4.5 or above.
