The memory-efficient way of using Core Image on iOS?

I'm using Core Image filters in my app. Everything works fine on my iPhone 5 running iOS 7, but when I test it on an iPhone 4S, which has only 512MB of memory in total, the app crashes.

Here's the situation: I have two images taken with the camera, each with a resolution of 2448x3264. On my iPhone 5, the whole process peaks at 150MB according to Instruments.

[Screenshot: Instruments memory usage on iPhone 5]

However, when I run the same code on the iPhone 4S, Instruments gives me low-memory warnings the whole time, even though the reported memory use is quite low (around 8 MB). Here's the screenshot below.

[Screenshot: Instruments memory usage on iPhone 4S]

And here's the code. Basically, I load two images from my app's Documents folder and apply two filters in a row:

    CIImage *foreground = [[CIImage alloc] initWithContentsOfURL:foregroundURL];
    CIImage *background = [[CIImage alloc] initWithContentsOfURL:backgroundURL];
    CIFilter *softLightBlendFilter = [CIFilter filterWithName:@"CISoftLightBlendMode"];
    [softLightBlendFilter setDefaults];
    [softLightBlendFilter setValue:foreground forKey:kCIInputImageKey];
    [softLightBlendFilter setValue:background forKey:kCIInputBackgroundImageKey];

    foreground = [softLightBlendFilter outputImage];
    background = nil;
    softLightBlendFilter = nil;

    CIFilter *gammaAdjustFilter = [CIFilter filterWithName:@"CIGammaAdjust"];
    [gammaAdjustFilter setDefaults];
    [gammaAdjustFilter setValue:foreground forKey:kCIInputImageKey];
    [gammaAdjustFilter setValue:[NSNumber numberWithFloat:value] forKey:@"inputPower"];
    foreground = [gammaAdjustFilter valueForKey:kCIOutputImageKey];

    gammaAdjustFilter = nil;

    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect extent = [foreground extent];
    CGImageRef cgImage = [context createCGImage:foreground fromRect:extent];

    UIImage *image = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:imgOrientation];
    CFRelease(cgImage);
    foreground = nil;

    return image;

The app crashed at this line: CGImageRef cgImage = [context createCGImage:foreground fromRect:extent];

Is there a more memory-efficient way of handling this situation, or what am I doing wrong here?

Big thanks!

Velasquez answered 24/12, 2013 at 13:36 Comment(7)
Maybe it would help to put some of the relevant code into an @autoreleasepool { relevant code } block?Udelle
I'm using ARC, so I guess everything is in an autoreleasepool?Velasquez
Try it and see for yourself.Udelle
=> Look up the Xcode documentation for NSAutoreleasePool.Udelle
@DigiMonk, I wrapped the whole bunch of code above inside an autoreleasepool, but the app still crashed. What is the relevant code you suggest?Velasquez
OK, I up-voted your question and hopefully others will help soon. If "it" can be solved by inserting an autoreleasepool, I would try to "pool" different code blocks; a second try would be to start the pool before "CIContext *context" and end it before "return image" (you already did the first try).Udelle
Thanks @DigiMonk, I'll try the second way!Velasquez

Short version:

While it seems trivial in concept, this is actually a pretty memory-intensive task for the device in question.

Long version:

Consider this: 2 images * 4 bytes per pixel (8 bits for each of the RGBA channels) * 2448 * 3264 ~= 64MB. Then Core Image will require another ~32MB for the output of the filter operation, and getting that from a CIContext into a CGImage is likely going to consume another ~32MB. I would expect the UIImage copy to share the CGImage's memory representation, at least by mapping the image through the VM with copy-on-write, although you may get dinged for the double usage anyway: even though it doesn't consume "real" memory, it still counts against mapped pages.
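As a quick sanity check of that arithmetic (a sketch, assuming 4 bytes per RGBA pixel, i.e. 8 bits per channel):

```python
bytes_per_pixel = 4                       # RGBA, 8 bits per channel
width, height = 2448, 3264

per_image = width * height * bytes_per_pixel
print(per_image / 1e6)        # ~32 MB for one decoded bitmap
print(2 * per_image / 1e6)    # ~64 MB for the two source images together
```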

So at a bare minimum, you're using 128MB (plus whatever other memory your app happens to use). That's a considerable amount of RAM for a device like the 4S, which only has 512MB to begin with. IME, I would say this is on the outer edge of what's possible. I would expect it to work at least some of the time, but it doesn't surprise me to hear that it's getting memory warnings and memory-pressure kills. You'll want to make sure that the CIContext and all the input images are deallocated/disposed as soon as possible after making the CGImage, and before making the UIImage from the CGImage.

In general, this could be made easier by scaling down the image size.

Without testing, and assuming ARC, I present the following as a potential improvement:

    - (UIImage *)imageWithForeground:(NSURL *)foregroundURL background:(NSURL *)backgroundURL orientation:(UIImageOrientation)orientation value:(float)value
    {
        CIImage *holder = nil;
        @autoreleasepool
        {
            CIImage *foreground = [[CIImage alloc] initWithContentsOfURL:foregroundURL];
            CIImage *background = [[CIImage alloc] initWithContentsOfURL:backgroundURL];
            CIFilter *softLightBlendFilter = [CIFilter filterWithName:@"CISoftLightBlendMode"];
            [softLightBlendFilter setDefaults];
            [softLightBlendFilter setValue:foreground forKey:kCIInputImageKey];
            [softLightBlendFilter setValue:background forKey:kCIInputBackgroundImageKey];

            holder = [softLightBlendFilter outputImage];
            // This is probably the peak usage moment -- I expect both source images
            // as well as the output to be in memory.
        }
        // At this point, I expect the two source images to be flushed, leaving the one output image.
        @autoreleasepool
        {
            CIFilter *gammaAdjustFilter = [CIFilter filterWithName:@"CIGammaAdjust"];
            [gammaAdjustFilter setDefaults];
            [gammaAdjustFilter setValue:holder forKey:kCIInputImageKey];
            [gammaAdjustFilter setValue:[NSNumber numberWithFloat:value] forKey:@"inputPower"];
            holder = [gammaAdjustFilter outputImage];
            // At this point, I expect us to have two images in memory: input and output.
        }
        // Here we should be back down to just one image in memory.
        CGImageRef cgImage = NULL;

        @autoreleasepool
        {
            CIContext *context = [CIContext contextWithOptions:nil];
            CGRect extent = [holder extent];
            cgImage = [context createCGImage:holder fromRect:extent];
            // One would hope that CG and CI would be sharing memory via VM, but they
            // probably aren't, so we probably have two images in memory at this point too.
        }
        // Now I expect all the CIImages to have gone away, and for us to have
        // one image in memory (just the CGImage).
        UIImage *image = [UIImage imageWithCGImage:cgImage scale:1.0 orientation:orientation];
        // I expect UIImage to almost certainly be sharing the image data with the
        // CGImageRef via VM, but even if it's not, we only have two images in memory.
        CGImageRelease(cgImage);
        // Now we should have only one image in memory: the one we're returning.
        return image;
    }

As indicated in the comments, the high-water mark is the operation that takes two input images and creates one output image; that will always require three images to be in memory, no matter what. To get the high-water mark down any further, you'd have to process the images in sections/tiles or scale them down to a smaller size.
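To make the scaling suggestion concrete, here is a rough back-of-the-envelope calculator (not from the original answer; it assumes 4 bytes per RGBA pixel and ignores Core Image's own overhead) for picking a scale factor that keeps a given number of full bitmaps inside a working-memory budget:

```python
def scale_to_fit(width, height, n_images, budget_bytes, bytes_per_pixel=4):
    """Largest scale factor (<= 1.0) at which n_images RGBA bitmaps of
    (width * s) x (height * s) pixels fit within budget_bytes."""
    full_size = width * height * bytes_per_pixel * n_images
    if full_size <= budget_bytes:
        return 1.0
    # Memory scales with area, i.e. with the square of the linear scale factor.
    return (budget_bytes / full_size) ** 0.5

# Two 2448x3264 inputs plus one output (3 bitmaps), 64 MB working budget:
s = scale_to_fit(2448, 3264, 3, 64 * 1024 * 1024)
print(round(s, 2))  # ~0.84, i.e. render at roughly 2048x2731 instead
```

On a 512MB device you would pick the budget to leave comfortable headroom for the rest of the app, then scale both inputs down before filtering.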

Ectosarc answered 24/12, 2013 at 15:27 Comment(8)
Thanks ipmcc and digimonk, your comments helped me out of this. Now I know how to use local autoreleasepools to encourage memory to be released. Big thanks!Velasquez
We had the same issues in our app, which also uses the GPUImage lib, and had to resize images to 800px wide (640px on iPhone 4/4S). We also wrapped most processing in @autoreleasepool.Comminute
The first two autorelease pools have no effect. CIImages don't allocate memory by themselves (think of them as recipes). The memory allocation peak occurs within the third autorelease pool in your example, and if that peak goes beyond available memory, it will crash. That last autorelease pool probably makes the CIContext's internal resources deallocate a bit earlier, which is still an improvement vs. not having it.Glengarry
I had a number of contexts working one after another, but it kept crashing on devices. This fixed it. Thanks ipmcc.Mckenzie
Can you please post the Swift version of this answer?Deuteragonist
I'll bother to learn Swift when Apple ships anything non-trivial written in it. :) Sorry.Ectosarc
Correct me if I'm wrong, but I believe at the point of the comment "Now I expect all the CIImages to have gone away", the holder CIImage will not have gone away. (Not that CIImages allocate memory, as the above comment mentioned, but still.)Sheepcote
Is there a reason why you recreate the CIContext instead of using a global one?Passed

© 2022 - 2024 — McMap. All rights reserved.