CIGaussianBlur image size

I want to blur my view, and I use this code:

//Get a UIImage from the UIView
NSLog(@"blur capture");
UIGraphicsBeginImageContext(BlurContrainerView.frame.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

//Blur the UIImage
CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName: @"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey: @"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat: 5] forKey: @"inputRadius"]; //change number to increase/decrease blur
CIImage *resultImage = [gaussianBlurFilter valueForKey: @"outputImage"];

//create UIImage from filtered image
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];

//Place the UIImage in a UIImageView
UIImageView *newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
newView.image = blurredImage;

NSLog(@"%f,%f",newView.frame.size.width,newView.frame.size.height);
//insert blur UIImageView below transparent view inside the blur image container
[BlurContrainerView insertSubview:newView belowSubview:transparentView];

It blurs the view, but not all of it. How can I blur the entire view?

Screenshot: postimg.org/image/9bee2e4zx/

Cynde answered 11/12, 2013 at 23:14 Comment(0)

The issue isn't that it's not blurring all of the image, but rather that the blur is extending the boundary of the image, making the image larger, and it's not lining up properly as a result.

To keep the image the same size, after the line:

CIImage *resultImage    = [gaussianBlurFilter valueForKey: @"outputImage"];

you can grab a CGRect for a rectangle the size of the original image, centered within this resultImage:

// note, adjust rect because blur changed size of image

CGRect rect             = [resultImage extent];
rect.origin.x          += (rect.size.width  - viewImage.size.width ) / 2;
rect.origin.y          += (rect.size.height - viewImage.size.height) / 2;
rect.size               = viewImage.size;

And then use CIContext to grab that portion of the image:

CIContext *context      = [CIContext contextWithOptions:nil];
CGImageRef cgimg        = [context createCGImage:resultImage fromRect:rect];
UIImage   *blurredImage = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);

Alternatively, for iOS 7, you can download iOS_UIImageEffects.zip from Apple's UIImageEffects sample code and grab the UIImage+ImageEffects category. That provides a few new methods:

- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius tintColor:(UIColor *)tintColor saturationDeltaFactor:(CGFloat)saturationDeltaFactor maskImage:(UIImage *)maskImage;

So, to blur an image and lighten it (giving that "frosted glass" effect), you can do:

UIImage *newImage = [image applyLightEffect];

Interestingly, Apple's code does not employ CIFilter, but rather calls vImageBoxConvolve_ARGB8888 of the vImage high-performance image processing framework. This technique is illustrated in WWDC 2013 video Implementing Engaging UI on iOS.

Gorga answered 12/12, 2013 at 2:17 Comment(1)
Hey Rob! How're things going? I know that's it not your main field, but we have an interesting scenario where we want to downsample (Image Data wise) buffer before displaying it. I'm sure you get a lot of requests, but just in case it slips in, here's a link: #57154140 , as always - thank you!Prima

A faster solution is to avoid CGImageRef altogether and perform all transformations at the CIImage level, where evaluation is lazy.

So, instead of creating the wrongly sized image with:

// create UIImage from filtered image (but size is wrong)
blurredImage = [[UIImage alloc] initWithCIImage:resultImage];

A nice solution is to write:

Objective-C

// cropping rect because blur changed size of image
CIImage *croppedImage = [resultImage imageByCroppingToRect:imageToBlur.extent];
// create UIImage from filtered cropped image
blurredImage = [[UIImage alloc] initWithCIImage:croppedImage];

Swift 3

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropping(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)

Swift 4

// cropping rect because blur changed size of image
let croppedImage = resultImage.cropped(to: imageToBlur.extent)
// create UIImage from filtered cropped image
let blurredImage = UIImage(ciImage: croppedImage)
Lishalishe answered 12/9, 2017 at 7:57 Comment(4)
Thank u!! This saved me -> swift 4Pearson
can we control the blur of CIFilter? I need it to be more blurredOlpe
@AnuranBarman that's a different question. Please post it as a separate question, thank you. :)Benedictbenedicta
@AnuranBarman as a hint, you can play with inputRadius or switch to a different blur effect. If it's not enough, then you may need to code the blur effect yourself with vImageBoxConvolve_ARGB8888 from Accelerate.Benedictbenedicta

Looks like the blur filter is giving you back an image that's bigger than the one you started with, which makes sense, since pixels at the edges get blurred outward past the original bounds. The easiest solution is probably to give newView a contentMode of UIViewContentModeCenter so it doesn't try to squash the blurred image down; you could also crop blurredImage by drawing it in the center of a new context of the appropriate size, but you don't really need to.
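The contentMode fix is a one-liner (using the newView and blurredImage names from the question; the clipsToBounds line is an optional extra so the oversized image doesn't draw past the view's frame):

```objectivec
// Center the oversized blurred image instead of scaling it to fit
newView.contentMode = UIViewContentModeCenter;
newView.clipsToBounds = YES;  // trim the blurred edges extending past the frame
newView.image = blurredImage;
```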

Snowfield answered 11/12, 2013 at 23:19 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.