Using GPUImage to Recreate iOS 7 Glass Effect

I am trying to recreate the iOS 7 style glass effect in my app by applying image effects to a screenshot of an MKMapView. This UIImage category, provided by Apple, is what I am using as a baseline. The method desaturates the source image, applies a tint color, and blurs heavily using these input values:

[image applyBlurWithRadius:10.0
                 tintColor:[UIColor colorWithRed:229/255.0f green:246/255.0f blue:255/255.0f alpha:0.33] 
     saturationDeltaFactor:0.66
                 maskImage:nil];

This produces the effect I am looking for, but it takes way too long: between 0.3 and 0.5 seconds to render on an iPhone 4.

I would like to use the excellent GPUImage instead, as my preliminary attempts with it have been about 5-10 times faster, but I just can't seem to get the output right.

GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:image];

// Step 1: desaturate, mirroring Apple's saturationDeltaFactor of 0.66.
GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
saturationFilter.saturation = 0.33; // 1.0 - 0.66
[stillImageSource addTarget:saturationFilter];

// Step 2: tint with Apple's light-blue color.
GPUImageMonochromeFilter *monochromeFilter = [[GPUImageMonochromeFilter alloc] init];
[monochromeFilter setColor:(GPUVector4){229/255.0f, 246/255.0f, 1.0f, 0.33f}];
[monochromeFilter setIntensity:0.2];
[saturationFilter addTarget:monochromeFilter];

// Step 3: blur heavily.
GPUImageFastBlurFilter *blurFilter = [[GPUImageFastBlurFilter alloc] init];
blurFilter.blurSize = 2;
blurFilter.blurPasses = 3;
[monochromeFilter addTarget:blurFilter];

// The final image is read back from the blur filter, so that is the
// filter to prepare for capture (not the intermediate filters).
[blurFilter prepareForImageCapture];

[stillImageSource processImage];
image = [blurFilter imageFromCurrentlyProcessedOutput];

This produces an image that is close, but not quite there:

The blur doesn't seem deep enough, but when I try to increase the blurSize in the code above, the result becomes grid-like, almost like a kaleidoscope; you can actually see the grid by zooming in on the second image. And the tint color I am trying to mimic seems to just wash out the image instead of overlaying and blending, which is what I believe the Apple sample is doing.

I have tried to set up the filters according to comments made by @BradLarson in another SO question. Am I using the wrong GPUImage filters to reproduce this effect, or am I just setting them up wrong?

Injury answered 23/8/2013 at 14:08 (2 comments)

Have you tried github.com/JagCesar/iOS-blur? It's great, although it works only under iOS 7. – Lipid

iOS-Blur is not a viable solution. You don't know what Apple is doing under the hood to make this work, and it doesn't work on iOS 6. – Injury

OK, I've been working on something here for a little while, and I finally have it functional. I just rolled a number of changes to GPUImage's blur filters into the framework, and as a result I believe I have a reasonable replica of Apple's blur effect that they use for things like the control center view.

Previously, the blurs that I had in the framework used a single precalculated radius, and the only way to affect their intensity was to tweak the spacing at which they sampled pixels from the input image. With a limited number of samples per pixel, changing the multiple for the spacing between sampled pixels much above 1.5 started introducing serious blocking artifacts as pixels were skipped.

The new Gaussian blur implementation that I've built combines the performance benefits of precalculated Gaussian weights with the ability to use an arbitrary radius (sigma) for the Gaussian blur. It does this by generating shaders on the fly as they are needed for various radii. It also reduces the number of texture samples required for a given blur radius by using hardware interpolation to read two texels at a time for each sample point.
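
In use, the rewritten filter takes just a radius and an image. Here is a minimal sketch (assuming the sigma is exposed as blurRadiusInPixels, with sourceImage standing in for whatever UIImage you want blurred):

// Minimal sketch: arbitrary-radius Gaussian blur on a still image.
GPUImageGaussianBlurFilter *gaussianBlur = [[GPUImageGaussianBlurFilter alloc] init];
gaussianBlur.blurRadiusInPixels = 48.0; // arbitrary sigma; the shader is generated on demand
UIImage *blurredImage = [gaussianBlur imageByFilteringImage:sourceImage];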

The new GPUImageiOSBlurFilter combines this tuned arbitrary-radius Gaussian blur filter with a color-correction filter that appears to replicate the adjustment Apple performs to the colors after they've been blurred. I added the comparison below to another answer of mine; it shows Apple's built-in blurring from the control center view on the left, and my new GPUImage blur filter on the right:

[Comparison image: Apple's blur (left) vs. GPUImage's blur (right)]

As a way of improving performance (Apple's blur appears to occur with a sigma of 48, which requires quite a large area to be sampled for each pixel), I use a 4X downsampling before the Gaussian blur, then a 4X upsampling afterward. This reduces the number of pixels that need to be blurred by 16X, and also reduces the blur sigma from 48 to 12. An iPhone 4S can blur the entire screen in roughly 30 ms using this filter.
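
Applying the packaged filter takes only a couple of lines. A minimal sketch (snapshot stands in for whatever UIImage you've captured; the filter handles the downsample / blur / color-correction chain internally):

// Minimal sketch: the packaged iOS-7-style blur applied to a snapshot.
GPUImageiOSBlurFilter *iosBlurFilter = [[GPUImageiOSBlurFilter alloc] init];
iosBlurFilter.blurRadiusInPixels = 12.0; // the post-downsampling sigma noted above
UIImage *blurredSnapshot = [iosBlurFilter imageByFilteringImage:snapshot];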

Getting the blur right is one thing. Apple still does not provide a fast way of getting the image content behind your views, so that will most likely be your bottleneck for rapidly changing content.
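
For completeness, the usual route for capturing that content on iOS 7 is a view-hierarchy snapshot, and this render pass is the expensive part. A sketch, with backgroundView standing in for the view behind your blur:

// Capture the content behind the blurred view; this is the slow step
// for rapidly changing content.
UIGraphicsBeginImageContextWithOptions(backgroundView.bounds.size, NO, 0.0);
[backgroundView drawViewHierarchyInRect:backgroundView.bounds
                     afterScreenUpdates:NO];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();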

Laylalayman answered 19/10/2013 at 22:29 (11 comments)

So I've switched to GPUImageiOSBlurFilter and it has resulted in much faster rendering for me. I don't use it for live blurring, but I do need it for blurring images to display in UITableViewCells, so it has to happen fast, and it does. Thanks! – Autolycus

Brad, could you please update the pod spec for GPUImage? The old 0.1.1 spec doesn't contain GPUImageiOSBlurFilter. Thanks! – Rhizobium

@KyrDunenkoff - I assume you're referring to CocoaPods? I don't maintain anything there, or even use CocoaPods myself, so it's up to whoever is in charge of that to point to the latest build in the repository. I only maintain the GitHub repository. If you want the latest code, I recommend pulling directly from GitHub. – Laylalayman

@BradLarson Hey Brad. I was wondering what the reason is for the blurring always having a light-coloured background? Even when blurring black images, for example, I always end up with a greyish colour after blurring. – Autolycus

@Autolycus - I tried to mimic what Apple seemed to do with their control center blur (the "color correction filter" I reference above). They don't just blur; they also seem to reduce the dynamic range of the luminance in order to prevent the background from washing out text or controls placed on that blurred view. I can't guarantee that I match their color adjustment in all situations, but it seemed to track with the few test backgrounds I tried against. – Laylalayman

@BradLarson Seems to make sense. I did notice they do the same thing when using the UIImage+Effects class they provided at WWDC; I just wondered if such an effect could be achieved without lightening the image. – Autolycus

@Autolycus - Well, all you need to do is remove the final color correction step (swapping out the GPUImageLuminanceRangeFilter for a passthrough GPUImageFilter inside GPUImageiOSBlurFilter) to get the blur by itself; see the sketch after these comments. The GPUImageiOSBlurFilter doesn't have a lot to it, so it's easy to clone and alter in that manner. I may even move the down/upsampling for the blur within the main Gaussian blur class, making this even simpler. – Laylalayman

@BradLarson So I was considering updating the class with a BOOL for whether there should be a frosted effect or not (a property to turn it off). Would you be interested in merging this if I submit a pull request? – Autolycus

@Autolycus - I could see making this an option, where it swapped out the color correction filter for a passthrough at the last step. I might have to check one or two things with the forced processing scale-up if this is switched after the filter is constructed, but it wouldn't be too hard to do. – Laylalayman

@BradLarson Yeah, removing the GPUImageLuminanceRangeFilter definitely lowers the quality of the output image, so I had to lower the downsampling from 4 to 2. I've updated my local copy for now - still deciding which effect to use, but the option is nice. But I'll let you look into it further and update the class as you please :) – Autolycus

@BradLarson Regarding the pod spec - CocoaPods doesn't need to update anything. The issue kyr-dunenkoff is running into is that in your GitHub repo your latest release tag is v0.1.1, which doesn't have the iOS 7 blur filter. Your pod spec in the repo points to that version. When you make a v0.1.2 tag in GitHub and update the pod spec to point to that, all will be well. In the meantime, kyr-dunenkoff, you can update your pod file to just fetch based on HEAD instead of the latest tagged version. – Dinky
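
Following up on the thread above, a blur-only variant (no luminance-range lightening) might be sketched like this; this is a sketch under assumptions, not the framework's own implementation, with snapshot standing in for the captured image:

// Sketch: Gaussian blur at a reduced processing size, skipping the
// luminance-range (lightening) step entirely.
GPUImageGaussianBlurFilter *blurOnly = [[GPUImageGaussianBlurFilter alloc] init];
blurOnly.blurRadiusInPixels = 12.0;
// 2X downsampling, per the comment above; the smaller output is scaled
// back up when drawn.
[blurOnly forceProcessingAtSize:CGSizeMake(snapshot.size.width / 2.0,
                                           snapshot.size.height / 2.0)];
UIImage *blurredWithoutLightening = [blurOnly imageByFilteringImage:snapshot];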
