How to use a CIFilter on the layerClass instance of a UIView?

My UIView is using an instance of TBPaperLayer for its layer.

+(Class)layerClass {
    return [TBPaperLayer class];
}

I would like to create a CIFilter to modify the appearance of this layer, and in particular to apply a blur filter to it. How can I use this code to blur part of the layer? (code from: Blur CALayer's Superlayer)

CALayer *blurLayer = [CALayer layer];
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setDefaults];
blurLayer.backgroundFilters = [NSArray arrayWithObject:blur];
[self.superlayer addSublayer:blurLayer];

There is no superlayer during -init.

Beatabeaten answered 27/3, 2013 at 21:41 Comment(1)
Note that the linked solution was for the Mac, where you've been able to provide Core Image filters to views for a while. I don't believe you can do this with UIViews and CALayers on iOS yet, but I could have missed something in the recent updates. – Viaduct

This is not possible on iOS. From the CALayer class reference:

Special Considerations

This property is not supported on layers in iOS.

Presumably Apple don't feel that the current generation of iOS hardware is powerful enough to support live image filtering.
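For contrast, this kind of filtering does work on OS X, where the linked answer came from: a layer-backed NSView can hand Core Image filters to its backing layer. A rough sketch only (the helper name AddBlurOverlay is made up, and layerUsesCoreImageFilters is only needed on 10.9 and later):

#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>

// OS X only: a layer-backed NSView can apply Core Image filters to whatever
// renders behind its layer. containerView stands in for an existing superview.
static void AddBlurOverlay(NSView *containerView)
{
    NSView *overlay = [[NSView alloc] initWithFrame:containerView.bounds];
    overlay.wantsLayer = YES;                    // back the view with a CALayer
    overlay.layerUsesCoreImageFilters = YES;     // required on 10.9+ for CI filters

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    overlay.layer.backgroundFilters = @[blur];   // ignored on iOS, works on OS X

    [containerView addSubview:overlay];
}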

Sall answered 28/3, 2013 at 6:43 Comment(0)

For iOS 6.1 I wanted a view that encapsulated the inward-shrinking and deblurring effect used in some motion-graphics title sequences. My resulting code (not all of it shown here) steadily decreases both the stretch factor (shrinking the horizontal scale of the text inward) and the blur amount. The text the effect is applied to is rendered as a UIBezierPath and stored in self.myPath. A timer fires a method that decreases the two values and calls setNeedsDisplay (a sketch of such a timer method appears after the code below).

- (void)displayLayer:(CALayer *)layer
{
    // Render the stretched path into an off-screen image context.
    // (UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0) plus a matching
    // layer.contentsScale would be needed for Retina-resolution output.)
    UIGraphicsBeginImageContext(self.bounds.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Stretch the stored text path horizontally by the current stretch factor.
    CGAffineTransform stretch = CGAffineTransformMakeScale(self.stretchFactor + 1.0, 1.0);
    CGPathRef stretchedPath = CGPathCreateCopyByTransformingPath([self.myPath CGPath], &stretch);

    // Re-center the stretched path within the view's bounds.
    CGRect newBox = CGPathGetBoundingBox(stretchedPath);
    CGFloat deltaX = CGRectGetMidX(self.bounds) - CGRectGetMidX(newBox);
    CGFloat deltaY = CGRectGetMidY(self.bounds) - CGRectGetMidY(newBox);
    CGAffineTransform slide = CGAffineTransformMakeTranslation(deltaX, deltaY);
    CGPathRef centeredPath = CGPathCreateCopyByTransformingPath(stretchedPath, &slide);
    CGPathRelease(stretchedPath);

    // Fill the path in black and capture the result as a UIImage.
    CGContextAddPath(ctx, centeredPath);
    CGPathRelease(centeredPath);
    CGContextSetFillColorWithColor(ctx, [[UIColor blackColor] CGColor]);
    CGContextFillPath(ctx);
    UIImage *tmpImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Run the snapshot through CIGaussianBlur with the current blur amount.
    CIImage *inputImage = [CIImage imageWithCGImage:[tmpImage CGImage]];
    CIFilter *gBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"
                                       keysAndValues:@"inputRadius", [NSNumber numberWithFloat:self.blurFactor],
                                                     @"inputImage", inputImage, nil];
    CIImage *blurredImage = [gBlurFilter outputImage];

    // Creating a CIContext every frame is expensive; caching one would help performance.
    CIContext *context = [CIContext contextWithOptions:nil];
    // Note: the blurred extent is slightly larger than the input; rendering
    // [inputImage extent] instead keeps the output the same size as the layer.
    CGImageRef cgimg = [context createCGImage:blurredImage fromRect:[blurredImage extent]];
    [layer setContents:(__bridge id)cgimg];
    CGImageRelease(cgimg);
}

- (void)drawRect:(CGRect)rect
{
    // Deliberately empty: without a drawRect: override UIKit won't redisplay the
    // layer, so displayLayer: above would never be called.
}

I haven't yet checked this code for leaks, so consider it "pseudo code" :-) As shown, this could have been done in drawRect: without using layers at all, but I have other things going on with this view that aren't shown in this condensed version.
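One of the pieces not shown is the timer-driven method that decreases the two factors; here is a rough sketch of what it might look like (the interval, step sizes, starting values, and method names are placeholders of mine, not the original code):

- (void)startTitleAnimation
{
    self.stretchFactor = 4.0;    // assumed starting values for the effect
    self.blurFactor = 20.0;
    [NSTimer scheduledTimerWithTimeInterval:0.05
                                     target:self
                                   selector:@selector(animationTick:)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)animationTick:(NSTimer *)timer
{
    // Steadily shrink the horizontal stretch and reduce the blur radius.
    self.stretchFactor = MAX(self.stretchFactor - 0.2, 0.0);
    self.blurFactor = MAX(self.blurFactor - 1.0, 0.0);

    if (self.stretchFactor == 0.0 && self.blurFactor == 0.0) {
        [timer invalidate];      // effect finished: text at normal width, fully sharp
    }
    [self setNeedsDisplay];      // triggers displayLayer: via the empty drawRect:
}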

But since CIGaussianBlur takes a noticeable amount of time, I'm looking into doing the image processing with the Accelerate framework instead, to see whether that makes my version more fluid.
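If it helps anyone heading down the same road, here's a minimal sketch of that Accelerate approach; the helper name BoxBlurredImage and the single box-convolve pass are illustrative only (Apple's UIImage+ImageEffects sample does essentially this with three passes to better approximate a Gaussian), and it ignores the image's scale for brevity:

#import <UIKit/UIKit.h>
#import <Accelerate/Accelerate.h>

// Illustrative helper: approximate a blur with one vImage box convolve pass.
static UIImage *BoxBlurredImage(UIImage *sourceImage, CGFloat radius)
{
    CGImageRef cgImage = sourceImage.CGImage;
    size_t width  = CGImageGetWidth(cgImage);
    size_t height = CGImageGetHeight(cgImage);

    // Draw the source into premultiplied ARGB bitmaps that vImage can work on.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGBitmapInfo info = kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little;
    CGContextRef srcCtx = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, info);
    CGContextRef dstCtx = CGBitmapContextCreate(NULL, width, height, 8, width * 4, colorSpace, info);
    CGColorSpaceRelease(colorSpace);
    CGContextDrawImage(srcCtx, CGRectMake(0, 0, width, height), cgImage);

    vImage_Buffer src = { CGBitmapContextGetData(srcCtx), height, width, CGBitmapContextGetBytesPerRow(srcCtx) };
    vImage_Buffer dst = { CGBitmapContextGetData(dstCtx), height, width, CGBitmapContextGetBytesPerRow(dstCtx) };

    // The kernel dimension must be odd; derive it from the requested radius.
    uint32_t kernel = ((uint32_t)(radius * 2.0)) | 1;
    vImageBoxConvolve_ARGB8888(&src, &dst, NULL, 0, 0, kernel, kernel, NULL, kvImageEdgeExtend);

    CGImageRef blurredCG = CGBitmapContextCreateImage(dstCtx);
    UIImage *blurred = [UIImage imageWithCGImage:blurredCG];
    CGImageRelease(blurredCG);
    CGContextRelease(srcCtx);
    CGContextRelease(dstCtx);
    return blurred;
}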

Hydroxide answered 16/5, 2013 at 20:15 Comment(0)
