I'm creating an app that requires real-time application of filters to images. Converting the UIImage to a CIImage and applying the filters are both extremely fast operations, yet it takes too long to convert the resulting CIImage back to a CGImageRef and display it (about 1/5 of a second, which is a lot if the editing needs to feel real-time). The image is about 2500 by 2500 pixels, which is most likely part of the problem.
Currently, I'm using:

import UIKit
import CoreImage

let image: CIImage // CIImage with the filters already applied

let eagl = EAGLContext(API: EAGLRenderingAPI.OpenGLES2)
let context = CIContext(EAGLContext: eagl, options: [kCIContextWorkingColorSpace: NSNull()])

// This line takes too long for real-time processing
let cg: CGImage = context.createCGImage(image, fromRect: image.extent)
I've looked into using CIContext.drawImage():

context.drawImage(image, inRect: destinationRect, fromRect: image.extent)

Yet I can't find any solid documentation on exactly how this is done, or whether it would be any faster.
Is there any faster way to display a CIImage on the screen (either in a UIImageView, or directly on a CALayer)? I would like to avoid decreasing the image quality too much, because this may be noticeable to the user.
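Here is roughly how I imagine the drawImage() path would be wired up: a GLKView subclass whose EAGLContext backs the CIContext, so the filtered image is rendered straight to the screen with no intermediate CGImage. This is only a sketch in Swift 2 syntax to match the code above; the class name, the didSet trigger, and the point-to-pixel scaling are my guesses, not something I've confirmed:

```swift
import GLKit
import CoreImage

class CIImageView: GLKView {
    // Reuse one CIContext backed by this view's EAGLContext;
    // creating a new context per frame would be expensive
    lazy var ciContext: CIContext = {
        CIContext(EAGLContext: self.context,
                  options: [kCIContextWorkingColorSpace: NSNull()])
    }()

    // Setting a new filtered image triggers a redraw
    var image: CIImage? {
        didSet { setNeedsDisplay() }
    }

    override func drawRect(rect: CGRect) {
        guard let image = image else { return }
        glClearColor(0, 0, 0, 1)
        glClear(GLbitfield(GL_COLOR_BUFFER_BIT))
        // drawImage works in pixels, while rect is in points
        let scale = contentScaleFactor
        let destRect = CGRect(x: 0, y: 0,
                              width: rect.size.width * scale,
                              height: rect.size.height * scale)
        ciContext.drawImage(image, inRect: destRect, fromRect: image.extent)
    }
}
```

The view would be created with GLKView(frame:context:) using the same EAGLContext, and each filter change would just assign to image rather than building a new CGImage.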