I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need to both apply the effects in real-time during playback and render the composited video out to disk, but I'm happy to start with just one or the other.
Unfortunately, I can't seem to figure this one out. Here's what I've tried so far:
- I can add an animation layer to the UIView that's playing the movie, but it's not clear to me whether I can process the incoming video frames this way.
- I can assign an array of CIFilters to the AVPlayerLayer's `filters` property, but it turns out these are ignored on iOS (the property only has an effect on OS X).
- I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure it can do video processing (rather than just animation), and it crashes with a message saying it's not designed for real-time playback anyway. I believe this is the intended solution for compositing animation when rendering to disk.
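For reference, here is roughly what the second attempt looks like in Swift (a minimal sketch; `videoURL` and `view` are placeholders standing in for my actual player setup). It compiles, but the `filters` assignment is silently ignored on iOS:

```swift
import AVFoundation
import UIKit

// Standard AVPlayer / AVPlayerLayer setup.
let playerItem = AVPlayerItem(url: videoURL)
let player = AVPlayer(playerItem: playerItem)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = view.bounds

// CALayer exposes a `filters` property, but per the docs it is
// only honored on OS X — on iOS this line is a no-op.
if let sepia = CIFilter(name: "CISepiaTone") {
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)
    playerLayer.filters = [sepia]
}

view.layer.addSublayer(playerLayer)
player.play()
```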
Other apps do this (I think), so I assume I'm missing something obvious.
Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio. See for example: