Using CIFilter with AVFoundation (iOS)

I am trying to apply filters to a video composition created with AVFoundation on iOS (filters could be, e.g., blur, pixelate, sepia, etc.). I need both to apply the effects in real time and to render the composited video out to disk, but I'm happy to start with just one or the other.

Unfortunately, I can't seem to figure this one out. Here's what I can do:

  • I can add a layer for animation to the UIView that's playing the movie, but it's not clear to me if I can process the incoming video image this way.
  • I can add an array of CIFilters to the AVPlayerLayer, but it turns out these are ignored on iOS (they only work on Mac OS X); see the sketch after this list.
  • I can add an AVVideoCompositionCoreAnimationTool to the AVVideoComposition, but I'm not sure this would accomplish video processing (rather than animation), and it crashes with a message about not being designed for real-time playback anyway. I believe this is the route for rendering animation when exporting to disk.
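For reference, the second bullet's attempt looks roughly like this (the player setup and the specific filter are illustrative):

// Attach CIFilters to the AVPlayerLayer via CALayer's `filters` property.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player]; // `player` assumed to exist
CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
[sepia setDefaults];
playerLayer.filters = @[sepia]; // silently ignored on iOS; only honored on Mac OS X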

Other apps do this (I think), so I assume I'm missing something obvious.

Note: I've looked into GPUImage and I'd love to use it, but it just doesn't work well with movies, especially movies with audio.

Herb answered 17/12, 2013 at 6:7 Comment(2)
Have you found something? Could you please provide a tutorial or code snippet? I have the same issue. – Cheek
I got some help and used GPUImage. It's very powerful, but difficult to get it to play nice with movies. I honestly can't remember all the steps I had to go through, but I'm sure if I did, it would be too long for an answer here :( – Herb

You could use the AVVideoCompositing protocol and the AVAsynchronousVideoCompositionRequest class to implement a custom compositor.

// Inside your AVVideoCompositing implementation's startVideoCompositionRequest: method,
// where `request` is the AVAsynchronousVideoCompositionRequest being serviced:
CVPixelBufferRef pixelBuffer = [request sourceFrameByTrackID:trackID];
CIImage *theImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *motionBlurredImage = [[CIFilter filterWithName:@"CIMotionBlur"
                                          keysAndValues:kCIInputImageKey, theImage, nil]
                               valueForKey:kCIOutputImageKey];
CIContext *someCIContext = [CIContext contextWithEAGLContext:eaglContext];
[someCIContext render:motionBlurredImage toCVPixelBuffer:outputBuffer];

Then render the pixel buffer using OpenGL as described in Apple's documentation. This would allow you to implement any number of transitions or filters. You can then set AVAssetExportSession.videoComposition and export the composited video to disk.
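For orientation, here is a minimal sketch of how those pieces could be wired together. The class name MyFilterCompositor, the sepia filter, and the surrounding variable names are illustrative, not part of the original answer:

// A custom compositor conforming to AVVideoCompositing (class name is hypothetical).
@interface MyFilterCompositor : NSObject <AVVideoCompositing>
@end

@implementation MyFilterCompositor

- (NSDictionary *)sourcePixelBufferAttributes {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (NSDictionary *)requiredPixelBufferAttributesForRenderContext {
    return @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
}

- (void)renderContextChanged:(AVVideoCompositionRenderContext *)newRenderContext {
    // Keep a reference here if you need the context's size or pixel buffer pool.
}

- (void)startVideoCompositionRequest:(AVAsynchronousVideoCompositionRequest *)request {
    CMPersistentTrackID trackID = [request.sourceTrackIDs.firstObject intValue];
    CVPixelBufferRef sourceBuffer = [request sourceFrameByTrackID:trackID];
    CVPixelBufferRef outputBuffer = [request.renderContext newPixelBuffer];

    // Filter the source frame with Core Image, as in the snippet above.
    CIImage *input = [CIImage imageWithCVPixelBuffer:sourceBuffer];
    CIImage *output = [[CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, input, nil]
                       valueForKey:kCIOutputImageKey];
    // In practice, create the context once and reuse it (or use an EAGL-backed context).
    CIContext *ciContext = [CIContext contextWithOptions:nil];
    [ciContext render:output toCVPixelBuffer:outputBuffer];

    [request finishWithComposedVideoFrame:outputBuffer];
    CVBufferRelease(outputBuffer);
}

@end

And the export side, with `asset` and `outputURL` assumed to exist:

// Attach the compositor to a video composition and export it.
AVMutableVideoComposition *videoComposition =
    [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:asset];
videoComposition.customVideoCompositorClass = [MyFilterCompositor class];

AVAssetExportSession *exportSession =
    [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
exportSession.videoComposition = videoComposition;
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{ /* check exportSession.status */ }];

The same video composition can also be set on an AVPlayerItem's videoComposition property, which is how the filtering gets applied during real-time playback. Depending on your composition, you may also need to build your own AVVideoCompositionInstruction objects (as the AVCustomEdit sample linked in the comments does) so that the requests actually reach the compositor.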

Clypeus answered 27/8, 2014 at 17:25 Comment(4)
You render the pixel buffer using OpenGL, rather than just passing the frame back using finishWithComposedVideoFrame:? – Brogue
Could you please provide a code snippet for using AVVideoCompositing and AVAsynchronousVideoCompositionRequest? – Cheek
Here is some sample code from Apple: AVCustomEdit-iOS. I think the class you'll be interested in is APLCrossDissolveRenderer, and the method you might want to look at is - (void)renderPixelBuffer:(CVPixelBufferRef)destinationPixelBuffer usingForegroundSourceBuffer:(CVPixelBufferRef)foregroundPixelBuffer... – Clypeus
Here is a better version with pure Core Image: https://mcmap.net/q/1621378/-rendering-a-video-in-a-calayer-hierarchy-using-cifilters – Thormora

You can read the AVComposition (it's an AVAsset subclass) with AVAssetReader. Get the pixel buffers, pass each one to a CIFilter (set up so that it renders on the GPU, with no color management, etc.), and render the result to the screen or to an output buffer, depending on your needs. I don't think a blur can be done in real time unless it runs directly on the GPU.
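A rough sketch of that approach, assuming a variable named `composition` holds the AVComposition (the sepia filter and variable names are illustrative):

// Read decoded frames out of the composition with AVAssetReader.
NSError *error = nil;
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:composition error:&error];
AVAssetTrack *videoTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSDictionary *settings = @{ (NSString *)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetReaderTrackOutput *output =
    [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:settings];
[reader addOutput:output];
[reader startReading];

// A CIContext with color management disabled, per the "no color management" suggestion above.
CIContext *ciContext = [CIContext contextWithOptions:@{ kCIContextWorkingColorSpace : [NSNull null] }];
CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"];

CMSampleBufferRef sampleBuffer = NULL;
while ((sampleBuffer = [output copyNextSampleBuffer])) {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    [filter setValue:[CIImage imageWithCVPixelBuffer:pixelBuffer] forKey:kCIInputImageKey];
    CIImage *filtered = [filter valueForKey:kCIOutputImageKey];
    // Draw `filtered` on screen, or render it into an output pixel buffer with
    // [ciContext render:filtered toCVPixelBuffer:...], depending on your needs.
    CFRelease(sampleBuffer);
}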

You can read about applying a CIFilter to video in Apple's Core Image documentation (see the "Applying Filter to Video" section):

https://developer.apple.com/library/ios/documentation/graphicsimaging/conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-BAJDAHAD

Mobcap answered 30/12, 2013 at 4:45 Comment(0)
