I want to apply filters (effects) on a video file while the video is playing.
I'm currently using @BradLarson's (great) GPUImage framework to do so; the problem is that the framework doesn't support audio playback while the video plays.
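For reference, here's roughly my current (video-only) setup; the file name, the sepia filter, and `filterView` are just placeholders for my real pipeline:

```swift
import GPUImage
import AVFoundation

// Roughly my current setup: frames flow movie -> filter -> view,
// but GPUImageMovie plays no audio.
let movieURL = Bundle.main.url(forResource: "sample", withExtension: "mp4")!
let movie: GPUImageMovie = GPUImageMovie(url: movieURL)
movie.playAtActualSpeed = true

let filter = GPUImageSepiaFilter()
movie.addTarget(filter)
filter.addTarget(filterView) // filterView: GPUImageView, already in my view hierarchy

movie.startProcessing()
```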
So I have two options:
1) Dive into the GPUImage code and change GPUImageMovie so it also processes the audio buffers. This requires knowledge of syncing the audio & video frames, which unfortunately I don't have. I've seen hacks that try to play the audio with a separate AVAudioPlayer, but they have a lot of sync problems (see the sketch after this list).
2) Use the CoreImage framework instead of GPUImage.
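To illustrate option 1's hack, this is the kind of thing I mean (untested sketch, reusing the placeholders from above):

```swift
import GPUImage
import AVFoundation

// Option 1's hack: GPUImageMovie renders the (filtered) video while a
// separate AVAudioPlayer plays the same file's audio. There is no shared
// clock, which is exactly where the sync problems come from.
let movie: GPUImageMovie = GPUImageMovie(url: movieURL) // movieURL as above
movie.playAtActualSpeed = true
movie.addTarget(filter) // same filter chain as before

let audioPlayer = try! AVAudioPlayer(contentsOf: movieURL)
audioPlayer.prepareToPlay()

// Started "at the same time", but they drift apart during playback:
audioPlayer.play()
movie.startProcessing()
```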
So I want to take a look at the second option: using the native iOS CoreImage and CIFilter to do the job.
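I know how the basics work for a single still image (the sepia filter and `someUIImage` below are just stand-ins):

```swift
import CoreImage
import UIKit

// What I already know how to do: filter one still image.
// The open question is doing this per frame of a playing video.
let context = CIContext()
let input = CIImage(image: someUIImage)! // someUIImage is a placeholder

let sepia = CIFilter(name: "CISepiaTone")!
sepia.setValue(input, forKey: kCIInputImageKey)
sepia.setValue(0.8, forKey: kCIInputIntensityKey)

if let output = sepia.outputImage,
   let cgImage = context.createCGImage(output, from: output.extent) {
    let filtered = UIImage(cgImage: cgImage)
    // ...show `filtered` in an image view
}
```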
The problem is, I couldn't find any examples of how to do this with CIFilter. How do I apply filters to a video from a file?
Must I use an AVAssetReader to read the video and process each frame? If so, I'm back to my first problem of syncing the audio & video.
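This is the kind of loop I imagine for that route (untested sketch; the timing and the audio track would still be entirely on me):

```swift
import AVFoundation
import CoreImage

// Sketch of pulling raw frames with AVAssetReader. This hands me pixel
// buffers to filter, but display timing and the audio track are all on me.
let asset = AVAsset(url: movieURL) // movieURL as above
let reader = try! AVAssetReader(asset: asset)

let videoTrack = asset.tracks(withMediaType: .video).first!
let settings: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
let output = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: settings)
reader.add(output)
reader.startReading()

while let sampleBuffer = output.copyNextSampleBuffer() {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { continue }
    let frame = CIImage(cvPixelBuffer: pixelBuffer)
    // ...run `frame` through the CIFilter chain and display it — but when?
    // This is where the audio/video sync problem comes right back.
}
```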
Or is there a way to apply the filter chain directly to the video or to the preview layer?
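One thing I wonder about, though I'm not sure it's the intended use, is whether AVPlayerItem's videoComposition property could do it, with AVPlayer keeping its normal audio path. My guess at what that would look like:

```swift
import AVFoundation
import CoreImage

// Guess at a "direct" approach: let AVPlayer own playback (including
// audio) and inject the CIFilter per frame through a video composition.
let asset = AVAsset(url: movieURL) // movieURL as above

let composition = AVVideoComposition(asset: asset) { request in
    let sepia = CIFilter(name: "CISepiaTone")!
    sepia.setValue(request.sourceImage, forKey: kCIInputImageKey)
    sepia.setValue(0.8, forKey: kCIInputIntensityKey)
    request.finish(with: sepia.outputImage!, context: nil)
}

let item = AVPlayerItem(asset: asset)
item.videoComposition = composition
let player = AVPlayer(playerItem: item)
// player would back an AVPlayerLayer for display
player.play()
```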
Appreciate any help :)