I'm currently developing an iOS app that applies CoreImage to the camera feed in order to take photos and videos, and I've run into a bit of a snag.
Up till now I've been using AVCaptureVideoDataOutput to obtain the sample buffers, manipulating them with CoreImage, and then displaying a simple preview, as well as using the processed frames to capture and save photos.
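For context, this is roughly what my current setup looks like (heavily simplified; the class name and the sepia filter here are just placeholders for my actual processing chain):

```swift
import AVFoundation
import CoreImage

final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let videoDataOutput = AVCaptureVideoDataOutput()
    let ciContext = CIContext()
    // Placeholder filter; the real app chains several CoreImage filters here.
    let filter = CIFilter(name: "CISepiaTone")!

    func configure() throws {
        session.beginConfiguration()
        guard let device = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: device)
        if session.canAddInput(input) { session.addInput(input) }

        videoDataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "video.processing"))
        if session.canAddOutput(videoDataOutput) { session.addOutput(videoDataOutput) }
        session.commitConfiguration()
    }

    // Called for every frame; this is where the CoreImage work happens.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
        guard let processed = filter.outputImage else { return }
        // Render the processed image for the preview / still capture.
        _ = ciContext.createCGImage(processed, from: processed.extent)
    }
}
```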
When I tried to implement video recording by writing the sample buffers to a video file as I received them from the AVCaptureVideoDataOutput, the result had a very slow frame rate (probably because of the other image-related processing going on at the same time).
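The recording path is essentially an AVAssetWriter that I feed the filtered pixel buffers into from the data-output callback. Again a simplified sketch with my own names, not the exact code:

```swift
import AVFoundation

// Sketch of the writer side; startWriting()/startSession(atSourceTime:)
// are called elsewhere when the user taps record.
final class FrameRecorder {
    let assetWriter: AVAssetWriter
    let writerInput: AVAssetWriterInput
    let adaptor: AVAssetWriterInputPixelBufferAdaptor

    init(outputURL: URL, width: Int, height: Int) throws {
        assetWriter = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
        writerInput = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        writerInput.expectsMediaDataInRealTime = true
        adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
                                                       sourcePixelBufferAttributes: nil)
        assetWriter.add(writerInput)
    }

    // Called from captureOutput(_:didOutput:from:) with each filtered pixel buffer.
    func append(_ pixelBuffer: CVPixelBuffer, at time: CMTime) {
        // When the CoreImage work can't keep up, this guard is where frames get dropped.
        guard writerInput.isReadyForMoreMediaData else { return }
        adaptor.append(pixelBuffer, withPresentationTime: time)
    }
}
```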
So I was wondering: is it possible to have an AVCaptureVideoDataOutput and an AVCaptureMovieFileOutput running on the same AVCaptureSession simultaneously?
I gave it a quick go and found that when I added the extra output, my AVCaptureVideoDataOutput stopped receiving frames.
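The attempt was basically just adding both outputs to the same session, something like this (simplified, input and delegate setup omitted):

```swift
import AVFoundation

// What I tried, roughly: both outputs on one session.
func configureSession() -> AVCaptureSession {
    let session = AVCaptureSession()
    let videoDataOutput = AVCaptureVideoDataOutput()
    let movieFileOutput = AVCaptureMovieFileOutput()

    session.beginConfiguration()
    // Camera input setup omitted; it's the same as before.
    if session.canAddOutput(videoDataOutput) {
        session.addOutput(videoDataOutput)
    }
    if session.canAddOutput(movieFileOutput) {
        session.addOutput(movieFileOutput)
    }
    session.commitConfiguration()
    // After this, captureOutput(_:didOutput:from:) no longer gets called.
    return session
}
```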
If I can get this working, I'm hoping I can simply use the second output to record video at the full frame rate, and do the post-processing on the video after the user has stopped recording.
Any help will be greatly appreciated.