While trying to apply a simple vignette filter to the raw camera feed of an iPhone 6, using Metal and Core Image, I see a lot of lag between the frames being processed and rendered in an MTKView.
The approach I have followed is (MetalViewController.swift):
- Get the raw camera output using AVCaptureVideoDataOutputSampleBufferDelegate.
- Convert the CMSampleBuffer to a CVPixelBuffer, and then to a CGImage.
- Create an MTLTexture from that CGImage.

Steps 2 and 3 happen inside a method named fillMTLTextureToStoreTheImageData (sketched after the draw method below).

- Apply a CIFilter to the CIImage fetched from the MTLTexture, inside the MTKViewDelegate's draw method:
func draw(in view: MTKView) {
    // Bail out if any of the pieces needed for this frame are unavailable.
    guard let currentDrawable = view.currentDrawable,
          let commandBuffer = self.commandQueue.makeCommandBuffer(),
          let myTexture = self.sourceTexture,
          let inputImage = CIImage(mtlTexture: myTexture, options: nil) else { return }

    self.vignetteEffect.setValue(inputImage, forKey: kCIInputImageKey)
    guard let outputImage = self.vignetteEffect.outputImage else { return }

    // Render the filtered image straight into the drawable's texture.
    self.coreImageContext.render(outputImage,
                                 to: currentDrawable.texture,
                                 commandBuffer: commandBuffer,
                                 bounds: inputImage.extent,
                                 colorSpace: self.colorSpace)
    commandBuffer.present(currentDrawable)
    commandBuffer.commit()
}
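For reference, the CMSampleBuffer > CVPixelBuffer > CGImage > MTLTexture conversion in fillMTLTextureToStoreTheImageData looks roughly like this (a simplified sketch rather than the exact method body; ciContext and textureLoader stand in for properties the real class would hold):

import AVFoundation
import CoreImage
import MetalKit

func fillMTLTextureToStoreTheImageData(from sampleBuffer: CMSampleBuffer) -> MTLTexture? {
    // Step 2a: pull the pixel buffer out of the sample buffer.
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }

    // Step 2b: render the pixel buffer into a CGImage. This forces a
    // GPU -> CPU -> GPU round trip on every frame, which is expensive.
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    guard let cgImage = ciContext.createCGImage(ciImage, from: ciImage.extent) else { return nil }

    // Step 3: upload the CGImage into a Metal texture.
    return try? textureLoader.newTexture(cgImage: cgImage, options: nil)
}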
The performance is nowhere near what Apple describes in this document: https://developer.apple.com/library/archive/documentation/GraphicsImaging/Conceptual/CoreImaging/ci_tasks/ci_tasks.html#//apple_ref/doc/uid/TP30001185-CH3-TPXREF101
Am I missing something?
The lag is most likely because you are creating a CGImage from the CVPixelBuffer and then the Metal texture from that, rather than going straight from CVPixelBuffer to MTLTexture. You can use CVMetalTextureCache to do that. – Absa
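A minimal sketch of the CVMetalTextureCache route that comment suggests, assuming the capture output is configured for kCVPixelFormatType_32BGRA and that device is your MTLDevice (the names here are illustrative):

import CoreVideo
import Metal

// Create the cache once (e.g. during setup), not per frame.
var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

func makeTexture(from pixelBuffer: CVPixelBuffer) -> MTLTexture? {
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    var cvTexture: CVMetalTexture?
    // Wraps the pixel buffer's backing memory in a Metal texture:
    // no copy, and no CPU round trip through CGImage.
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                              textureCache!,
                                              pixelBuffer,
                                              nil,
                                              .bgra8Unorm, // matches 32BGRA camera output
                                              width,
                                              height,
                                              0,
                                              &cvTexture)
    guard let cvTexture = cvTexture else { return nil }
    return CVMetalTextureGetTexture(cvTexture)
}

In the capture delegate, CMSampleBufferGetImageBuffer still supplies the CVPixelBuffer to feed into this, but the CGImage step disappears entirely, which removes the per-frame CPU round trip.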