I have been following Apple's live-stream camera editor sample code to get the hang of live video editing.
So far so good, but now I need a way to crop a sample buffer into four pieces and then process each piece with a different CIFilter. For instance, if the frame is 1000x1000, I want to crop the CMSampleBuffer into four images of 500x500, apply a unique filter to each, convert the result back to a CMSampleBuffer, and display it on a Metal view (there is a CIImage sketch of the kind of split I mean after the code below). Here is the code I have so far: it crops the CMSampleBuffer into a CGContext, but I could not convert the result back to a CMSampleBuffer:
import UIKit
import AVFoundation

let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)
let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)!
let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
let cropWidth = 640
let cropHeight = 640
// Origin of the crop; the byte offset assumes a 32-bit BGRA buffer (4 bytes per pixel)
let cropX = 0
let cropY = 0
let cropAddress = baseAddress + cropY * bytesPerRow + cropX * 4
let colorSpace = CGColorSpaceCreateDeviceRGB()
// Keep the source buffer's full bytesPerRow so each row of the crop is read at the correct stride
let context = CGContext(data: cropAddress, width: cropWidth, height: cropHeight, bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)
// create image -- makeImage() has to run before unlocking, because the context reads straight from the buffer's memory
let cgImage: CGImage = context!.makeImage()!
let image = UIImage(cgImage: cgImage)
CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)
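For reference, here is the kind of four-way split I am after, sketched with CIImage instead of CGContext. The rects assume the 1000x1000 example above, and the filter names are just placeholders I picked; note that Core Image uses a bottom-left origin:

import CoreImage

// Four 500x500 quadrant rects for a 1000x1000 frame
// (Core Image coordinates have their origin at the bottom-left)
let quadrants = [
    CGRect(x: 0,   y: 0,   width: 500, height: 500),
    CGRect(x: 500, y: 0,   width: 500, height: 500),
    CGRect(x: 0,   y: 500, width: 500, height: 500),
    CGRect(x: 500, y: 500, width: 500, height: 500)
]
// Placeholder filters, one per quadrant
let filterNames = ["CIPhotoEffectMono", "CIPhotoEffectChrome", "CISepiaTone", "CIPhotoEffectNoir"]

let fullImage = CIImage(cvPixelBuffer: imageBuffer)
let filteredQuadrants: [CIImage] = zip(quadrants, filterNames).map { rect, name in
    let filter = CIFilter(name: name)!
    filter.setValue(fullImage.cropped(to: rect), forKey: kCIInputImageKey)
    return filter.outputImage!
}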
Either way, I don't need a CGImage; I need either a CMSampleBuffer or a CVImageBuffer so I can pass it to the func render(pixelBuffer: CVPixelBuffer) -> CVPixelBuffer? method of the FilterRenderer class used in Apple's sample code mentioned above.
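From what I have gathered so far, the way back is probably to create a fresh CVPixelBuffer and render the filtered CIImage into it with a CIContext. This is my rough attempt (the pixelBuffer(from:) helper is just my own name, and I am not sure it is appropriate for a live capture pipeline, with buffer pools, IOSurface backing, and so on):

import CoreImage
import CoreVideo

let ciContext = CIContext()  // should probably be created once and reused across frames

func pixelBuffer(from image: CIImage) -> CVPixelBuffer? {
    var output: CVPixelBuffer?
    // nil attributes for brevity; a real pipeline may need IOSurface/Metal compatibility attributes
    let status = CVPixelBufferCreate(kCFAllocatorDefault,
                                     Int(image.extent.width),
                                     Int(image.extent.height),
                                     kCVPixelFormatType_32BGRA,
                                     nil,
                                     &output)
    guard status == kCVReturnSuccess, let buffer = output else { return nil }
    // Move the crop's extent back to (0, 0) so it fills the new buffer
    let normalized = image.transformed(by: CGAffineTransform(translationX: -image.extent.origin.x,
                                                             y: -image.extent.origin.y))
    ciContext.render(normalized, to: buffer)
    return buffer
}

Is something like this the right direction, or is there a better way to get from the four filtered CIImages back to a single CVPixelBuffer or CMSampleBuffer that render(pixelBuffer:) can consume?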