I have a CVPixelBufferRef from an AVAsset. I'm trying to apply a CIFilter to it. I use these lines:
CVPixelBufferRef pixelBuffer = ...;            // frame from the asset
CVPixelBufferRef newPixelBuffer = ...;         // empty pixel buffer to fill
CIContext *context = ...;                      // CIContext created from an EAGLContext
CGAffineTransform preferredTransform = ...;    // the AVAsset track's preferredTransform

CIImage *phase1 = [CIImage imageWithCVPixelBuffer:pixelBuffer];
CIImage *phase2 = [phase1 imageByApplyingTransform:preferredTransform];
CIImage *phase3 = [self applyFiltersToImage:phase2];

CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
[context render:phase3 toCVPixelBuffer:newPixelBuffer bounds:phase3.extent colorSpace:rgb];
CGColorSpaceRelease(rgb); // avoid leaking the color space on every frame
Unfortunately, the result I get has an incorrect orientation: for example, a video captured in portrait mode comes out upside down. I guess the problem is in going from the AVAsset coordinate system to the CoreImage coordinate system (previewing phase2 in Xcode also shows an incorrect result). How can I fix it?