I'm using the AVFoundation framework. In my sample buffer delegate I have the following code:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Grab the pixel buffer for this frame and wrap it in a CIImage
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pb];

    // Try to display the frame
    self.imageView.image = [UIImage imageWithCIImage:ciImage];
}
I can run the face detector etc. on the CIImage without any problems (sketched below for reference), but the image never shows up in the UIImageView: the image view just stays white. Any ideas what the problem is? I am using the following to set up my session:
self.session = [[AVCaptureSession alloc] init];
self.session.sessionPreset = AVCaptureSessionPreset640x480;

// Camera input
self.videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:self.videoDevice error:nil];

// Video data output delivering BGRA frames
self.frameOutput = [[AVCaptureVideoDataOutput alloc] init];
self.frameOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                              forKey:(id)kCVPixelBufferPixelFormatTypeKey];
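For reference, the face detection that does work on those CIImages looks roughly like this (simplified from my real code; I actually create the detector once and reuse it, and the accuracy option here is just an example):

CIDetector *faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow
                                                                                  forKey:CIDetectorAccuracy]];
// Returns a CIFaceFeature for every face found in the frame
NSArray *features = [faceDetector featuresInImage:ciImage];
for (CIFaceFeature *face in features) {
    NSLog(@"found face at %@", NSStringFromCGRect(face.bounds));
}

So the frames themselves seem fine; only displaying them fails.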