Why does capturing images with AVFoundation give me 480x640 images when the preset is 640x480?
I have some pretty basic code to capture a still image using AVFoundation.

    AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];

    AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];

    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
                                    AVVideoCodecJPEG, AVVideoCodecKey,
                                    nil];


    [newStillImageOutput setOutputSettings:outputSettings];
    [outputSettings release];


    AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];
    [newCaptureSession beginConfiguration];
    newCaptureSession.sessionPreset = AVCaptureSessionPreset640x480;

    [newCaptureSession commitConfiguration];
    if ([newCaptureSession canAddInput:newVideoInput]) {
        [newCaptureSession addInput:newVideoInput];
    }
    if ([newCaptureSession canAddOutput:newStillImageOutput]) {
        [newCaptureSession addOutput:newStillImageOutput];
    }
    self.stillImageOutput = newStillImageOutput;
    self.videoInput = newVideoInput;
    self.captureSession = newCaptureSession;

    [newStillImageOutput release];
    [newVideoInput release];
    [newCaptureSession release];

My method that captures the still image is also pretty simple, and it prints out the orientation, which is AVCaptureVideoOrientationPortrait:

- (void) captureStillImage
{
    AVCaptureConnection *stillImageConnection = [AVCamUtilities connectionWithMediaType:AVMediaTypeVideo fromConnections:[[self stillImageOutput] connections]];

    // 'orientation' is an ivar holding the current AVCaptureVideoOrientation
    // (AVCaptureVideoOrientationPortrait in this case).
    if ([stillImageConnection isVideoOrientationSupported]) {
        NSLog(@"isVideoOrientationSupported - orientation = %d", orientation);
        [stillImageConnection setVideoOrientation:orientation];
    }

    [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:stillImageConnection
                                                         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
            if (error) { /* HANDLE */ }
        };

        if (imageDataSampleBuffer != NULL) {
            CFDictionaryRef exifAttachments = CMGetAttachment(imageDataSampleBuffer, kCGImagePropertyExifDictionary, NULL);
            if (exifAttachments) {
                NSLog(@"attachments: %@", exifAttachments);
            } else {
                NSLog(@"no attachments");
            }

            self.stillImageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
            self.stillImage = [UIImage imageWithData:self.stillImageData];

            UIImageWriteToSavedPhotosAlbum(self.stillImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
        } else {
            completionBlock(nil, error);
        }
    }];
}

So the device understands it's in portrait mode, as it should be, and the EXIF attachments show me:

PixelXDimension = 640;
PixelYDimension = 480;

So it seems to know that we're at 640x480, and that means width x height (obviously...)

However, when I email the photo to myself from Apple's Photos app, I get a 480x640 image if I check the properties in Preview. This didn't make any sense to me until I dug further into the image properties and found that the image orientation is set to "6 (Rotated 90 degrees CCW)". I'm sure CCW means counterclockwise.

So, looking at the image in a browser: http://tonyamoyal.com/stuff/things_that_make_you_go_hmm/photo.JPG we see the image rotated 90 degrees CCW, and it is 640x480.

I'm really confused about this behavior. When I take a 640x480 still image using AVFoundation, I would expect the default to have no rotated orientation. I expect a 640x480 image oriented exactly as my eye sees the image in the preview layer. Can someone explain why this is happening and how to configure the capture so that when I save my image to the server to later display in a web view, it is not rotated 90 degrees CCW?

Imaret answered 7/8, 2011 at 22:26

This happens because the orientation set in the metadata of the new image is being affected by the orientation of the AV system that creates it. The layout of the actual image data is, of course, different from the orientation mentioned in your metadata. Some image viewing programs respect the metadata orientation, some ignore it.
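
If you want to see what the metadata actually claims, you can read the orientation tag back out of the JPEG data with ImageIO before the file goes anywhere. This is only a sketch; it assumes `stillImageData` is the NSData produced by jpegStillImageNSDataRepresentation: in the question's code.

#import <ImageIO/ImageIO.h>

    // Inspect the file-level metadata of the captured JPEG.
    CGImageSourceRef source = CGImageSourceCreateWithData((CFDataRef)stillImageData, NULL);
    if (source) {
        NSDictionary *props = (NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
        // The top-level orientation tag is what Preview reports (6 in this case);
        // the pixel width/height are the layout of the stored data (640x480).
        NSLog(@"pixels: %@ x %@, orientation tag: %@",
              [props objectForKey:(NSString *)kCGImagePropertyPixelWidth],
              [props objectForKey:(NSString *)kCGImagePropertyPixelHeight],
              [props objectForKey:(NSString *)kCGImagePropertyOrientation]);
        [props release];
        CFRelease(source);
    }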

You can affect the metadata orientation of the AV system by calling:

AVCaptureConnection *videoConnection = ...;
if ([videoConnection isVideoOrientationSupported])
    [videoConnection setVideoOrientation:AVCaptureVideoOrientationSomething];
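
If you are not using the AVCamUtilities helper from the question, one way to get at that connection is to walk the still image output's connections yourself. A rough sketch, assuming `stillImageOutput` is the configured AVCaptureStillImageOutput:

    // Find the video connection on the still image output (no AVCamUtilities).
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in [stillImageOutput connections]) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) break;
    }

    if ([videoConnection isVideoOrientationSupported]) {
        // Per the behavior described here, this changes the orientation tag
        // written into the JPEG; it does not rotate the pixel data itself.
        [videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    }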

You can affect the metadata orientation of a UIImage by calling:

UIImage *rotatedImage = [[UIImage alloc] initWithCGImage:image.CGImage scale:1.0f orientation:UIImageOrientationSomething];

But the actual data from the AVCapture system will always appear with the wider dimension as X and the narrower dimension as Y, and will appear to be oriented in LandscapeLeft.
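
You can see both halves of that in one place by comparing what UIImage reports with what its backing CGImage actually holds. Again just a sketch, assuming `stillImageData` is the captured JPEG from the question:

    UIImage *captured = [UIImage imageWithData:stillImageData];
    // UIImage applies the orientation flag, so size comes back portrait
    // (e.g. 480x640 with imageOrientation UIImageOrientationRight)...
    NSLog(@"UIImage size: %@, orientation: %d",
          NSStringFromCGSize(captured.size), (int)captured.imageOrientation);
    // ...while the underlying bitmap is still landscape (640 x 480).
    NSLog(@"CGImage pixels: %zu x %zu",
          CGImageGetWidth(captured.CGImage), CGImageGetHeight(captured.CGImage));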

If you want the actual data to line up with what your metadata claims, you need to modify the actual data. You can do this by rendering the image into a new one using CGContexts and affine transforms. Or there is an easier workaround: use the UIImage+Resize package as discussed here, and resize the image to its current size by calling:

UIImage *rotatedImage = [image resizedImage:CGSizeMake(image.size.width, image.size.height) interpolationQuality:kCGInterpolationDefault];

This will rectify the data's orientation as a side effect.

If you don't want to include the whole UIImage+Resize package, you can check out its code and strip out the parts where the data is transformed. A minimal version of that idea is sketched below.
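
The stripped-down version is essentially one redraw through UIKit, because drawing applies imageOrientation for you and leaves the result with UIImageOrientationUp. This is only a sketch (the method name is made up), not the category's actual implementation:

    // Hypothetical helper: redraw the image so the pixel data matches the
    // reported orientation, leaving the result with UIImageOrientationUp.
    - (UIImage *)normalizedImage:(UIImage *)image
    {
        if (image.imageOrientation == UIImageOrientationUp)
            return image; // pixels already match the metadata

        UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
        [image drawInRect:CGRectMake(0.0f, 0.0f, image.size.width, image.size.height)];
        UIImage *normalized = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return normalized;
    }

Upload the result of that instead of the raw captured image and a web view will get pixels that are already upright.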

Intermingle answered 10/8, 2011 at 4:43
Beautiful! Works like a charm. Delighted
