The OS X app I'm writing takes a photo using the MacBook's built-in FaceTime camera.
On a MacBookAir3,2, a MacBookPro8,2 and a MacBookPro10,2 it works fine, but on newer MacBooks it takes "dark" photos. I understand this has to do with auto exposure, but I'm having trouble getting it to work: the AVCaptureDevice's adjustingExposure property is NO, yet the captured photo is still completely dark.
The code: setupCamera is called once during app launch.
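For reference, session, device, output and connection are instance variables; they are declared along these lines (the class name here is just a placeholder, not my real one):

#import <AVFoundation/AVFoundation.h>

@interface CameraController : NSObject  // placeholder class name
{
    AVCaptureSession *session;
    AVCaptureDevice *device;
    AVCaptureStillImageOutput *output;
    AVCaptureConnection *connection;
    BOOL sessionInitialized;
}
@end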
-(void) setupCamera
{
    session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetPhoto;
    sessionInitialized = YES;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Switch to continuous auto exposure/focus/white balance where supported.
    if ([device lockForConfiguration:NULL]) {
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance])
            [device setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        [device unlockForConfiguration];
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (error != nil) {
        // ...
    }
    if ([session canAddInput:input]) {
        [session addInput:input];
    } else {
        // ...
    }

    // JPEG still image output.
    output = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
    [output setOutputSettings:outputSettings];
    if ([session canAddOutput:output]) {
        [session addOutput:output];
    } else {
        // ...
    }
}
... then each click of the snap button in the UI calls shootPhoto:
-(void) shootPhoto
{
    [session startRunning];

    // Re-assert the continuous auto modes before capturing.
    if ([device lockForConfiguration:NULL]) {
        if ([device isExposureModeSupported:AVCaptureExposureModeContinuousAutoExposure])
            [device setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
        if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus])
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isWhiteBalanceModeSupported:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance])
            [device setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
        [device unlockForConfiguration];
    }

    // Capture immediately if nothing is adjusting; otherwise wait for KVO.
    if (device.adjustingFocus == NO && device.adjustingExposure == NO && device.adjustingWhiteBalance == NO) {
        [self actuallyCapture];
    } else {
        [device addObserver:self forKeyPath:@"adjustingExposure" options:NSKeyValueObservingOptionNew context:MyAdjustingExposureObservationContext];
        [device addObserver:self forKeyPath:@"adjustingFocus" options:NSKeyValueObservingOptionNew context:MyAdjustingFocusObservationContext];
        [device addObserver:self forKeyPath:@"adjustingWhiteBalance" options:NSKeyValueObservingOptionNew context:MyAdjustingWhiteBalanceObservationContext];
    }
}
-(void) actuallyCapture
{
    if ([session isRunning] == NO)
        return;

    connection = [output connectionWithMediaType:AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection:connection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
        // save file etc ...
    }];
}
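For completeness, the elided completion handler body converts the sample buffer to JPEG data and writes it out, roughly along these lines (the output path here is just a placeholder):

if (imageDataSampleBuffer != NULL && error == nil) {
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentationForJPEGSampleBuffer:imageDataSampleBuffer];
    [jpegData writeToFile:@"/tmp/snap.jpg" atomically:YES]; // placeholder path
}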
The idea is to check whether the camera device is adjusting focus, exposure, or white balance. If it isn't, call actuallyCapture right away; if it is, add observers and call actuallyCapture from observeValueForKeyPath: once the adjusting finishes.
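The observer callback looks roughly like this (a sketch for completeness; the context values are static file-scope markers, and the observers are removed again before capturing):

static void *MyAdjustingExposureObservationContext = &MyAdjustingExposureObservationContext;
static void *MyAdjustingFocusObservationContext = &MyAdjustingFocusObservationContext;
static void *MyAdjustingWhiteBalanceObservationContext = &MyAdjustingWhiteBalanceObservationContext;

-(void) observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
{
    if (context == MyAdjustingExposureObservationContext ||
        context == MyAdjustingFocusObservationContext ||
        context == MyAdjustingWhiteBalanceObservationContext) {
        // Once nothing is adjusting any more, stop observing and take the shot.
        if (!device.adjustingFocus && !device.adjustingExposure && !device.adjustingWhiteBalance) {
            [device removeObserver:self forKeyPath:@"adjustingExposure"];
            [device removeObserver:self forKeyPath:@"adjustingFocus"];
            [device removeObserver:self forKeyPath:@"adjustingWhiteBalance"];
            [self actuallyCapture];
        }
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}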
The problem is that the addObserver branch is never taken, because the device reports NO for all three adjusting* properties, so actuallyCapture runs right away - and the captured photo is still dark.
What might be the reason? Am I waiting for the exposure and white balance adjustments correctly?
It's hard for me to debug because I only own the devices that work fine.