The issue I'm running into is that when a user takes a photo with our app using AVCaptureSession, I have no way of determining whether they took it in Portrait or Landscape mode. Our app only supports Portrait, and I keep Orientation Lock on when using my phone, so I'm trying to build a solution that assumes others might do the same.
I looked into using [[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications], but when Orientation Lock is on, no notifications are ever received. I know this functionality is possible because the stock Camera app and the camera in the Google Hangouts app can detect rotation (the Cancel and Flash buttons visibly animate) even when my phone has Orientation Lock on.
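For reference, this is roughly the notification-based setup I tried; the selector name is just illustrative. It works until Orientation Lock is engaged, at which point the observer simply stops firing:

```objc
// Ask UIDevice to start posting UIDeviceOrientationDidChangeNotification.
[[UIDevice currentDevice] beginGeneratingDeviceOrientationNotifications];

// Observe orientation changes; this never fires while Orientation Lock is on.
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(deviceOrientationDidChange:)
                                             name:UIDeviceOrientationDidChangeNotification
                                           object:nil];
```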
Is my best bet to use the accelerometer and detect the angle the phone is rotated to? An old answer, Detect iPhone screen orientation, makes it clear that detecting the angle that way is easy to do (adapting the answer to use Core Motion instead of the deprecated UIAccelerometer), but I'm curious whether there is another way to do it.
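For what it's worth, here is a minimal sketch of the Core Motion route I'm considering. The `motionManager` and `capturedOrientation` properties, the update interval, and the axis-to-orientation mapping are my own assumptions:

```objc
#import <CoreMotion/CoreMotion.h>

// Keep the manager in a property so it isn't deallocated while updates are running.
self.motionManager = [[CMMotionManager alloc] init];
self.motionManager.accelerometerUpdateInterval = 0.2;

[self.motionManager startAccelerometerUpdatesToQueue:[NSOperationQueue mainQueue]
                                         withHandler:^(CMAccelerometerData *data, NSError *error) {
    if (!data) { return; }
    CMAcceleration a = data.acceleration;

    // Gravity dominates whichever axis points down, regardless of Orientation Lock.
    if (fabs(a.y) >= fabs(a.x)) {
        self.capturedOrientation = (a.y <= 0) ? UIDeviceOrientationPortrait
                                              : UIDeviceOrientationPortraitUpsideDown;
    } else {
        self.capturedOrientation = (a.x <= 0) ? UIDeviceOrientationLandscapeLeft
                                              : UIDeviceOrientationLandscapeRight;
    }
}];
```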
Using captureStillImageAsynchronouslyFromConnection, I'm still only able to get orientation = 6. I enabled the Landscape modes on the app, but to no avail. I'm going to try it out in a brand new app to ensure that there isn't a weird setting hidden somewhere that is causing this issue. – Zebadiah
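A constant EXIF value of 6 is consistent with the AVCaptureConnection's videoOrientation never changing, so it may be worth stamping the connection from the tracked orientation right before capturing. A rough sketch, assuming the `stillImageOutput` property and the Core Motion-derived `capturedOrientation` from above:

```objc
AVCaptureConnection *connection =
    [self.stillImageOutput connectionWithMediaType:AVMediaTypeVideo];

if (connection.isVideoOrientationSupported) {
    // Note: device landscape-left maps to video landscape-right, and vice versa.
    switch (self.capturedOrientation) {
        case UIDeviceOrientationLandscapeLeft:
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
            break;
        case UIDeviceOrientationLandscapeRight:
            connection.videoOrientation = AVCaptureVideoOrientationLandscapeLeft;
            break;
        case UIDeviceOrientationPortraitUpsideDown:
            connection.videoOrientation = AVCaptureVideoOrientationPortraitUpsideDown;
            break;
        default:
            connection.videoOrientation = AVCaptureVideoOrientationPortrait;
            break;
    }
}

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
    completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
        if (!sampleBuffer) { return; }
        NSData *jpegData =
            [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:sampleBuffer];
        // jpegData should now carry orientation metadata matching how the phone was held.
    }];
```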