How to improve camera quality in ARKit

I am building an ARKit app where we want to be able to take a photo of the scene. I am finding that the image quality of the ARCamera view on an iPad Pro is not good enough for taking photos.

Standard camera image:

ARCamera image:

I have seen an Apple forum post that mentions this could be specific to the iPad Pro 10.5 and related to a fixed lens position (https://forums.developer.apple.com/message/262950#262950).

Is there a public way to change this setting?

Alternatively, I have tried to use AVCaptureSession to take a normal photo and apply it to sceneView.scene.background.contents, switching out the blurred image for a higher-resolution one at the point the photo is taken, but I can't get AVCapturePhotoOutput to work with ARKit.
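Roughly, the swap I'm attempting looks like this (a simplified sketch with illustrative names, not my exact code; the part I can't get working is producing the photo from AVCapturePhotoOutput while the ARSession is running):

    // Once a higher-resolution photo is available, use it as the scene background
    // so the blurry ARKit video frame is replaced at the moment of capture.
    func applyHighResBackground(_ photo: UIImage, to sceneView: ARSCNView) {
        sceneView.scene.background.contents = photo
    }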

Kleptomania answered 10/10, 2017 at 21:4 Comment(1)
Did you find a solution? I'm having the same trouble with the TrueDepth front camera on iPhone X in an ARFaceTracking session; I would like to take a high-res output still image from the session. Were you able to switch to a normal AVCaptureSession? How long is the pause during the switch? Really want to know the options here...Stamps

Update: Congrats to whoever filed feature requests! In iOS 11.3 (aka "ARKit 1.5"), you can control at least some of the capture settings. And you now get 1080p with autofocus enabled by default.

Check ARWorldTrackingConfiguration.supportedVideoFormats for a list of ARConfiguration.VideoFormat objects, each of which defines a resolution and frame rate. The first in the list is the default (and best) option supported on your current device, so if you just want the best resolution/framerate available you don't have to do anything. (And if you want to step down for performance reasons by setting videoFormat, it's probably better to do that based on array order rather than hardcoding sizes.)
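For example, a minimal sketch (assuming iOS 11.3+ and an ARSCNView named sceneView):

    let configuration = ARWorldTrackingConfiguration()
    let formats = ARWorldTrackingConfiguration.supportedVideoFormats
    // formats[0] is the default (best) for this device; pick a later entry
    // only if you want to trade resolution/frame rate for performance.
    if formats.count > 1 {
        configuration.videoFormat = formats[1]
    }
    sceneView.session.run(configuration)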

Autofocus is on by default in iOS 11.3, so your example picture (with a subject relatively close to the camera) should come out much better. If for some reason you need to turn it off, there's a switch for that.
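That switch lives on the configuration (a sketch, reusing the configuration from above):

    // Autofocus is on by default in iOS 11.3; turn it off only if you need fixed focus.
    configuration.isAutoFocusEnabled = false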


There's still no API for changing the camera settings for the underlying capture session used by ARKit.

According to engineers back at WWDC, ARKit uses a limited subset of camera capture capabilities to ensure a high frame rate with minimal impact on CPU and GPU usage. There's some processing overhead to producing higher quality live video, but there's also some processing overhead to the computer vision and motion sensor integration systems that make ARKit work — increase the overhead too much, and you start adding latency. And for a technology that's supposed to show users a "live" augmented view of their world, you don't want the "augmented" part to lag camera motion by multiple frames. (Plus, on top of all that, you probably want some CPU/GPU time left over for your app to render spiffy 3D content on top of the camera view.)

The situation is the same between iPhone and iPad devices, but you notice it more on the iPad just because the screen is so much larger — 720p video doesn't look so bad on a 4-5" screen, but it looks awful stretched to fill a 10-13" screen. (Luckily you get 1080p by default in iOS 11.3, which should look better.)

The AVCapture system does provide for taking higher resolution / higher quality still photos during video capture, but ARKit doesn't expose its internal capture session in any way, so you can't use AVCapturePhotoOutput with it. (Capturing high resolution stills during a session probably remains a good feature request.)
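If all you need is the frame ARKit is already displaying, one workaround (a sketch, not a replacement for real still capture) is to convert the session's capturedImage pixel buffer to a UIImage. Note this only gives you the same video-resolution frame ARKit renders, and it doesn't handle rotating the image to match the interface orientation:

    import ARKit
    import UIKit

    // Sketch: grab the current ARKit video frame as a UIImage.
    func snapshotCurrentFrame(of session: ARSession) -> UIImage? {
        guard let pixelBuffer = session.currentFrame?.capturedImage else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }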

Mycostatin answered 10/10, 2017 at 23:50 Comment(5)
Thanks rickster. I'll put in a feature request but still keep looking at how I can use the camera to capture the image and switch out the scene backgroundKleptomania
@EdwardFord have you managed to find any solution for this problem?Mature
I haven't yet, but Apple have just announced a new version with improvements to the image resolution: developer.apple.com/news/?id=01242018b Kleptomania
Right, you can set the video capture resolution and frame rate in iOS 11.3, and it defaults to 1080p60 on most devices instead of 720p. But there’s still not a way to get high quality still photos.Mycostatin
1080p is still not good enough for still image output. I need the original 4:3 aspect ratio. The front camera is already 8MP nowadays. 720p and 1080p are crops of the 4:3 sensor output... annoyingStamps

    config.videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats[1]

I had to look for a while to figure out how to set the config, so maybe this will help somebody.
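Note that a hardcoded index will crash if the device exposes fewer formats, so a bounds check is safer (sketch):

    let formats = ARWorldTrackingConfiguration.supportedVideoFormats
    if formats.count > 1 {
        config.videoFormat = formats[1]
    }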

Caritta answered 6/2, 2020 at 10:11 Comment(1)
If for some reason the device does not have that format it will crash, and it also does not mean the formats will be in the order you needBereave

This picks the format with the highest resolution; you can change the comparison so that it picks by highest frame rate instead, etc.

    // Pick the supported format with the largest pixel area (highest resolution).
    if let videoFormat = ARWorldTrackingConfiguration.supportedVideoFormats.max(by: {
        ($0.imageResolution.width * $0.imageResolution.height) < ($1.imageResolution.width * $1.imageResolution.height)
    }) {
        configuration.videoFormat = videoFormat
    }
Maddi answered 31/1, 2022 at 7:33 Comment(0)
