In which cases does ARSCNView.raycastQuery return nil?
In my renderer delegate I create a raycast query from the center of the view to track estimated plane and display a 3D pointer that follows the raycast result.

It is done via view.raycastQuery(from:allowing:alignment:), but it returns nil.

My question is: why? The documentation doesn't say when this function returns a nil value. I understand that the raycast results could be empty, but why would the query itself not be created?

func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard view.session.currentFrame?.camera.trackingState == .normal else {
        return
    }
    DispatchQueue.main.async {
        let center = view.center
        DispatchQueue.global(qos: .userInitiated).async {
            if let query = view.raycastQuery(from: center, allowing: .estimatedPlane, alignment: .any) {
                let results = view.session.raycast(query)
                ...
            }
            else {
                // sometimes it gets here
            }
        }
    }
}
Gout answered 5/2, 2020 at 10:8 Comment(0)

The return type, ARRaycastQuery?, is optional

According to Apple documentation:

The raycastQuery(from:allowing:alignment:) instance method creates a ray that extends in the positive z-direction from the given screen-space point, to determine whether any of the specified targets exist in the physical environment anywhere along the ray. If so, ARKit returns a 3D position where the ray intersects the target.

If a ray doesn't intersect any target, no ARRaycastQuery object is created, so the method returns nil.

func raycastQuery(from point: CGPoint, 
             allowing target: ARRaycastQuery.Target, 
                   alignment: ARRaycastQuery.TargetAlignment) -> ARRaycastQuery?
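In practice (this is an observation, not something Apple documents), the query can also come back nil when ARKit has no camera frame to build the ray from, for example right after the session starts. A minimal defensive sketch, assuming `sceneView` is a configured, running ARSCNView:

```swift
import ARKit
import SceneKit

// Hedged sketch: bail out early when a query can't be created.
func raycastCenter(of sceneView: ARSCNView) -> ARRaycastResult? {
    // A missing frame is one plausible reason raycastQuery(from:allowing:alignment:)
    // returns nil, so check it first (assumption, not documented behavior).
    guard sceneView.session.currentFrame != nil else { return nil }

    // Use the midpoint of the view's own bounds, which is in the
    // view's coordinate space (unlike view.center, which is in the superview's).
    let center = CGPoint(x: sceneView.bounds.midX, y: sceneView.bounds.midY)

    guard let query = sceneView.raycastQuery(from: center,
                                             allowing: .estimatedPlane,
                                             alignment: .any) else {
        return nil   // the query itself could not be created
    }
    return sceneView.session.raycast(query).first
}
```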

What are two possible reasons for returning nil?

  • There's no detected plane that the ray hits
  • The code's logic is wrong
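These two cases can be told apart in code: a nil query means the query itself could not be created, while an empty result array means the query was valid but the ray hit nothing. A sketch, assuming a configured `sceneView` outlet and a screen point `point`:

```swift
import ARKit

// Hypothetical helper to illustrate the two distinct nil/empty cases.
func diagnoseRaycast(in sceneView: ARSCNView, at point: CGPoint) {
    if let query = sceneView.raycastQuery(from: point,
                                          allowing: .estimatedPlane,
                                          alignment: .any) {
        let results = sceneView.session.raycast(query)
        if results.isEmpty {
            // Case 1: the query was valid, but no detected surface was hit.
        } else {
            // Success: use results.first!.worldTransform here.
        }
    } else {
        // Case 2: the query itself could not be created —
        // the situation the question asks about.
    }
}
```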

Solution

Here's how your code might look:

@IBOutlet var sceneView: ARSCNView!

@IBAction func onTap(_ sender: UIGestureRecognizer) {

    let tapLocation: CGPoint = sender.location(in: sceneView)
    let estimatedPlane: ARRaycastQuery.Target = .estimatedPlane      
    let alignment: ARRaycastQuery.TargetAlignment = .any

    let query: ARRaycastQuery? = sceneView.raycastQuery(from: tapLocation,
                                                    allowing: estimatedPlane,
                                                   alignment: alignment)
    
    if let nonOptQuery: ARRaycastQuery = query {

        let result: [ARRaycastResult] = sceneView.session.raycast(nonOptQuery)
        
        guard let rayCast: ARRaycastResult = result.first
        else { return }

        self.loadGeometry(rayCast)
    }
}
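For the question's original use case (a pointer that keeps following the surface), ARKit also offers session.trackedRaycast(_:updateHandler:), which repeats the query automatically as scene understanding improves, instead of re-raycasting every frame. A hedged sketch; `pointerNode` is an assumed SCNNode you manage yourself:

```swift
import ARKit
import SceneKit

var activeRaycast: ARTrackedRaycast?

func startTrackedRaycast(at point: CGPoint) {
    guard let query = sceneView.raycastQuery(from: point,
                                             allowing: .estimatedPlane,
                                             alignment: .any) else { return }

    // ARKit invokes the handler whenever the refined result changes.
    activeRaycast = sceneView.session.trackedRaycast(query) { [weak self] results in
        guard let result = results.first else { return }
        self?.pointerNode.simdWorldTransform = result.worldTransform
    }
}

// Call activeRaycast?.stopTracking() when the pointer is no longer needed.
```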

And here's a method for getting a model:

func loadGeometry(_ result: ARRaycastResult) {

    // Avoid force-unwrapping: bail out if the scene or node is missing.
    guard let scene = SCNScene(named: "art.scnassets/myScene.scn"),
          let node = scene.rootNode.childNode(withName: "model",
                                              recursively: true)
    else { return }

    node.position = SCNVector3(result.worldTransform.columns.3.x,
                               result.worldTransform.columns.3.y,
                               result.worldTransform.columns.3.z)

    self.sceneView.scene.rootNode.addChildNode(node)
}
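As a design note, the three columns.3 reads can be collapsed: SCNNode exposes simdWorldTransform, so the full pose (position and orientation) of the raycast result can be applied in one assignment. A sketch, with `place(_:at:)` being a hypothetical helper name:

```swift
import ARKit
import SceneKit

func place(_ node: SCNNode, at result: ARRaycastResult) {
    sceneView.scene.rootNode.addChildNode(node)
    // worldTransform is a simd_float4x4; assigning it to simdWorldTransform
    // sets the node's position and rotation together.
    node.simdWorldTransform = result.worldTransform
}
```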
Nightlong answered 27/3, 2020 at 12:25 Comment(3)
As you said, there's no detected plane where the ray hits. But the raycast query has to be run before we can know there are no detected hits (aka raycast results). You talk about results; I talk about the query. Plus, sceneView.session.raycast(query!) may crash due to the force unwrap because query can be nil, which is what I want to know: why may a query be nil?Gout
I've corrected my answer. It's the raycastQuery() method that determines whether any target exists in the physical environment, not the raycast() method. That's why it may be an ARRaycastQuery or nil.Nightlong
Hey @AndyJazz, I've been digging around for information on whether we can raycast from the front camera in an ARFaceTrackingConfiguration. In my tests, code that returns results in a world-tracking configuration always returns nil in ARFaceTrackingConfiguration. Do you know if it's simply not supported on the front camera? There has to be a way to get access to depth data at a specific pixel, like a converted gesture tap, right? Thank you, your post here was helpful and I'm hoping you have this answer, too.Stancil
