LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

Task

I would like to capture a real-world texture and apply it to a reconstructed mesh produced with the help of a LiDAR scanner. I suppose that Projection-View-Model matrices should be used for that. The texture must be made from a fixed point of view, for example, from the center of a room. However, an ideal solution would be to apply the environmentTexturing data, collected as a cube-map texture of the scene.


Look at the 3D Scanner App. It's a reference app that allows us to export a model with its texture.

I need to capture the texture in a single pass; I do not need to update it in real time. I realize that changing the point of view leads to an incorrect perception of the texture, in other words, texture distortion. I also realize that RealityKit uses dynamic tessellation and automatic texture mipmapping (the texture's resolution depends on the distance from which it was captured).

import UIKit
import RealityKit
import ARKit
import Metal
import ModelIO

class ViewController: UIViewController, ARSessionDelegate {
    
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.session.delegate = self
        // Visualize the reconstructed mesh for debugging.
        arView.debugOptions.insert(.showSceneUnderstanding)

        // LiDAR-based scene reconstruction with automatic environment texturing.
        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh
        config.environmentTexturing = .automatic
        arView.session.run(config)
    }
}
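
To make the fixed point-of-view idea above more concrete, here is a minimal, hedged sketch of how I imagine projecting a world-space mesh vertex into the current camera image to derive a UV coordinate. The function name and the vertexWorldPosition parameter are hypothetical placeholders, not part of any shipped API.

import UIKit
import ARKit
import simd

// Hedged sketch: project a world-space mesh vertex into the current camera
// image and normalize the result to a 0...1 UV coordinate.
// `textureCoordinate(for:in:)` and `vertexWorldPosition` are hypothetical names.
func textureCoordinate(for vertexWorldPosition: simd_float3,
                       in frame: ARFrame) -> simd_float2? {

    let imageSize = frame.camera.imageResolution

    // Project the 3D point into 2D image space; .landscapeRight is the native
    // orientation of the captured camera image.
    let point = frame.camera.projectPoint(vertexWorldPosition,
                                          orientation: .landscapeRight,
                                          viewportSize: imageSize)

    // Discard vertices that fall outside the captured frame.
    guard point.x >= 0, point.y >= 0,
          point.x <= imageSize.width, point.y <= imageSize.height
    else { return nil }

    return simd_float2(Float(point.x / imageSize.width),
                       Float(point.y / imageSize.height))
}

Coordinates derived this way are only valid for geometry visible from that single capture point, which is exactly the fixed-PoV limitation described above.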

Question

  • How can I capture a real-world texture and apply it to a reconstructed 3D mesh?


Personally answered 8/9, 2020 at 12:25 Comment(5)
Have you checked this approach? developer.apple.com/forums/thread/654431 – Woollen
Yep, it's about solid color, not about real-world objects' texture. – Personally
Oh! I misread your question. I thought you wanted to apply classification to the exported mesh. – Woollen
Hi, could you share your source with the Object Capture API added? – Dimpledimwit
@DevSolution, click on the link below. – Personally

Object Reconstruction

On 10 October 2023, Apple released the iOS Reality Composer 1.6 app, which is capable of capturing a real-world model's mesh with texture in real time using the LiDAR scanning process. But at the moment there's still no native programmatic API for that (although we are all looking forward to it).

Also, there's a methodology that allows developers to create textured models from a series of shots.

Photogrammetry

The Object Capture API, announced at WWDC 2021, provides developers with the long-awaited photogrammetry tool. At the output we get a USDZ model with a UV-mapped hi-res texture. To implement the Object Capture API you need macOS 12+ and Xcode 13+.


To create a USDZ model from a series of shots, submit all the captured images to RealityKit's PhotogrammetrySession.

Here's a code snippet that sheds some light on this process:

import RealityKit
import Combine

let pathToImages = URL(fileURLWithPath: "/path/to/my/images/")

let url = URL(fileURLWithPath: "model.usdz")

// Request a medium-detail USDZ model as the session's output.
let request = PhotogrammetrySession.Request.modelFile(url: url, 
                                                   detail: .medium)

var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOverlap = .normal
configuration.sampleOrdering = .unordered
configuration.featureSensitivity = .normal
configuration.isObjectMaskingEnabled = false

// The initializer throws, so bind it with try? inside the guard.
guard let session = try? PhotogrammetrySession(input: pathToImages, 
                                       configuration: configuration)
else { return }

var subscriptions = Set<AnyCancellable>()

session.output.receive(on: DispatchQueue.global())
              .sink(receiveCompletion: { _ in
                  // handle errors here
              }, receiveValue: { _ in
                  // handle the session's output here
              })
              .store(in: &subscriptions)

do {
    try session.process(requests: [request])
} catch {
    print("Can't start processing:", error)
}

You can reconstruct USD and OBJ models with their corresponding UV-mapped textures.
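
If the Combine-style output shown above isn't available in your SDK, the session's results can also be consumed with Swift concurrency through the outputs async sequence. Below is a minimal, hedged sketch of that flow (macOS 12+), reusing the pathToImages and url constants from above with a default configuration; the print statements are placeholders.

import RealityKit

do {
    // A default configuration; tweak sampleOrdering, featureSensitivity, etc. as needed.
    let config = PhotogrammetrySession.Configuration()
    let session = try PhotogrammetrySession(input: pathToImages,
                                    configuration: config)

    // Listen to the session's output stream on a background task.
    Task {
        do {
            for try await output in session.outputs {
                switch output {
                case .requestProgress(_, let fractionComplete):
                    print("Progress:", fractionComplete)
                case .requestComplete(let request, let result):
                    print("Request finished:", request, result)
                case .requestError(let request, let error):
                    print("Request failed:", request, error)
                case .processingComplete:
                    print("All requests are done.")
                default:
                    break
                }
            }
        } catch {
            print("Output stream error:", error)
        }
    }

    // Kick off reconstruction of a medium-detail USDZ model.
    try session.process(requests: [ .modelFile(url: url, detail: .medium) ])
} catch {
    print("Photogrammetry error:", error)
}

When the .requestComplete case arrives, the finished model (with its UV-mapped texture) has been written to the url passed in the request.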

Personally answered 2/10, 2020 at 9:35 Comment(5)
I am really intrigued with where you are going with this, as the idea of converting the ARFrame to a MTLTexture seemed like the most sensible approach. I struggled immensely in trying to figure out how to properly apply the texture coordinates to the model, so that the texture would appear as expected. Did you ever succeed with this? Going off of this question, I can convert the ARMeshGeometry to the SCNGeometry, and it looks proper without a texture, but I never got the coordinates right. – Turnsole
Hi @AndyFedoroff, do you know whether it is possible to export the mesh model using RealityKit? – Crookes
Hi @Randi, here are 3 answers about it, including mine – #61064071 – Personally
I'm very new to the Swift language. I don't know how to export models with textures, any samples please, @AndyFedoroff. – Eighth
Hi @Ramgg, alas, there's no native solution for texture export at the moment. Try Unity with ARMeshManager... – Personally

How it can be done in Unity

I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from LiDAR. At the moment – November 01, 2020 – there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this using the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or just by developers of a small creative startup (and then subsequently bought), but the fact remains.

To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. As you can see in the picture, there are such features as Texture Coordinates, Color and Mesh Density.

[Screenshot: ARMeshManager component settings in the Unity Inspector]

If anyone has more detailed information on how this must be configured or scripted in Unity, please post about it in this thread.

Personally answered 31/10, 2020 at 20:20 Comment(10)
AFAIK texture coordinates are not supported by the ARKit subsystem in AR Foundation (even if there is a checkbox, it probably won't do anything). If anyone knows more please let us know! :) – Tit
@Andy Fedoroff Thanks for your detailed answer. So in the end are you doing this with Unity, or is there any update with ARKit? I have a basic app on the store called PointCloudKit and I'm doing quite some work with point clouds, but there might be easier ways... Mainly I'm reconstructing objects from points, but I might be able to use the mesh. – Cia
@Andy Hi Andy, I wondered if in the end you implemented your solution using Unity, or if anything changed in ARKit since then that allowed users to capture the texture of scanned entities without AR Foundation. – Cia
Alexandre, neither one nor the other. Waiting for RealityKit's update. I'm 100% sure it brings the desired feature) – Personally
It could be introduced at the upcoming WWDC, but you never know with Apple. Has someone managed to get it working like in the Polycam or Forge apps? It'd be valuable for many developers if they shared. – Conquian
Can anyone confirm if texture capture would actually work using Unity? – Labiate
Hi @AndyJazz, any updates? – Linkboy
Hi @ina, there's still no change... – Personally
I assume one of you would have written a plugin for this already! – Linkboy
@ina, you're right; however Apple, with its capabilities and resources, has not been able to release such an API for more than 4 years (and I'm sure that the 3D Scanner app is the brainchild of Cupertino). – Personally
