ARKit – How to export OBJ from iPhone/iPad with LiDAR?
How can I export the ARMeshGeometry generated by the new SceneReconstruction API on the latest iPad Pro to an .obj file?

Here's SceneReconstruction documentation.

Yatzeck answered 6/4, 2020 at 15:42 Comment(0)
Starting with Apple's Visualizing Scene Semantics sample app, you can retrieve the ARMeshGeometry objects from the ARMeshAnchors in the current frame.
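For context, mesh anchors only appear once scene reconstruction is enabled on the session. A minimal sketch (assuming an `arView` outlet and a LiDAR-capable device; the function name is arbitrary):

```swift
import ARKit
import RealityKit

// Sketch: enable scene reconstruction so the session produces ARMeshAnchors.
func startSceneReconstruction(arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    // Only LiDAR-equipped devices support scene reconstruction.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    arView.session.run(configuration)
}
```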

The easiest approach to exporting the data is to first convert it to an MDLMesh:

extension ARMeshGeometry {
    func toMDLMesh(device: MTLDevice) -> MDLMesh {
        let allocator = MTKMeshBufferAllocator(device: device)

        let data = Data(bytes: vertices.buffer.contents(),
                        count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

        let indexData = Data(bytes: faces.buffer.contents(),
                             count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        return MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])
    }
}

Once you have the MDLMesh, exporting to an OBJ file is a breeze:

    @IBAction func exportMesh(_ button: UIButton) {
        guard let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
              !meshAnchors.isEmpty else {
            print("no mesh anchors to export")
            return
        }

        DispatchQueue.global().async {

            let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
            let filename = directory.appendingPathComponent("MyFirstMesh.obj")

            guard let device = MTLCreateSystemDefaultDevice() else {
                print("metal device could not be created")
                return
            }

            let asset = MDLAsset()

            for anchor in meshAnchors {
                let mdlMesh = anchor.geometry.toMDLMesh(device: device)
                asset.add(mdlMesh)
            }

            do {
                try asset.export(to: filename)
            } catch {
                print("failed to write to file: \(error)")
            }
        }
    }
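Note: for the exported file to show up in the iOS Files app, the app's Info.plist typically needs the file-sharing keys enabled. A sketch of the relevant entries (an assumption about your project setup, not part of the original answer):

```xml
<!-- Expose the app's Documents directory in the Files app -->
<key>UIFileSharingEnabled</key>
<true/>
<key>LSSupportsOpeningDocumentsInPlace</key>
<true/>
```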
Tale answered 8/4, 2020 at 15:54 Comment(7)
Hi @swiftcoder! Thank you for your answer. It looks convincing. Have you tested it? Does the OBJ export work? I can't test it because I have no iPad with a LiDAR scanner. – Agate
Yes, I used this code (added to the sample app) to scan objects in my apartment. Note that if you scan a large area, you will end up with multiple mesh anchors, so you need to run this code for each one and add them all to the MDLAsset. – Tale
Thanks @swiftcoder! Where do I place the code let mdlMesh = anchor.geometry.toMDLMesh()... in the example? Did you use an extra IBAction for that? – Cerebroside
Yes, I added a new IBAction (I've updated the answer to include it) and wired it up to an "Export" button in the UI. – Tale
Where exactly is the .obj saved? And how can I access it? – Wight
The sample code saves the .obj file in the documents directory. You should be able to find it in the iOS Files app. – Tale
What sort of size do the files come out as? Are we talking a few MBs or GBs? – Ahrendt
@swiftcoder's answer works great, but in the case of several anchors you need to convert the vertex coordinates to the world coordinate system using each anchor's transform. Otherwise all meshes will be placed at the origin and you will have a mess.

The updated code looks like this:

extension ARMeshGeometry {
    func toMDLMesh(device: MTLDevice, transform: simd_float4x4) -> MDLMesh {
        let allocator = MTKMeshBufferAllocator(device: device)

        let data = Data(bytes: transformedVertexBuffer(transform),
                        count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)

        let indexData = Data(bytes: faces.buffer.contents(),
                             count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)

        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)

        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                            format: .float3,
                                                            offset: 0,
                                                            bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)

        return MDLMesh(vertexBuffer: vertexBuffer,
                       vertexCount: vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])
    }

    func transformedVertexBuffer(_ transform: simd_float4x4) -> [Float] {
        var result = [Float]()
        for index in 0..<vertices.count {
            let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + vertices.stride * index)
            let vertex = vertexPointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
            var vertexTransform = matrix_identity_float4x4
            vertexTransform.columns.3 = SIMD4<Float>(vertex.0, vertex.1, vertex.2, 1)
            let position = (transform * vertexTransform).position
            result.append(position.x)
            result.append(position.y)
            result.append(position.z)
        }
        return result
    }
}

extension simd_float4x4 {
    var position: SIMD3<Float> {
        return SIMD3<Float>(columns.3.x, columns.3.y, columns.3.z)
    }
}

extension Array where Element == ARMeshAnchor {
    func save(to fileURL: URL, device: MTLDevice) throws {
        let asset = MDLAsset()
        self.forEach {
            let mesh = $0.geometry.toMDLMesh(device: device, transform: $0.transform)
            asset.add(mesh)
        }
        try asset.export(to: fileURL)
    }
}

I am not a Model I/O expert, and there may be a simpler way to transform the vertex buffer :) But this code works for me.
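A sketch of how the `save(to:device:)` extension above might be called from a view controller (assuming the same `arView` outlet as the first answer; the action name and `scan.obj` filename are arbitrary):

```swift
import ARKit
import RealityKit
import Metal

// Hypothetical export action wired to a button in the UI.
@IBAction func exportTapped(_ sender: UIButton) {
    guard let frame = arView.session.currentFrame,
          let device = MTLCreateSystemDefaultDevice() else { return }

    let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }
    let url = FileManager.default.urls(for: .documentDirectory,
                                       in: .userDomainMask)[0]
        .appendingPathComponent("scan.obj")

    // Export off the main thread; the OBJ can be large for big scans.
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try meshAnchors.save(to: url, device: device)
            print("Exported to \(url.path)")
        } catch {
            print("Export failed: \(error)")
        }
    }
}
```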

Lafayette answered 20/4, 2020 at 16:40 Comment(5)
Looks great! Can you give us a full example of your ViewController.swift or upload your project to GitHub? – Cerebroside
Sure, Florian, here you are: github.com/alexander-gaidukov/LiDarDetector – Lafayette
This is great. Is there a way to also save the texture for the model? – Nuri
Unfortunately there is no vertex color or texturing support. – Lafayette
I was able to add texture coordinates and export a mesh. I added the method here: https://mcmap.net/q/470047/-ipad-pro-lidar-export-geometry-amp-texture – Corkhill
Exporting LiDAR-reconstructed geometry

This code lets you save the LiDAR-scanned geometry as USD and send it to a Mac via AirDrop. You can export not only .usd but also .usda, .usdc, .obj, .stl, .abc, and .ply file formats.

Additionally, you can use SceneKit's write(to:options:delegate:progressHandler:) method to save a .usdz version of the file.
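That SceneKit route might look like this (a sketch, not the original answer's code; it assumes you convert the exported `MDLAsset` to an `SCNScene` via `SCNScene(mdlAsset:)`, and that `url` ends in `.usdz`):

```swift
import SceneKit
import SceneKit.ModelIO

// Sketch: write a .usdz by round-tripping the MDLAsset through SceneKit.
func exportUSDZ(from asset: MDLAsset, to url: URL) -> Bool {
    let scene = SCNScene(mdlAsset: asset)
    // write(to:) infers the format from the file extension (.usdz here).
    return scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)
}
```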

import RealityKit
import ARKit
import MetalKit
import ModelIO

@IBOutlet var arView: ARView!
var saveButton: UIButton!
let rect = CGRect(x: 50, y: 50, width: 100, height: 50)

override func viewDidLoad() {
    super.viewDidLoad()

    let tui = UIControl.Event.touchUpInside
    saveButton = UIButton(frame: rect)
    saveButton.setTitle("Save", for: [])
    saveButton.addTarget(self, action: #selector(saveButtonTapped), for: tui)
    self.view.addSubview(saveButton)
}

@objc func saveButtonTapped(sender: UIButton) {        
    print("Saving is executing...")
    
    guard let frame = arView.session.currentFrame
    else { fatalError("Can't get ARFrame") }
            
    guard let device = MTLCreateSystemDefaultDevice()
    else { fatalError("Can't create MTLDevice") }
    
    let allocator = MTKMeshBufferAllocator(device: device)        
    let asset = MDLAsset(bufferAllocator: allocator)       
    let meshAnchors = frame.anchors.compactMap { $0 as? ARMeshAnchor }
    
    for ma in meshAnchors {
        let geometry = ma.geometry
        let vertices = geometry.vertices
        let faces = geometry.faces
        let vertexPointer = vertices.buffer.contents()
        let facePointer = faces.buffer.contents()
        
        for vtxIndex in 0 ..< vertices.count {
            
            let vertex = geometry.vertex(at: UInt32(vtxIndex))                
            var vertexLocalTransform = matrix_identity_float4x4
            
            vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.0,
                                                          y: vertex.1,
                                                          z: vertex.2,
                                                          w: 1.0)
            
            let vertexWorldTransform = (ma.transform * vertexLocalTransform).position                
            let vertexOffset = vertices.offset + vertices.stride * vtxIndex               
            let componentStride = vertices.stride / 3
            
            vertexPointer.storeBytes(of: vertexWorldTransform.x,
                           toByteOffset: vertexOffset,
                                     as: Float.self)
            
            vertexPointer.storeBytes(of: vertexWorldTransform.y,
                           toByteOffset: vertexOffset + componentStride,
                                     as: Float.self)
            
            vertexPointer.storeBytes(of: vertexWorldTransform.z,
                           toByteOffset: vertexOffset + (2 * componentStride),
                                     as: Float.self)
        }
        
        let byteCountVertices = vertices.count * vertices.stride            
        let byteCountFaces = faces.count * faces.indexCountPerPrimitive * faces.bytesPerIndex
        
        let vertexBuffer = allocator.newBuffer(with: Data(bytesNoCopy: vertexPointer, 
                                                                count: byteCountVertices, 
                                                          deallocator: .none), type: .vertex)
        
        let indexBuffer = allocator.newBuffer(with: Data(bytesNoCopy: facePointer, 
                                                               count: byteCountFaces, 
                                                         deallocator: .none), type: .index)
        
        let indexCount = faces.count * faces.indexCountPerPrimitive            
        let material = MDLMaterial(name: "material", 
                     scatteringFunction: MDLPhysicallyPlausibleScatteringFunction())
        
        let submesh = MDLSubmesh(indexBuffer: indexBuffer, 
                                  indexCount: indexCount, 
                                   indexType: .uInt32, 
                                geometryType: .triangles, 
                                    material: material)
        
        let vertexFormat = MTKModelIOVertexFormatFromMetal(vertices.format)
        
        let vertexDescriptor = MDLVertexDescriptor()
        
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, 
                                                          format: vertexFormat, 
                                                          offset: 0, 
                                                     bufferIndex: 0)
        
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: ma.geometry.vertices.stride)
        
        let mesh = MDLMesh(vertexBuffer: vertexBuffer, 
                            vertexCount: ma.geometry.vertices.count, 
                             descriptor: vertexDescriptor, 
                              submeshes: [submesh])

        asset.add(mesh)
    }

    let filePath = FileManager.default.urls(for: .documentDirectory, 
                                             in: .userDomainMask).first!
    
    let usd: URL = filePath.appendingPathComponent("model.usd")

    if MDLAsset.canExportFileExtension("usd") {
        do {
            try asset.export(to: usd)
            
            let controller = UIActivityViewController(activityItems: [usd],
                                              applicationActivities: nil)
            controller.popoverPresentationController?.sourceView = sender
            self.present(controller, animated: true, completion: nil)

        } catch let error {
            fatalError(error.localizedDescription)
        }
    } else {
        fatalError("Can't export USD")
    }
}

Tap the Save button, and in the Activity View Controller choose More, then send the ready-to-use model to your Mac's Downloads folder via AirDrop.
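Note that the `geometry.vertex(at:)` call above is not part of the public `ARMeshGeometry` API; it comes from an extension in Apple's Visualizing Scene Semantics sample code (which explains the "no member 'vertex'" error mentioned in the comments). A sketch of that helper:

```swift
import ARKit

// Helper used above; adapted from Apple's sample code.
extension ARMeshGeometry {
    func vertex(at index: UInt32) -> (Float, Float, Float) {
        assert(vertices.format == MTLVertexFormat.float3,
               "Expected three floats (float3) per vertex.")
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + (vertices.stride * Int(index)))
        return pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
    }
}
```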

P.S.

Here you can find extra info on capturing real-world texture.

Agate answered 7/4, 2020 at 5:22 Comment(10)
You can, see @Tale's answer. There is even example code in the documentation of ARMeshGeometry. – Chipper
Having to write a bit of code to do it does not mean you can't. You would say no if Apple were keeping this info for themselves, which is not true in this case. – Chipper
Can you give us a git link for this sample? I am getting the error Value of type 'ARMeshGeometry' has no member 'vertex' when trying to run this code. – Event
You can also export .usdz. – Bentonbentonite
@KeyhanKamangar, what approach/module are you using for exporting .usdz? Xcode 13 still prints false when we call MDLAsset.canExportFileExtension("usdz"). – Agate
@AndyFedoroff I'm not exactly using your method for exporting. I'm using SceneKit's write(to:) method for saving the mesh. – Bentonbentonite
Is the resulting usdz mesh OK? – Agate
@AndyFedoroff Yes, it looks fine, but I don't like the precision of the mesh itself. I don't know if it's because I'm using a 2018 iPad Pro or if it's the same on all devices. – Bentonbentonite
Thanks a lot @KeyhanKamangar, I'll try it. – Agate
Very nice, it works! Don't know why I didn't try this option :) – Agate