iPad Pro LiDAR - Export Geometry & Texture

I would like to be able to export a mesh and texture from the iPad Pro LiDAR.

There are examples here of how to export a mesh, but I'd like to be able to export the environment texture too:

ARKit 3.5 – How to export OBJ from new iPad Pro with LiDAR?

ARMeshGeometry stores the vertices for the mesh; would one have to 'record' the textures while scanning the environment and then apply them manually?

This post seems to show a way to get texture coordinates, but I can't see a way to do that with ARMeshGeometry: Save ARFaceGeometry to OBJ file

Any pointer in the right direction, or things to look at, would be greatly appreciated!

Chris

Chiapas answered 1/5, 2020 at 8:0 Comment(0)

You need to compute a texture coordinate for each vertex, apply the coordinates to the mesh, and supply the camera image as a texture in the mesh's material.

let geom = meshAnchor.geometry
let vertices = geom.vertices   // ARGeometrySource backed by an MTLBuffer
let camera = arFrame.camera
let size = camera.imageResolution

let modelMatrix = meshAnchor.transform

let textureCoordinates = (0..<vertices.count).map { index -> vector_float2 in
    // Read the vertex position out of the buffer (see the ARMeshGeometry
    // extension sketched below) and move it into world space.
    let vertex = geom.vertex(at: UInt32(index))
    let vertex4 = vector_float4(vertex.x, vertex.y, vertex.z, 1)
    let world_vertex4 = simd_mul(modelMatrix, vertex4)
    let world_vector3 = simd_float3(x: world_vertex4.x, y: world_vertex4.y, z: world_vertex4.z)
    // Project the world-space point into the camera image to get UVs.
    let pt = camera.projectPoint(world_vector3,
        orientation: .portrait,
        viewportSize: CGSize(
            width: CGFloat(size.height),
            height: CGFloat(size.width)))
    let v = 1.0 - Float(pt.x) / Float(size.height)
    let u = Float(pt.y) / Float(size.width)
    return vector_float2(u, v)
}
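
Note that geom.vertices is an ARGeometrySource wrapping a Metal buffer, not a Swift array (hence the "no member 'map'" error some readers hit in the comments), so the loop above assumes a small helper that reads a single vertex out of the buffer. A minimal sketch of such an extension (not part of ARKit itself; similar helpers show up in most ARKit mesh-export examples):

import ARKit

extension ARMeshGeometry {
    // Reads a single vertex position out of the vertices buffer.
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == .float3, "Expected three floats per vertex")
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + vertices.stride * Int(index))
        // Load as a packed (Float, Float, Float); SIMD3<Float> has a 16-byte
        // stride, so binding to it directly could over-read the final vertex.
        let components = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
        return SIMD3<Float>(components.0, components.1, components.2)
    }
}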

// Construct the vertices, normals and faces sources from the ARMeshGeometry
// buffers, wrap the computed texture coordinates in a source of their own,
// and build a new geometry from them (one way to do this is sketched below),
// then apply the texture.
let scnGeometry = SCNGeometry(sources: [verticesSource, textureCoordinatesSource, normalsSource], elements: [facesSource])

// UIImage(pixelBuffer:) stands in for a small convenience initializer
// (e.g. going through CIImage); for real-time use, wrap the pixel buffer
// in an MTLTexture instead (see the comments below).
let texture = UIImage(pixelBuffer: arFrame.capturedImage)
let imageMaterial = SCNMaterial()
imageMaterial.isDoubleSided = false
imageMaterial.diffuse.contents = texture
scnGeometry.materials = [imageMaterial]
let pcNode = SCNNode(geometry: scnGeometry)

If pcNode is added to your scene, it will contain the mesh with the texture applied.
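
The snippet above leaves verticesSource, normalsSource, facesSource and textureCoordinatesSource undefined; a sketch of one way to build them from the ARMeshGeometry buffers and the computed coordinates follows (names match the code above; ARKit keeps updating these buffers as the mesh refines, so copy the data out if you hold on to the geometry):

import ARKit
import SceneKit

// Vertex and normal sources can wrap the Metal buffers directly.
let verticesSource = SCNGeometrySource(
    buffer: geom.vertices.buffer,
    vertexFormat: geom.vertices.format,
    semantic: .vertex,
    vertexCount: geom.vertices.count,
    dataOffset: geom.vertices.offset,
    dataStride: geom.vertices.stride)

let normalsSource = SCNGeometrySource(
    buffer: geom.normals.buffer,
    vertexFormat: geom.normals.format,
    semantic: .normal,
    vertexCount: geom.normals.count,
    dataOffset: geom.normals.offset,
    dataStride: geom.normals.stride)

// Faces are an ARGeometryElement holding triangle index triples.
let faces = geom.faces
let facesData = Data(
    bytes: faces.buffer.contents(),
    count: faces.count * faces.indexCountPerPrimitive * faces.bytesPerIndex)
let facesSource = SCNGeometryElement(
    data: facesData,
    primitiveType: .triangles,
    primitiveCount: faces.count,
    bytesPerIndex: faces.bytesPerIndex)

// Pack the computed [vector_float2] texture coordinates into a source.
let textureCoordinatesSource = SCNGeometrySource(
    textureCoordinates: textureCoordinates.map { CGPoint(x: CGFloat($0.x), y: CGFloat($0.y)) })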

Texture coordinates computation from here
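
To get back to the original export question: once pcNode holds the textured geometry, one option is to hand the scene to Model I/O and let it write an OBJ. This is only a sketch; whether the diffuse texture is written out alongside the .mtl depends on how the material contents are set up, so you may need to save the camera image separately and reference it from the material file.

import SceneKit
import SceneKit.ModelIO
import ModelIO

let scene = SCNScene()
scene.rootNode.addChildNode(pcNode)

// Bridge the SceneKit scene to Model I/O and export it as OBJ
// (an .mtl file is written next to it).
let asset = MDLAsset(scnScene: scene)
let exportURL = FileManager.default.temporaryDirectory.appendingPathComponent("scan.obj")
do {
    try asset.export(to: exportURL)
} catch {
    print("OBJ export failed: \(error)")
}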

Cheriecherilyn answered 14/5, 2020 at 5:44 Comment(24)
Would you be able to add a bit more of an example of usage, alongside generating a .obj file from the mesh? Perhaps an example project would be useful. – Quirk
Is there any chance you could share a more complete implementation, @Pavan K? This looks promising, but I'm a bit unsure where things like verticesSource, normalsSource, and facesSource are coming from. – Humorous
I have trouble believing this could work in real-time, given that the conversion of the ARFrame's capturedImage from a CVPixelBuffer to a UIImage would have to happen fast enough for it to be rendered as the SCNMaterial. Is this happening after the fact? – Humorous
@Humorous you don't need to convert the CVPixelBuffer to a UIImage; that is an expensive operation and the above code was just an example. imageMaterial.diffuse.contents accepts an MTLTexture, and you can convert the CVPixelBuffer to a texture, which works in real time if you need it (a sketch of that conversion follows these comments). – Cheriecherilyn
That is a great tip, thank you @PavanK. I gave converting the CVPixelBuffer to an MTLTexture a try and definitely see a performance improvement. The actual texture itself does not look right (I'm thinking textures for Y and CbCr need to be created, then somehow mixed together), though; more so, I'm still not sure the verticesSource logic is working as you displayed. Do you have this working successfully? – Humorous
Looking for a real-time example using LiDAR and iPad Pro, does anyone know of one anywhere online? – Tipster
@Tipster Still looking for that as well. I have trouble imagining this could be achieved truly in real-time, but this is an area I'm unfamiliar with (with regard to texture; I've been pretty successful at generating the LiDAR-gathered model in real-time without a texture). Hope you find something! – Humorous
Can anyone provide a git repo with a working solution for this? – Homophonic
How much data is captured here, is it in the MBs or GBs for a small room, for instance? I'm wondering what size iPad to buy to accommodate this task if I were to scan an entire house in detail. Would 128GB be enough or should I get the higher-storage ones? – Uzzi
@PavanK Can you provide a git link for this sample? I am getting the error "Value of type 'ARGeometrySource' has no member 'map'" when I try to use this code. – Acetylate
@yaali you need to iterate through the vertices to calculate the texture coordinates. – Juneberry
@PavanK this produces weird results after implementation. Did you test it, or can you think of anything that is missing here? – Juneberry
@AliAeman Did you try this? What are verticesSource, normalsSource, and facesSource here? – Acetylate
@yaali you need to convert the vertices and normals given by ARMeshAnchor.geometry to SCNGeometrySource, and the faces to SCNGeometryElement, so that they can be passed to SCNGeometry. They are in buffers and can be converted easily. – Juneberry
@AliAeman How do I iterate the vertices to get texture coordinates? I am getting the error "Value of type 'ARGeometrySource' has no member 'map'". – Acetylate
You can iterate it like this: for vertexId in 0..<vertices.count { let vertex = geometry.vertex(at: UInt32(vertexId)) } – Juneberry
Can anyone please post the sample code on git? – Copula
@AliAeman yep, tried it, but how do you create textureCoordinates from that iteration? Can you please post the iteration code? – Acetylate
This code works for one single frame, but how do you stitch the rest of the camera images onto the other meshes? – Deracinate
@yaali did you get the code working? I'm stuck on the vertices.map error. – Giana
Why is it necessary to enter the height value for width in viewportSize? If the code uses projectPoint with .portrait, will a landscape image be used? – Sinistrorse
Why is pt.x divided by height and pt.y divided by width? Intuitively this seems the wrong way round, but it works correctly. – Sinistrorse
@Sinistrorse that's because the camera image is rotated left for portrait in the ARKit world. The image is landscape even though the capture is portrait, so you flip the width and height to correct the values. – Cheriecherilyn
@PavanK Thanks, that is a clear explanation. I have one more question: in ARKit, can I assume the UV coordinate system has (0,0) at the lower left, with U increasing to the right and V increasing upward? I couldn't find any documentation explaining it. – Sinistrorse
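
As mentioned in the comment thread above, here is a minimal sketch of wrapping one plane of the captured CVPixelBuffer in an MTLTexture via a CVMetalTextureCache. The class name and plane handling are illustrative assumptions; ARFrame.capturedImage is biplanar YCbCr, so a complete renderer either converts it to RGB once or combines the luma and chroma planes in a shader.

import Metal
import CoreVideo

// Hypothetical helper: wraps one plane of a CVPixelBuffer as an MTLTexture
// without copying, via a CVMetalTextureCache.
final class CapturedImageTextureProvider {
    private let device: MTLDevice
    private var textureCache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        self.device = device
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)
        if textureCache == nil { return nil }
    }

    // ARFrame.capturedImage is biplanar YCbCr: plane 0 is luma (.r8Unorm),
    // plane 1 is chroma (.rg8Unorm). A real renderer combines both in a shader.
    func texture(from pixelBuffer: CVPixelBuffer,
                 plane: Int,
                 format: MTLPixelFormat) -> MTLTexture? {
        guard let cache = textureCache else { return nil }
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, cache,
                                                  pixelBuffer, nil, format,
                                                  width, height, plane, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }
}

Usage would look like texture(from: arFrame.capturedImage, plane: 0, format: .r8Unorm) for the luma plane. For a one-off export rather than a live preview, converting the captured image once per keyframe through CIImage is usually fast enough.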

Check out my answer over here

It's a description of this project: MetalWorldTextureScan, which demonstrates how to scan your environment and create a textured mesh using ARKit and Metal.

Degraded answered 24/2, 2022 at 19:25 Comment(0)
