Extract face features from ARSCNFaceGeometry
I've been trying without success to extract face features, for instance the mouth, from ARSCNFaceGeometry in order to change their color or add a different material. I understand I need to create an SCNGeometry, for which I have the SCNGeometrySource, but I haven't been able to create the SCNGeometryElement. I've tried creating it from the ARFaceAnchor in update(from faceGeometry: ARFaceGeometry), but so far without luck. Would really appreciate someone's help.

Wormhole answered 14/11, 2017 at 4:53

Comments:
Did you manage to achieve this? If so, can you please post some sample? - Sliver
ARSCNFaceGeometry is a single mesh. If you want different areas of it to be different colors, your best bet is to apply a texture map (which you do in SceneKit by providing images for material property contents).
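For instance, a minimal sketch (the "faceTexture" asset name is a placeholder, and the device setup is the usual Metal boilerplate):

    import ARKit
    import SceneKit
    import UIKit
    import Metal

    // Sketch: texture the whole face mesh with one image. "faceTexture" is a
    // hypothetical asset name; the mesh's built-in UV mapping spreads it across the face.
    if let device = MTLCreateSystemDefaultDevice(),
       let faceGeometry = ARSCNFaceGeometry(device: device) {
        faceGeometry.firstMaterial?.diffuse.contents = UIImage(named: "faceTexture")
        faceGeometry.firstMaterial?.lightingModel = .physicallyBased
        // Assign faceGeometry to the SCNNode you get for the ARFaceAnchor in
        // renderer(_:nodeFor:) and keep calling update(from:) with the anchor's geometry.
    }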

There’s no semantic information associated with the vertices in the mesh — that is, there’s nothing that says “this point is the tip of the nose, these points are the edge of the upper lip, etc”. But the mesh is topologically stable, so if you create a texture image that adds a bit of color around the lips or a lightning bolt over the eye or whatever, it’ll stay there as the face moves around.

If you need help getting started on painting a texture, there are a couple of things you could try:

Create a dummy texture first

Make a square image and fill it with a double gradient, such that the red and blue components of each pixel are based on its x and y coordinates. Or some other distinctive pattern. Apply that texture to the model, and see how it looks — the landmarks in the texture will guide you where to paint.
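A rough sketch of such a debug texture, assuming a simple per-pixel fill (slow, but it makes the mapping obvious; the function name is just for illustration):

    import UIKit

    // Sketch: a square "UV debug" image where red tracks x and blue tracks y.
    // Apply it to the face material and note where each color lands on the face.
    func makeUVDebugTexture(size: Int = 256) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: CGSize(width: size, height: size))
        return renderer.image { context in
            for y in 0..<size {
                for x in 0..<size {
                    UIColor(red: CGFloat(x) / CGFloat(size),
                            green: 0,
                            blue: CGFloat(y) / CGFloat(size),
                            alpha: 1).setFill()
                    context.fill(CGRect(x: x, y: y, width: 1, height: 1))
                }
            }
        }
    }

    // Usage: faceGeometry.firstMaterial?.diffuse.contents = makeUVDebugTexture()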

Export the model

Create a dummy ARSCNFaceGeometry using the init(blendShapes:) initializer and an empty blendShapes dictionary (you don’t need an active ARFaceTracking session for this, but you do need an iPhone X). Use SceneKit’s scene export APIs (or Model I/O) to write that model out to a 3D file of some sort (.scn, which you can process further on the Mac, or something like .obj).
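Roughly, that export step could look like the sketch below (assuming a TrueDepth-capable device; the file name and the .scn format choice are arbitrary, and the neutral pose here comes from an ARFaceGeometry built with an empty blend-shape dictionary):

    import ARKit
    import SceneKit
    import Metal

    // Sketch: build the neutral face mesh and write it out as a .scn file for texture painting.
    func exportNeutralFaceMesh() {
        guard let device = MTLCreateSystemDefaultDevice(),
              let faceGeometry = ARSCNFaceGeometry(device: device) else { return }

        // Drive the mesh to its neutral pose with an empty blend-shape dictionary.
        if let neutralFace = ARFaceGeometry(blendShapes: [:]) {
            faceGeometry.update(from: neutralFace)
        }

        let scene = SCNScene()
        scene.rootNode.addChildNode(SCNNode(geometry: faceGeometry))

        // .scn can be processed further on the Mac; exporting .obj instead goes through Model I/O.
        let url = FileManager.default.temporaryDirectory.appendingPathComponent("faceMesh.scn")
        let ok = scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)
        print("exported:", ok, url)
    }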

Import that file into your favorite 3D modeling tool (Blender, Maya, etc) and use that tool to paint a texture. Then use that texture in your app with real faces.


Actually, the above is sort of an oversimplification, even though it’s the simple answer for common cases. ARSCNFaceGeometry can actually contain up to four submeshes if you create it with the init(device:fillMesh:) initializer. But even then, those parts aren’t semantically labeled areas of the face — they’re the holes in the regular face model, flat fill-ins for the places where eyes and mouth show through.
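For illustration, a sketch of that initializer, assuming (for the sake of the example) that element 0 is the main face surface and the remaining elements are the fill-in patches:

    import ARKit
    import SceneKit
    import UIKit
    import Metal

    // Sketch: with fillMesh the geometry carries extra elements for the eye/mouth holes,
    // and SceneKit pairs materials with geometry elements by index.
    if let device = MTLCreateSystemDefaultDevice(),
       let filledFace = ARSCNFaceGeometry(device: device, fillMesh: true) {
        print("geometry elements:", filledFace.elements.count)   // up to 4

        let skin = SCNMaterial()
        skin.diffuse.contents = UIColor.red
        let fill = SCNMaterial()
        fill.diffuse.contents = UIColor.black

        // Assumption: element 0 is the face surface; the rest are the fill-ins.
        filledFace.materials = [skin] + Array(repeating: fill,
                                              count: max(filledFace.elements.count - 1, 0))
    }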

Openhanded answered 14/11, 2017 at 18:16

Comments:
This does indeed work. I have exported the base face mesh as .obj to Maya. However, my next question goes even further. Now that I have the face model in Maya, I would like to change the UVs of the model, i.e. do a planar projection onto the face (for tattoos, for example) for better, undistorted mapping. How do I then get this new UV set onto the existing ARSCNFaceGeometry? Is there a method in Model I/O for copying UV sets from one model to another? How could this be accomplished? - Misbeliever
I've managed to get the base mesh out using Model I/O export. However, I've not had any success extracting the individual blend shapes yet. Any help/suggestions would be greatly appreciated. - Misbeliever
Hi @GeoffH, did you find any solution to the questions above? - Syndrome
