SceneKit on OS X with thousands of objects

I'm following this tutorial: http://blog.bignerdranch.com/754-scenekit-in-mountain-lion/

I'm interested in using Scene Kit, but my scenes might potentially have thousands of spheres. To stress-test Scene Kit I tried this:

SCNSphere *sphere = [SCNSphere sphereWithRadius:0.5];
for (int i=0; i<10; i++) {
    for(int j=0; j<10; j++){
        for(int k=0; k<10; k++){
            SCNNode *myNode = [SCNNode nodeWithGeometry:sphere];
            myNode.position = SCNVector3Make(i,j,k);
            [root addChildNode:myNode];
        }
    }
}

This works fine for, say, 1000 spheres (10^3) but fails (perhaps unsurprisingly) for 1,000,000 spheres (100^3). I don't mind not being able to use a million spheres, but I'd like to work out what the sensible upper bound is (5,000? 15,000?) and how to increase it.

What can I do to mitigate this? e.g. I've tried sphere.segmentCount = 3 and while that speeds up rendering, it doesn't have much effect on memory usage, which I suspect is the limiting factor.

Also, there doesn't seem to be an SCNPoint class. I was thinking about switching to displaying a simple point when the number of spheres gets too high, but I can't see from the SceneKit documentation how to display a simple point -- the simplest primitive I can find is a triangle.

Any help is much appreciated.

Edit: @toyos suggested merging the SCNSphere objects into a single SCNGeometry object (provided they don't need to be animated independently, which they don't), but I can't find an easy way to do this.

An SCNGeometry is created with [SCNGeometry geometryWithSources:(NSArray *)sources elements:(NSArray *)elements] as documented here, but I'm not clear on how to build an SCNGeometry object from my spheres.

e.g. for a single sphere, I could use sphere.geometryElementCount to get the number of elements, and then populate an array using [sphere geometryElementAtIndex:(NSInteger)elementIndex], which would give me the elements. But I'm not sure how to get the "sources" (or what they even are). The method to get the geometry sources is [sphere geometrySourcesForSemantic:(NSString *)semantic], but what is this semantic string? Is it meant to be "normals" or "vertices", or something else? The documentation rather unhelpfully says the semantic is "The semantic value of the geometry source." without saying what the possible values are.
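
For reference, a rough sketch of those queries on a single sphere, assuming the semantic constants are the SCNGeometrySourceSemantic* strings (e.g. SCNGeometrySourceSemanticVertex, as the answer below confirms):

// Inspect the sources and first element of the shared sphere geometry.
NSArray *vertexSources = [sphere geometrySourcesForSemantic:SCNGeometrySourceSemanticVertex];
SCNGeometrySource *vertexSource = [vertexSources objectAtIndex:0];
SCNGeometryElement *firstElement = [sphere geometryElementAtIndex:0];
NSLog(@"%ld elements, %ld vertices, %ld primitives in element 0",
      (long)sphere.geometryElementCount,
      (long)[vertexSource vectorCount],
      (long)[firstElement primitiveCount]);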

That's just for a single sphere, which would be fairly pointless on its own (SCNSphere is already a subclass of SCNGeometry), so the real task is combining multiple spheres. Would I have to manually translate the vertices of each sphere when adding them to my SCNGeometry object?

I'm just trying to figure out the most sensible way to do this.

Eccrinology answered 5/2, 2013 at 22:44 Comment(7)
I think you're pushing SceneKit beyond its intended purpose. It would be better to draw 1,000,000 spheres using OpenGL, because then you wouldn't need to store every sphere in memory; you would just draw the spheres to the screen at render time.Vardon
I think drawing 1000k spheres with GL or SceneKit would be equivalent here - in both cases only one sphere exists in memory (note that the "sphere" object is shared by the 1000k nodes). I think the problem is the number of separate draw calls (one per sphere). If many of your spheres don't need to be animated separately, you can try to merge them into a single geometry (using the SCNGeometry APIs).Caerleon
@Vardon agreed in principle, but most of the time I won't need anywhere near 1,000,000 spheres and SceneKit saves a lot of time compared to OpenGL (for me at least)Eccrinology
@Caerleon thanks -- will look that up. I didn't realise you could merge them into a single geometry. That sounds like it would really help, especially since my spheres don't need to be animated separately.Eccrinology
Any guidance on using SCNGeometry? There don't seem to be any examples in the official docs. I found this cleoag.ru/2013/01/17/how-create-geometry-scenekit but that would suggest you need to manually define your vertices etc. Is there no way to automatically "import" my spheres into a SCNGeometry object?Eccrinology
Your spheres are geometries, so there's no need to import anything. For instance, to get the vertices: [mySphere geometrySourcesForSemantic:SCNGeometrySourceSemanticVertex]. The difficult part is the merge: you'll have to merge the vertex arrays (applying each node's translation to its vertices), merge the element arrays (offsetting the indices), and finally create a new geometry from the merged vertices/normals/elements.Caerleon
@Caerleon that's actually really helpful, thank you. Will give it a go.Eccrinology

The semantic strings are SCNGeometrySourceSemanticVertex|Normal|Texcoord ...

For multiple spheres the answer is yes, you have to transform the vertices/normals with the current node transform before flattening.

Below is a simplified example (i.e. it only supports merging the children of "input", and only if they all share the same geometry):

- (SCNNode *) flattenNodeHierarchy:(SCNNode *) input
{
    SCNNode *result = [SCNNode node];

    NSUInteger nodeCount = [[input childNodes] count];
    if(nodeCount > 0){
        SCNNode *node = [[input childNodes] objectAtIndex:0];

        NSArray *vertexArray = [node.geometry geometrySourcesForSemantic:SCNGeometrySourceSemanticVertex];
        SCNGeometrySource *vertex = [vertexArray objectAtIndex:0];

        SCNGeometryElement *element = [node.geometry geometryElementAtIndex:0]; //todo: support multiple elements
        NSUInteger primitiveCount = element.primitiveCount;
        NSUInteger newPrimitiveCount = primitiveCount * nodeCount;
        size_t elementBufferLength = newPrimitiveCount * 3 * sizeof(int); //nTriangle x 3 vertex * size of int
        int* elementBuffer = (int*)malloc(elementBufferLength);

        /* simple case: here we consider that all the objects to flatten are the same
         In the regular case we should iterate on every geometry and accumulate the number of vertex/triangles etc...*/

        NSUInteger vertexCount = [vertex vectorCount];
        NSUInteger newVertexCount = vertexCount * nodeCount;

        SCNVector3 *newVertex = malloc(sizeof(SCNVector3) * newVertexCount);        
        SCNVector3 *newNormal = malloc(sizeof(SCNVector3) * newVertexCount); //assume same number of normal/vertex

        //fill
        NSUInteger vertexFillIndex = 0;
        NSUInteger primitiveFillIndex = 0;
        for(NSUInteger index=0; index< nodeCount; index++){
            NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

            node = [[input childNodes] objectAtIndex:index];

            NSArray *vertexArray = [node.geometry geometrySourcesForSemantic:SCNGeometrySourceSemanticVertex];
            NSArray *normalArray = [node.geometry geometrySourcesForSemantic:SCNGeometrySourceSemanticNormal];
            SCNGeometrySource *vertex = [vertexArray objectAtIndex:0];
            SCNGeometrySource *normals = [normalArray objectAtIndex:0];

            if([vertex bytesPerComponent] != sizeof(float)){
                NSLog(@"todo: support other byte per component");
                continue;
            }

            float *vertexBuffer = (float *)[[vertex data] bytes];
            float *normalBuffer = (float *)[[normals data] bytes];

            CATransform3D t = [node transform];
            GLKMatrix4 matrix = MyGLKMatrix4FromCATransform3D(t);

            //append source
            for(NSUInteger vIndex = 0; vIndex < vertexCount; vIndex++, vertexFillIndex++){
                GLKVector3 v = GLKVector3Make(vertexBuffer[vIndex * 3], vertexBuffer[vIndex * 3+1], vertexBuffer[vIndex * 3 + 2]);
                GLKVector3 n = GLKVector3Make(normalBuffer[vIndex * 3], normalBuffer[vIndex * 3+1], normalBuffer[vIndex * 3 + 2]);

                //transform
                v = GLKMatrix4MultiplyVector3WithTranslation(matrix, v);
                n = GLKMatrix4MultiplyVector3(matrix, n);

                newVertex[vertexFillIndex] = SCNVector3Make(v.x, v.y, v.z);
                newNormal[vertexFillIndex] = SCNVector3Make(n.x, n.y, n.z);
            }

            //append elements
            //here we assume that all elements are SCNGeometryPrimitiveTypeTriangles
            SCNGeometryElement *element = [node.geometry geometryElementAtIndex:0];
            const void *inputPrimitive = [element.data bytes];
            size_t bpi = element.bytesPerIndex;

            NSUInteger offset = index * vertexCount;

            for(NSUInteger pIndex = 0; pIndex < primitiveCount; pIndex++, primitiveFillIndex+=3){                
                elementBuffer[primitiveFillIndex] = offset + _getIndex(inputPrimitive, bpi, pIndex*3);
                elementBuffer[primitiveFillIndex+1] = offset + _getIndex(inputPrimitive, bpi, pIndex*3+1);
                elementBuffer[primitiveFillIndex+2] = offset + _getIndex(inputPrimitive, bpi, pIndex*3+2);
            }

            [pool drain];
        }

        NSArray *sources = @[[SCNGeometrySource geometrySourceWithVertices:newVertex count:newVertexCount],
                             [SCNGeometrySource geometrySourceWithNormals:newNormal count:newVertexCount]];

        NSData *newElementData = [NSMutableData dataWithBytesNoCopy:elementBuffer length:elementBufferLength freeWhenDone:YES];
        NSArray *elements = @[[SCNGeometryElement geometryElementWithData:newElementData
                                                            primitiveType:SCNGeometryPrimitiveTypeTriangles
                                                           primitiveCount:newPrimitiveCount bytesPerIndex:sizeof(int)]];

        result.geometry = [SCNGeometry geometryWithSources:sources elements:elements];

        //cleanup
        free(newVertex);
        free(newNormal);
    }

    return result;
}

//helpers:
GLKMatrix4 MyGLKMatrix4FromCATransform3D(CATransform3D transform) {
    GLKMatrix4 m = {{transform.m11, transform.m12, transform.m13, transform.m14,
        transform.m21, transform.m22, transform.m23, transform.m24,
        transform.m31, transform.m32, transform.m33, transform.m34,
        transform.m41, transform.m42, transform.m43, transform.m44}};
    return m;
}



GLKVector3 MySCNVector3ToGLKVector3(SCNVector3 vector) {
    GLKVector3 v = {{vector.x, vector.y, vector.z}};
    return v;
}
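
The _getIndex helper used above isn't defined in the original answer; presumably it just reads one index of bytesPerIndex bytes from the element buffer. A possible sketch:

static int _getIndex(const void *buffer, size_t bytesPerIndex, NSUInteger i)
{
    // Assumption: indices are either 16-bit or 32-bit, matching the element's bytesPerIndex.
    if (bytesPerIndex == sizeof(unsigned short)) {
        return ((const unsigned short *)buffer)[i];
    }
    return (int)((const unsigned int *)buffer)[i];
}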
Caerleon answered 14/2, 2013 at 14:26 Comment(1)
Thanks for taking the time to do this. Looks like a great solution! Will be testing it now :) and making sure I understand it line-by-line.Eccrinology

How best to do this depends on exactly what you're looking to accomplish.

Are these thousands of points (a star field backdrop for an outer space scene, perhaps) static, or do they need to move with respect to each other? Do they actually need to be spheres? How much detail do they need?

If they don't need to move independently, merging them into a single geometry is a good idea. On Mavericks (OS X 10.9) you don't need to mess with geometry data yourself to do that: create a node for each one, parent them all to a single node (not your scene's root node), and call flattenedClone to get a copy of that node whose geometries are combined.
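
A minimal sketch of that approach, reusing the loop from the question (scene stands in for your own SCNScene):

SCNNode *container = [SCNNode node];
SCNSphere *sphere = [SCNSphere sphereWithRadius:0.5];
for (int i = 0; i < 10; i++) {
    for (int j = 0; j < 10; j++) {
        for (int k = 0; k < 10; k++) {
            SCNNode *node = [SCNNode nodeWithGeometry:sphere];
            node.position = SCNVector3Make(i, j, k);
            [container addChildNode:node];
        }
    }
}
// One node with one combined geometry instead of 1000 separate draw calls.
SCNNode *merged = [container flattenedClone];
[scene.rootNode addChildNode:merged];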

If they don't need to have much detail, there are a few options for improving performance.

One is to reduce the segmentCount of the sphere geometry — you don't need 5000 triangles to draw a sphere that'll only be a couple of pixels wide when rendered, which is about what you get with the default segment count of 48. (If you're going to mess with the geometry data or flatten nodes immediately after reducing the segment count, be sure to call [SCNTransaction flush] to make sure it gets updated.)
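
For example, assuming sphere is the shared SCNSphere from the question:

sphere.segmentCount = 8;   // far coarser than the default, but fine for pixel-sized spheres
[SCNTransaction flush];    // commit the reduced geometry before flattening or reading its data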

Another is to reduce the triangle count further. Are the stars (or whatever) small enough that they even need to be spheres? If your scene can be set up so they're always oriented toward the camera, SCNPlane might be better — with its minimum segment count it's just two triangles.

Do they even need to be triangles? Scene Kit can render points. There isn't an SCNGeometry subclass for them because it's generally not useful to position and transform single points independently, but you can create a custom geometry using an array of vertex positions and the SCNGeometryPrimitiveTypePoint geometry element type. And if you want to customize how the points are rendered, you can attach shaders (or shader modifiers) to that geometry.
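
A hedged sketch of such a point geometry; points and pointCount are placeholders for your own vertex data:

int pointCount = 1000;                                            // assumption: however many points you have
SCNVector3 *points = malloc(pointCount * sizeof(SCNVector3));     // fill with your own positions
int *indices = malloc(pointCount * sizeof(int));
for (int i = 0; i < pointCount; i++) {
    indices[i] = i;                                               // one index per point
}
SCNGeometrySource *source = [SCNGeometrySource geometrySourceWithVertices:points count:pointCount];
NSData *indexData = [NSData dataWithBytesNoCopy:indices
                                         length:pointCount * sizeof(int)
                                   freeWhenDone:YES];
SCNGeometryElement *element = [SCNGeometryElement geometryElementWithData:indexData
                                                            primitiveType:SCNGeometryPrimitiveTypePoint
                                                           primitiveCount:pointCount
                                                            bytesPerIndex:sizeof(int)];
SCNGeometry *pointCloud = [SCNGeometry geometryWithSources:@[source] elements:@[element]];
SCNNode *pointNode = [SCNNode nodeWithGeometry:pointCloud];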

Haloid answered 5/3, 2014 at 19:30 Comment(2)
Thanks for the input – I tried flattenedClone recently, but it only reduced memory usage for very large numbers of objects and didn't seem to affect fps much; perhaps that was because I needed to call flush first, or there was some other issue. I plan on revisiting this question soon. I think the segmentCount you suggested could really help, as could the new levelOfDetail option...Eccrinology
Is this the right way to draw only a point? SCNVector3 point = SCNVector3Make(0, 0, 0); SCNGeometrySource *src = [SCNGeometrySource geometrySourceWithVertices:&point count:1]; SCNGeometryElement *element = [SCNGeometryElement geometryElementWithData:nil primitiveType:SCNGeometryPrimitiveTypePoint primitiveCount:1 bytesPerIndex:8]; SCNGeometry *ge = [SCNGeometry geometryWithSources:@[src] elements:@[element]];Rufous
