I have a C++ DirectX 11 renderer that I have been writing.
I've written a COLLADA 1.4.1 loader to import the data needed for skeletal animation.
I'm validating the loader at this point (I've supported COLLADA before, in another renderer built on different technology) and I'm running into a problem matching COLLADA's data up with DX10/11.
I have 3 separate vertex buffers of data:
A vertex buffer of unique vertex positions. A vertex buffer of unique normals. A vertex buffer of unique texture coordinates.
These buffers are of different lengths (positions has 2910 elements, normals more than 9000, and texture coordinates roughly 3200).
COLLADA provides a triangle list that gives me the indices into each of these arrays for a given triangle (verbose and odd at first, but it becomes simple once you've worked with it).
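For context, this is roughly how I'm unpacking the per-point index triplets from the COLLADA `<p>` list (a sketch with names of my own choosing; the real loader reads the offsets from the `<input>` elements rather than assuming 0, 1, 2):

```cpp
#include <cstddef>
#include <vector>

// One triangle corner as COLLADA describes it: a separate index
// into each source array (positions, normals, texture coordinates).
struct PointIndices {
    int position;
    int normal;
    int texCoord;
};

// The <p> element of a <triangles> block is one flat list of integers.
// With three <input> semantics (offsets 0, 1, 2), each corner consumes
// three consecutive values, so a whole triangle consumes nine.
std::vector<PointIndices> UnpackTriangleIndices(const std::vector<int>& p)
{
    std::vector<PointIndices> points;
    points.reserve(p.size() / 3);
    for (std::size_t i = 0; i + 2 < p.size(); i += 3) {
        points.push_back({ p[i + 0], p[i + 1], p[i + 2] });
    }
    return points;
}
```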
Knowing that DX10/11 supports multiple vertex buffers, I figured I would fill the DX10/11 index buffer with indices into each of these buffers *and* (this is the important part) these indices could be different for a given point of a triangle.
In other words, I could set the three vertex buffers, set the correct input layout, and then in the index buffer I would put the equivalent of:
l_aIndexBufferData[ NumberOfTriangles * 9 ]
for( i = 0; i < NumberOfTriangles; i++ )
{
    l_aIndexBufferData.add( triangle[i].Point1.PositionIndex )
    l_aIndexBufferData.add( triangle[i].Point1.NormalIndex )
    l_aIndexBufferData.add( triangle[i].Point1.TextureCoordinateIndex )
    ...and the same for Point2 and Point3
}
The DirectX documentation on using multiple vertex buffers doesn't seem to say anything about how they interact with the index buffer (more on this later).
Running the code that way yielded strange results: the mesh was drawn only intermittently correctly (strange polygons, but about a third of the points were in the correct place - hint, hint).
I figured I'd screwed up my data or my indices at that point (yesterday), so I painstakingly validated it all; then I figured I was screwing up my input layout or something else. I ruled that out by using the values from the normal and texture coordinate buffers to drive the color output of the pixel shader instead; the colors were correct, so I wasn't suffering a padding issue.
Ultimately I came to the conclusion that DX10/11 must expect the data ordered in a different fashion, so I tried storing the indices this way:
indices.add( Point1.PositionIndex )
indices.add( Point2.PositionIndex )
indices.add( Point3.PositionIndex )
indices.add( Point1.NormalIndex )
indices.add( Point2.NormalIndex )
indices.add( Point3.NormalIndex )
indices.add( Point1.TexCoordIndex )
indices.add( Point2.TexCoordIndex )
indices.add( Point3.TexCoordIndex )
Oddly enough, this also yielded a rendered mesh that looked about 1/3 correct - hint, hint.
I then surmised that maybe DX10/DX11 wanted the indices stored 'by vertex buffer' meaning that I would add all the position indices for all the triangles first, then all the normal indices for all the triangles, then all the texture coordinate indices for all the triangles.
This yielded another 1/3 correct (looking) mesh.
This made me think - well, surely DX10/11 wouldn't provide you with the ability to stream from multiple vertex buffers and then actually expect only one index per triangle point?
Only including indices into the vertex buffer of positions yields a properly rendered mesh that unfortunately uses the wrong normals and texture coordinates.
It appears that putting the normal and texture coordinate indices into the index buffer caused erroneous drawing over the properly rendered mesh.
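My current theory of what the hardware is doing, sketched in plain C++ (this is just my mental model, not anything I found in the docs): each index in the index buffer advances all bound streams together, so a single index fetches the position, normal, and texture coordinate from the same slot in every buffer:

```cpp
#include <vector>

struct Float3 { float x, y, z; };
struct Float2 { float u, v; };

struct AssembledVertex {
    Float3 position;
    Float3 normal;
    Float2 texCoord;
};

// Mental model of input assembly with three bound streams:
// one index per vertex, applied to every stream at once.
std::vector<AssembledVertex> AssembleVertices(
    const std::vector<int>&    indices,
    const std::vector<Float3>& positions,
    const std::vector<Float3>& normals,
    const std::vector<Float2>& texCoords)
{
    std::vector<AssembledVertex> out;
    out.reserve(indices.size());
    for (int i : indices) {
        // The same i is used for all three buffers; there is no way
        // to give the position stream a different index than the
        // normal stream for the same vertex.
        out.push_back({ positions[i], normals[i], texCoords[i] });
    }
    return out;
}
```

If that's what's happening, my interleaved index buffer would be interpreted as three times as many vertices, each pulling from all three streams with the wrong indices two-thirds of the time - which would explain the 1/3-correct meshes.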
Is this the expected behavior?
Multiple vertex buffers, one index buffer, and that index buffer can only hold a single index per point of a triangle?
That really just doesn't make sense to me.
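If that really is the case, I assume I'd have to expand the COLLADA triplets myself, welding each unique (position, normal, texcoord) index combination into one unified vertex with one shared index - something like this sketch (the names are mine, not from my actual code):

```cpp
#include <cstdint>
#include <map>
#include <tuple>
#include <vector>

// A COLLADA corner: one index per source array.
struct PointIndices { int position, normal, texCoord; };

// Weld: every unique (position, normal, texCoord) combination becomes
// one output vertex, and the index buffer gets one shared index per corner.
struct WeldResult {
    std::vector<PointIndices>  uniqueVertices; // fetch the actual data from the three source arrays with these
    std::vector<std::uint32_t> indexBuffer;    // one index per triangle corner
};

WeldResult WeldVertices(const std::vector<PointIndices>& corners)
{
    WeldResult result;
    std::map<std::tuple<int, int, int>, std::uint32_t> seen;
    for (const PointIndices& c : corners) {
        auto key = std::make_tuple(c.position, c.normal, c.texCoord);
        auto it = seen.find(key);
        if (it == seen.end()) {
            auto newIndex =
                static_cast<std::uint32_t>(result.uniqueVertices.size());
            seen.emplace(key, newIndex);
            result.uniqueVertices.push_back(c);
            result.indexBuffer.push_back(newIndex);
        } else {
            result.indexBuffer.push_back(it->second);
        }
    }
    return result;
}
```

The three vertex buffers could presumably stay separate in this scheme; they'd just all be reordered to the same length so that a single index works for every stream.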
Help!