Retrieve Vertex Data in THREE.js
I'm creating a mesh with a custom shader. Within the vertex shader I'm modifying the original positions of the geometry's vertices. I then need to access these new vertex positions from outside the shader. How can I accomplish this?

Sully answered 14/3, 2015 at 20:33 Comment(0)

In lieu of transform feedback (which WebGL 1.0 does not support), you will have to use a passthrough fragment shader and a floating-point texture (this requires the OES_texture_float extension). That is the only way to generate a vertex buffer on the GPU in WebGL 1.0. WebGL does not support pixel buffer objects either, so reading the output data back is going to be very inefficient.

Nevertheless, here is how you can accomplish this:

This will be a rough overview focusing on OpenGL rather than anything Three.js specific.

First, encode your vertex array this way (add a 4th component for index):

Vec4 pos_idx  :  xyz = Vertex Position,  w = Vertex Index (0.0 through NumVerts-1.0)

Storing the vertex index as the w component is necessary because OpenGL ES 2.0 (WebGL 1.0) does not support gl_VertexID.
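A minimal CPU-side sketch of that encoding (the helper name `packPositionsWithIndex` is hypothetical, not part of any API):

```javascript
// Sketch: pack each vertex as (x, y, z, index) into a Float32Array suitable
// for uploading as the `pos_idx` attribute. `positions` is assumed to be a
// flat [x0, y0, z0, x1, y1, z1, ...] array.
function packPositionsWithIndex(positions) {
  const numVerts = positions.length / 3;
  const posIdx = new Float32Array(numVerts * 4);
  for (let i = 0; i < numVerts; i++) {
    posIdx[i * 4 + 0] = positions[i * 3 + 0]; // x
    posIdx[i * 4 + 1] = positions[i * 3 + 1]; // y
    posIdx[i * 4 + 2] = positions[i * 3 + 2]; // z
    posIdx[i * 4 + 3] = i;                    // w = vertex index
  }
  return posIdx;
}
```

Float32 can represent integers exactly up to 2^24, so the index survives the round trip for any realistic vertex count.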

Next, you need a 2D floating-point texture:

MaxTexSize = Query GL_MAX_TEXTURE_SIZE

Width  = MaxTexSize;
Height = max (ceil (NumVerts / MaxTexSize), 1);

Create an RGBA floating-point texture with those dimensions and use it as FBO color attachment 0.
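The sizing above can be sketched in JavaScript (here `maxTexSize` stands in for the value you would query with `gl.getParameter(gl.MAX_TEXTURE_SIZE)`):

```javascript
// Sketch: choose floating-point texture dimensions large enough to give
// each of `numVerts` vertices its own texel, using round-up division.
function computeTexSize(numVerts, maxTexSize) {
  const width = maxTexSize;
  const height = Math.max(Math.ceil(numVerts / maxTexSize), 1);
  return { width, height };
}
```

Note that the last row may be only partially used; the index stored in the w component lets you ignore the unused texels when reading back.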

Vertex Shader:

#version 100

attribute vec4 pos_idx;

uniform int width;  // Width of floating-point texture
uniform int height; // Height of floating-point texture

varying vec4 vtx_out;

void main (void)
{
  float idx = pos_idx.w;

  // Position this vertex so that it occupies a unique pixel
  // GLSL ES 1.00 has no integer `%` operator, so use mod(); the +0.5 offset
  // targets the texel center so each point rasterizes into its own pixel
  float col = mod (idx, float (width));
  float row = floor (idx / float (width));
  vec2 xy_idx = (vec2 (col, row) + vec2 (0.5)) / vec2 (float (width), float (height))
              * 2.0 - 1.0;
  gl_Position  = vec4 (xy_idx, 0.0, 1.0);
  gl_PointSize = 1.0;


  //
  // Do all of your per-vertex calculations here, and output to vtx_out.xyz
  //


  // Store the index in the W component
  vtx_out.w = idx;
}
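The shader's index-to-texel mapping can be mirrored on the CPU, which is handy for verifying that every vertex index lands on a distinct texel (the helper name is illustrative only):

```javascript
// Sketch: CPU-side mirror of the shader's row-major index -> texel mapping.
function indexToTexel(idx, width) {
  const col = idx % width;
  const row = Math.floor(idx / width);
  return { col, row };
}
```

You will need the same mapping again when decoding the read-back pixels, or when fetching vertices from the texture in a later render pass.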

Passthrough Fragment Shader:

#version 100

// ESSL 1.00 fragment shaders have no default float precision
precision highp float;

varying vec4 vtx_out;

void main (void)
{
  gl_FragData [0] = vtx_out;
}

Draw and Read Back:

// Draw your entire vertex array for processing (as `GL_POINTS`)
glDrawArrays (GL_POINTS, 0, NumVerts);

// With the FBO still bound, read its color attachment 0 back into an array
// `verts` (WebGL has no glGetTexImage, so glReadPixels is the only option)
glReadPixels (0, 0, Width, Height, GL_RGBA, GL_FLOAT, verts);
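Once the pixels are read back (in WebGL, via `gl.readPixels` into a `Float32Array`), each texel holds `(x, y, z, originalVertexIndex)`, so the positions can be scattered back into vertex order. A sketch of the decode step (helper name is hypothetical):

```javascript
// Sketch: decode the RGBA float read-back into per-vertex positions.
// `pixels` is the flat RGBA Float32Array filled by readPixels; the w
// component of each texel carries the original vertex index.
function decodeVertexTexture(pixels, numVerts) {
  const positions = new Float32Array(numVerts * 3);
  for (let t = 0; t < numVerts; t++) {
    const idx = Math.round(pixels[t * 4 + 3]); // w = original vertex index
    positions[idx * 3 + 0] = pixels[t * 4 + 0];
    positions[idx * 3 + 1] = pixels[t * 4 + 1];
    positions[idx * 3 + 2] = pixels[t * 4 + 2];
  }
  return positions;
}
```

Because the mapping is row-major, texel order already matches index order, but using the stored index makes the decode robust even if the padding texels at the end of the texture are skipped or reordered.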
Selimah answered 14/3, 2015 at 22:11 Comment(5)
Some code in Three.js that does roughly what you want can be found here. It differs slightly in that it keeps the vertices on the GPU at all times, but if you study this answer together with that sample code, you should be able to throw something together.Selimah
Thanks a lot for such a detailed explanation! So, if I understand correctly, the only way is to use the fragment shader to write the vertex positions into a texture, and then translate the RGBA texels back into positions, right?Sully
@MickeyMouse: Yes, however... if you do not need these positions on the CPU side of things, you can actually do a texture lookup in the vertex shader of another render pass. If you examine the source code for the example I listed, it shows how to do that. I was not sure if you needed the vertices off the GPU or not, and the answer I described was to download the vertex data from the GPU.Selimah
Yes, I guess I need them on the CPU. I'm trying to apply a Gaussian blur to the vertex position (height) values, not to colors. To optimize the process, I'm carrying out the task in two passes, horizontal and vertical (gamerendering.com/2008/10/11/gaussian-blur-filter-shader). Perhaps for these two passes I don't need them on the CPU (?). But there is a third pass to normalize the values, which requires iterating over all vertices, so I figure I need them on the CPU for that last step.Sully
@MickeyMouse: I'm really at a loss as to how you can apply a blur filter to vertex data. You could probably get a point cloud back, but the way I'm picturing it in my head you would lose structures like triangles when you insert the additional vertices blurring would create. In any case, if you have your vertex positions stored in a texture, you can do multiple fetches in a fragment shader (you will have to use the texture coordinate indexing logic from the Three.js example I linked earlier to fetch the vertices from the right texel) and probably implement the entire thing on the GPU.Selimah
