Coloring pixels by distance from vertices of triangle

If each vertex of a triangle mesh has a vertex colour, how do you colour a pixel using the vertex color of the nearest vertex?

Linear interpolation: If I set ALBEDO.rgb = COLOR.rgb in my fragment function, then pixels are colored with a linear interpolation between the vertex colors.

Flat coloring: If instead my vertex function stores COLOR.rgb in a varying flat vec3, then all pixels in a triangle are colored the same, based on the vertex color of one of the three vertices... though I'm not sure how it decides which one.

What I'm trying to achieve: Ideally I want to choose a color per-pixel based on the three vertex colors and the pixel's distance from each of those vertices. The pixels closest to a vertex will all be colored that vertex's color. Halfway between two vertices I want the color to swap from one vertex color to the other. At the midpoint of the triangle there will therefore be a boundary where the three solid colors meet. Later I intend to add some noise to the distance function so that the color boundary is wobbly.

Effectively, I want to reproduce the effect described here: https://www.redblobgames.com/x/1730-terrain-shader-experiments/noisy-hex-rendering.html

But my biggest confusion is that I assumed the fragment function (when run for each pixel) would know about each of the surrounding three vertices. It seems to have access to a vertex, but I can't even tell how it's deciding which one.

Can anyone point me in the right direction?

Stevestevedore answered 16/9, 2021 at 18:14 Comment(0)

I think part of my problem is that Godot does not have custom shader attributes, which is problematic for passing barycentric coordinates to the pixel shader.

From reading the source code of the blog post above (which is in WebGL, so I find it a little confusing) and a Stack Overflow answer about accessing barycentric coordinates in GLSL, I think the GLSL process is this (sketched in code after the list):

  • For each of the three vertices in the triangle, pass a barycentric extreme (1,0,0), (0,1,0), or (0,0,1) into the vertex shader via a custom attribute
  • In the vertex shader, write the vertex's barycentric coordinate to a varying so it is exposed to the fragment shader
  • In the fragment shader, GLSL will have interpolated the varying barycentric coordinates vector based on the pixel's position, giving three scalars representing the pixel's barycentric coordinates
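
Here's my understanding of those steps as plain WebGL-style GLSL (the attribute, uniform, and varying names are placeholders I've made up, not taken from the blog post):

// Vertex shader: pass this vertex's barycentric extreme through unchanged.
attribute vec3 a_position;
attribute vec3 a_barycentric; // (1,0,0), (0,1,0) or (0,0,1) per vertex
uniform mat4 u_mvp;
varying vec3 v_barycentric;

void main() {
	v_barycentric = a_barycentric;
	gl_Position = u_mvp * vec4(a_position, 1.0);
}

// Fragment shader: v_barycentric has been interpolated per pixel, so its
// three components are this pixel's barycentric coordinates (summing to ~1).
precision mediump float;
varying vec3 v_barycentric;

void main() {
	gl_FragColor = vec4(v_barycentric, 1.0); // visualise the coordinates
}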

But I still have a few things I don't understand:

  1. Without custom attributes, might another built-in attribute be usable to pass in the vec3 barycentric value for each vertex?
  2. If a vertex is shared by two triangles, would I need to pass in different barycentric coordinates to the same vertex with respect to two separate triangles?
  3. Once I have a pixel's barycentric coordinates, I still need independent access to the individual vertex colors of all three verts in order to create a custom mix/interpolation... and I still don't understand how those end up in the fragment shader as three separate varying variables.

The RedBlobGames blog actually has another page that shows lots of cool effects achieved by working with barycentric coordinates in the fragment shader: https://www.redblobgames.com/x/1730-terrain-shader-experiments/index.html

If only I could figure out how he's getting them in there!

Stevestevedore answered 18/9, 2021 at 19:50 Comment(0)

I think part of my problem is that Godot does not have custom shader attributes, which is problematic for passing barycentric coordinates to the pixel shader.

Custom vertex attributes are being added in the upcoming Godot 4.0, but this can't be backported to 3.x.
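
For reference, a rough sketch of how that might look in 4.0 with SurfaceTool's custom channels (based on the in-development API, so treat the exact method and constant names as assumptions):

var st = SurfaceTool.new()
st.begin(Mesh.PRIMITIVE_TRIANGLES)
# Declare custom channel 0 as four floats, before adding any vertices.
st.set_custom_format(0, SurfaceTool.CUSTOM_RGBA_FLOAT)
# Per vertex: stash the barycentric extreme in the custom channel...
st.set_custom(0, Color(1.0, 0.0, 0.0, 0.0))
st.add_vertex(Vector3(-1, 0, 0))
# ...and in the shader, vertex() can read it back as CUSTOM0.rgb and copy
# it into a varying for the fragment function.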

Poem answered 18/9, 2021 at 20:50 Comment(0)

I used barycentric coordinates for a wireframe shader. You do need to pass them in from the CPU at some point, e.g. by processing the model with the MeshDataTool (rough sketch below). You don't necessarily need custom attributes. In my case I think I stored them in the color channel, since I wasn't using vertex colors anyhow. If you need the colors, it might be a little more complicated, for example using the normal to store the coordinates (but then you will lose lighting).
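
Roughly something like this (a sketch from memory, assuming the mesh is an ArrayMesh and that no vertices are shared between faces, so each corner can safely get its own extreme):

var mdt = MeshDataTool.new()
mdt.create_from_surface(mesh, 0)
# The three "barycentric extremes", packed into the color channel.
var extremes = [Color(1, 0, 0), Color(0, 1, 0), Color(0, 0, 1)]
for face in range(mdt.get_face_count()):
	for corner in range(3):
		var v = mdt.get_face_vertex(face, corner)
		mdt.set_vertex_color(v, extremes[corner])
mesh.surface_remove(0)
mdt.commit_to_surface(mesh)

The vertex shader then just copies COLOR.rgb into a varying and the fragment shader reads the interpolated value.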

Butyrin answered 19/9, 2021 at 0:57 Comment(0)

Thanks both!

After reading this StackOverflow question about problems caused by shared vertices I realised that I did need to assign a single 'barycentric extreme' to each mesh vertex, which runs the risk of a triangle having two of the same barycentric extremes... or so I thought....

My mesh is an approximate sphere, created through triangular subdivision of an octahedron. After playing around with it I realised that each vertex can be given one of the three 'barycentric extremes' in a pattern that ensures no two adjacent verts are given the same one. Hooray! So I'm going to do this as a pre-processing step when the mesh verts are procedurally defined at the start of the program.

I think I will be able to use the UV2 channel to pass the data in, even though it is only two-dimensional:

  • UV (1, 0) indicates barycentric coordinate (1, 0, 0)
  • UV (0, 1) indicates (0, 1, 0)
  • UV (0, 0) indicates (0, 0, 1)

So the vertex shader will check which case UV2 holds and write a vec3 to a varying accordingly.

That means I don't have to lose vertex colors like in your wireframe case, @cybereality.

But my main problem now is this: the vertex shader needs to write 3 colors to varyings, i.e. each of the three vertex colors. But I haven't figured out how to get this information into the vertex shader.

If I was to try to pass it in through a built-in attribute then I would need each vertex to be given the vertex colors for each triangle that vertex is part of. In my mesh some verts are part of six triangles forming a hexagon, so each vertex would need to know about six vertex colors and know which triangle they correspond to.

Unless there's a way for the fragment function to access the non-interpolated color of each of the three vertices.

Stevestevedore answered 19/9, 2021 at 1:38 Comment(0)

Oh wait: if my vertex function is already distinguishing between the three vertices by the UV2 encoding that I described, then I could have three vec3 color varyings, and the UV2 encoding can also tell it which of the three color varyings it should write its own vertex color to.

e.g. if the vertex function sees a UV2 value of (0, 0), then it knows the vertex's barycentric coordinate is (0, 0, 1) and it should write its vertex color to the THIRD color varying.

My only concern is this: if it leaves the other two blank, will they default to black, i.e. vec3(0, 0, 0), and will the fragment function then interpolate those three color varyings and produce values that are a blend between the intended vertex color and the black defaults?

It seems like this just all depends on how default interpolation is implemented.

Stevestevedore answered 19/9, 2021 at 1:46 Comment(0)

I think what you want to do is have the barycentric coordinate value as a varying, and not do anything in the vertex shader except set it. Then in the pixel shader, you can do the actual math. If the whole mesh uses the same 3 colors, then just pass those in as shader variables from GDScript. You then use the barycentric coordinate (which has been interpolated, which is what you want) to determine which color in the array to choose. You can also pass the colors in as a small texture (like 3 pixels by 1 pixel) and sample from the texture. Not sure which way would be easier or faster, though if you had a lot of colors it may make sense to use a texture.
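
For example, with three separate uniforms rather than an array, it could look something like this (a rough sketch with made-up names; set the uniforms from GDScript with set_shader_param()):

shader_type spatial;

uniform vec3 color_a = vec3(1.0, 0.0, 0.0);
uniform vec3 color_b = vec3(0.0, 1.0, 0.0);
uniform vec3 color_c = vec3(0.0, 0.0, 1.0);

// Barycentric coordinate, decoded from your UV2 encoding.
varying vec3 bc;

void vertex() {
	if (UV2.x > 0.5) {
		bc = vec3(1.0, 0.0, 0.0);
	} else if (UV2.y > 0.5) {
		bc = vec3(0.0, 1.0, 0.0);
	} else {
		bc = vec3(0.0, 0.0, 1.0);
	}
}

void fragment() {
	// Nearest vertex = largest interpolated barycentric component.
	if (bc.x >= bc.y && bc.x >= bc.z) {
		ALBEDO = color_a;
	} else if (bc.y >= bc.z) {
		ALBEDO = color_b;
	} else {
		ALBEDO = color_c;
	}
}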

Butyrin answered 19/9, 2021 at 7:24 Comment(0)

This produced a pretty interesting result:

shader_type spatial;

varying vec3 color_one;
varying vec3 color_two;
varying vec3 color_three;

varying vec3 barycentric_coordinates;

void vertex() {
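	// Decode this vertex's barycentric extreme from UV2 and write its color
	// to the matching varying (the other two color varyings are left unwritten).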
	if (UV2.x == 1.0) {
		barycentric_coordinates = vec3(1, 0, 0);
		color_one = COLOR.rgb;
	}
	else if (UV2.y == 1.0) {
		barycentric_coordinates = vec3(0, 1, 0);
		color_two = COLOR.rgb;
	}
	else {
		barycentric_coordinates = vec3(0, 0, 1);
		color_three = COLOR.rgb;
	}
}

void fragment() {
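	// Blend the three color varyings using the interpolated barycentric weights.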
	ALBEDO.rgb = barycentric_coordinates.x * color_one + barycentric_coordinates.y * color_two + barycentric_coordinates.z * color_three;
}

Here's what I had before: a globe rendered without barycentric coordinates (i.e. ALBEDO.rgb = COLOR.rgb).

And now, with the above barycentric coordinate code, the pixels around each vertex are definitely being rendered in that vertex's color. Which is great!

However, the centre of the triangle isn't the average of the vertex colors but instead is almost black. I'm guessing this is because I'm not setting all three colors in each case in the vertex shader, so it's blending in a whole bunch of black vec3(0, 0, 0) by default.

This causes other problems too: the triangles also flash with random colors for some reason. Sometimes all triangles flash at the same time, though in most frames it's only a few triangles that do this.

I think this is due to junk data (again, likely from not providing values for all three color varyings). The flashing gets worse when my computer is doing other things (like using the Windows snipping tool to take a screenshot), which I think supports the guess that it's junk data that the vertex shader isn't overwriting.

I've also messed up my southern hemisphere indices somehow.

It feels close to what I want, so I just need to figure out how the color setting should work. Oh, and to get the sharp color change I want, I need to switch from linear interpolation to choosing the nearest vertex instead.

Stevestevedore answered 19/9, 2021 at 11:28 Comment(0)

After fixing my indices (southern hemisphere was using the incorrect pattern for assigning barycentric extremes), I now have this:

shader_type spatial;

uniform vec4 blank_col = vec4(0, 0, 0, 0);

varying vec4 color_one;
varying vec4 color_two;
varying vec4 color_three;

// barycentric coordinates
varying vec3 bc;

void vertex() {
	if (UV2.x == 1.0) {
		bc = vec3(1, 0, 0);
		color_one = COLOR.rgba;
		color_two = blank_col;
		color_three = blank_col;
	}
	else if (UV2.y == 1.0) {
		bc = vec3(0, 1, 0);
		color_one = blank_col;
		color_two = COLOR.rgba;
		color_three = blank_col;
	}
	else {
		bc = vec3(0, 0, 1);
		color_one = blank_col;
		color_two = blank_col;
		color_three = COLOR.rgba;
	}
}

void fragment() {
	// Color taken from nearest vertex (largest barycentric coordinate)
	if (bc.x > bc.y && bc.x > bc.z) {
		// Vertex one is closest
		ALBEDO.rgb = color_one.rgb;
	}
	else if (bc.y > bc.z) {
		// Vertex two is closest
		ALBEDO.rgb = color_two.rgb;
	}
	else {
		// Vertex three is closest
		ALBEDO.rgb = color_three.rgb;
	}
}

Which produces this:

I'm writing a 'blank color' to two of the color varyings in each case, in an attempt to stop, say, vertex one's value from contaminating the varying that carries vertex two's color. This fixed the random flashing issue; however, it is not having the desired effect overall.

This does some things correctly:

  • Pixels around a vertex are very strongly that vertex's color
  • There are clear boundaries halfway between vertices, where the color changes, producing hexagonal shapes

But some things it does incorrectly:

  • Pixels are 'brightest' at vertices and 'darker' further from them (i.e. at triangle centres and at the halfway color-change boundaries between vertices)

This is occurring because each varying that I store a vertex color in is being interpolated/blended with my 'blank' color. For example, at the triangle centre the barycentric weights are all 1/3, so a pixel there gets only 1/3 of the intended color from the one vertex that wrote it; the other two vertices wrote zero into the same varying, so they add nothing and just cause an overall dimming effect.

Question: Is there any way for a varying to get values from some vertices but no value from other vertices... and, as a result, only interpolate between the valid values?

Stevestevedore answered 19/9, 2021 at 15:57 Comment(0)

When writing the above I realised what was necessary:

// Color taken from nearest vertex (largest barycentric coordinate)
if (bc.x > bc.y && bc.x > bc.z) {
	// Vertex one is closest
	ALBEDO.rgb = color_one.rgb * (1.0/bc.x);
}
else if (bc.y > bc.z) {
	// Vertex two is closest
	ALBEDO.rgb = color_two.rgb * (1.0/bc.y);
}
else {
	// Vertex three is closest
	ALBEDO.rgb = color_three.rgb * (1.0/bc.z);
}

I used the barycentric coordinate to determine how much the desired color had been scaled down by; e.g. at the triangle centre the color is only 1/3 of the nearest vertex's value (with the other two vertices providing zero). That meant I could reverse this by multiplying by 1 / barycentric_scalar, e.g. 1 / (1/3) = 3. This brings the color values back to whatever they were at the vertex itself.

Which produces this:

Stevestevedore answered 19/9, 2021 at 16:6 Comment(0)

Nice work. I forgot to mention, there is something called interpolation qualifiers: https://docs.godotengine.org/en/3.0/tutorials/shading/shading_language.html#interpolation-qualifiers. These should allow you to pass colors from the vertex to the pixel shader without the interpolation (by declaring the varying as flat).
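
A minimal sketch of what I mean:

shader_type spatial;

// Declared flat, this varying is not interpolated across the triangle:
// every pixel gets the value from just one of the triangle's vertices.
varying flat vec3 flat_color;

void vertex() {
	flat_color = COLOR.rgb;
}

void fragment() {
	ALBEDO = flat_color;
}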

Butyrin answered 20/9, 2021 at 23:27 Comment(0)

I experimented with interpolation qualifiers, but there's something I couldn't figure out: for flat interpolation, which vertex does it use to decide the value?

For triangles, where you have three vertices, each interpolated pixel ends up with a weighted blend of the three vertex colors.

For flat, does it just pick one vertex and use its vertex color for all of the pixels in the triangle? (And if so, how does it decide which vertex?)

Or does it blend the colors to create an average-color and use that for all the pixels? (Which effectively would be the same color as the centre-pixel when interpolating).

Stevestevedore answered 21/9, 2021 at 20:16 Comment(0)

I just did a test. It looks like "flat" always uses the color of the 2nd vertex, which is not really useful in this case.

Butyrin answered 21/9, 2021 at 21:27 Comment(0)

Thanks for testing that, @cybereality. Always using the second vertex is a strange default behaviour!

My barycentric coordinates are working well for allowing me to blend vertex colors, and I've added noise so that where two vertex colors meet, the boundary is jagged (like a coastline where land meets sea). However, to add the noise I needed the fragment shader to have a value that is consistent across the whole triangle (but ideally unique to each triangle). I stored a vertex index in UV, and reconstructed it in the fragment function.

However when I change the "vertex index" values I get some very strange effects, where my sharp colour boundaries start to become fuzzy. Which is not what I want!

Here's a demo I created:

My MeshInstance is created like this:

extends MeshInstance

var mdt = MeshDataTool.new()

func _ready():
	build_mesh()

func build_mesh():
	var st = SurfaceTool.new();
	st.begin(Mesh.PRIMITIVE_TRIANGLES);
	st.add_smooth_group(true)

	# V1
	st.add_uv(Vector2(100.0, 0.0)) # Encode vertex id in UV.x
	st.add_uv2(Vector2(1, 0)) # Encode barycentric indicator in UV2
	st.add_color(Color(1, 0, 0))
	st.add_vertex(Vector3(-1, 0, 0))
	# V2
	st.add_uv(Vector2(200.0, 0.0))
	st.add_uv2(Vector2(0, 1))
	st.add_color(Color(0, 1, 0))
	st.add_vertex(Vector3(0, sqrt(3), 0))
	# V3
	st.add_uv(Vector2(30000.0, 0.0))
	st.add_uv2(Vector2(0, 0))
	st.add_color(Color(0, 0, 1))
	st.add_vertex(Vector3(1, 0, 0))

	st.generate_normals();
	st.generate_tangents();
	var mesh = st.commit();
	self.mesh = mesh
	mdt.create_from_surface(mesh, 0)

And my shader looks like this:

shader_type spatial;

uniform vec3 blank_col = vec3(0, 0, 0);

varying vec3 color_one;
varying vec3 color_two;
varying vec3 color_three;

varying float v_id_one;
varying float v_id_two;
varying float v_id_three;

// barycentric coordinates
varying vec3 bc;

// Noise function by Morgan McGuire https://www.shadertoy.com/view/4dS3Wd
float hash(float p) { p = fract(p * 0.011); p *= p + 7.5; p *= p + p; return fract(p); }
float hash2(vec2 p) {vec3 p3 = fract(vec3(p.xyx) * 0.13); p3 += dot(p3, p3.yzx + 3.333); return fract((p3.x + p3.y) * p3.z); }

// Noise function by Morgan McGuire https://www.shadertoy.com/view/4dS3Wd
float noise(float x) {
    float i = floor(x);
    float f = fract(x);
    float u = f * f * (3.0 - 2.0 * f);
    return mix(hash(i), hash(i + 1.0), u);
}

// Noise function by Morgan McGuire https://www.shadertoy.com/view/4dS3Wd
float fbm(float x) {
	float v = 0.0;
	float a = 0.5;
	float shift = float(100);
	for (int i = 0; i < 5; ++i) {
		v += a * noise(x);
		x = x * 2.0 + shift;
		a *= 0.5;
	}
	return v;
}

float large_wave(float x, float offset) {
	float amplitude = 1.0;
	float period = 4.0;
	return sin(x + offset) * sin(period * 2.0 * x + offset * 2.0) * amplitude;
}

float noisy_detail(float x, float offset) {
	return fbm(6.0*x + offset);
}

float mid_pass_filter(float x) {
	return sin(3.0 * x) * 0.05;
}

float bias(float x, float offset) {
	return mid_pass_filter(x) * (large_wave(x, offset)/2.0 + noisy_detail(2.0*x, offset-1.0));
}

void vertex() {
	if (UV2.x == 1.0) {
		bc = vec3(1.0, 0, 0);
		color_one = COLOR.rgb;
		color_two = blank_col;
		color_three = blank_col;
		v_id_one = UV.x;
		v_id_two = 0.0;
		v_id_three = 0.0;
	}
	else if (UV2.y == 1.0) {
		bc = vec3(0, 1.0, 0);
		color_one = blank_col;
		color_two = COLOR.rgb;
		color_three = blank_col;
		v_id_one = 0.0;
		v_id_two = UV.x;
		v_id_three = 0.0;
	}
	else {
		bc = vec3(0, 0, 1.0);
		color_one = blank_col;
		color_two = blank_col;
		color_three = COLOR.rgb;
		v_id_one = 0.0;
		v_id_two = 0.0;
		v_id_three = UV.x;
	}
}

void fragment() {
	// Reconstruct individual vertex colors
	vec3 col_one = color_one * (1.0/bc.x);
	vec3 col_two = color_two * (1.0/bc.y);
	vec3 col_three = color_three * (1.0/bc.z);

	// Reconstruct individual vertex IDs
	float id_one = v_id_one * (1.0/bc.x);
	float id_two = v_id_two * (1.0/bc.y);
	float id_three = v_id_three * (1.0/bc.z);

	// Act separately on each min-triangle
	if (bc.x <= bc.y && bc.x <= bc.z) {
		// x min, therefore blend between y and z
		float half = (bc.y + bc.z)/2.0;
		float t = bc.x * 3.0;  // Scales boundary length to [0.0, 1.0] interval
		float k = (id_two + id_three)*id_one;
		float bias_yz = bias(t, k);
		if (bc.y - bias_yz < half) {
			ALBEDO = col_three;
		}
		else {
			ALBEDO = col_two;
		}
	}
	if (bc.y <= bc.x && bc.y <= bc.z) {
		// y min, therefore blend between x and z
		float half = (bc.x + bc.z)/2.0;
		float t = bc.y * 3.0;
		float k = (id_one + id_three) * id_two;
		float bias_xz = bias(t, k);
		if (bc.x + bias_xz < half) {
			ALBEDO = col_three;
		}
		else {
			ALBEDO = col_one;
		}
	}
	if (bc.z <= bc.x && bc.z <= bc.y) {
		// z min, therefore blend between x and y
		float half = (bc.x + bc.y)/2.0;
		float t = bc.z * 3.0;
		float k = fbm(id_one + id_two)+TIME;
		float bias_xy = bias(t, k);
		if (bc.x - bias_xy < half) {
			ALBEDO = col_two;
		}
		else {
			ALBEDO = col_one;
		}
	}

}

In the MeshInstance code the result looks fine if the UV.x values are, for example, 100.0, 200.0, 300.0. However, if vertex 3's UV.x gets too large then the result starts to look fuzzy and eventually breaks completely. I wondered if the cause was float instability when reconstructing vertex IDs, and it seems like this may well be the problem.

For example, if I add the following to the end of the fragment shader:

	if (id_three != 300.0) {
		ALBEDO = vec3(1.0, 1.0, 1.0);
	}

Then many pixels get colored white, even though a successful reconstruction would mean that all pixels agree that id_three == 300.0.

That has left me trying to figure out whether Godot's shader language will let me pass per-vertex integers into the fragment function, so I don't have to reconstruct them and run into this problem.

Stevestevedore answered 26/9, 2021 at 0:13 Comment(0)

Here's a screenshot of the white debug pixels I described:

Each white pixel is a pixel which calculated the incorrect value for id_three (i.e. it was set as 300.0 in the MeshInstance code, but reconstructed to something that was not 300.0 in the fragment shader).

Stevestevedore answered 26/9, 2021 at 0:17 Comment(0)

So I'm not sure I 100% understand the code. But two things come to mind. With float values, you can almost never compare for equality and expect it to work. So if you store "300.0" and then access it again, it may be "300.000001" or "299.999999" or something like that. And if you compare "300.000001" to "300.0" it will not be equal. What you need is to use an epsilon like this:

float eps = 0.001; // good default but adjust if needed
if (id_three > 300.0 - eps && id_three < 300.0 + eps) {
   // they are really close to equal
}

Also, UV values typically store numbers between 0 and 1 (or maybe slightly outside that, but not far outside). I'm not sure what precision they use internally, but storing large values may introduce too much instability. It would be better to keep the values low, say ids from 0 to 100, stored 0.01 apart in the UV. In that case, you could multiply the UV value by 100 in the fragment shader to get the value you want. I've never tried this, but I think it should be possible.
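
For example (untested, and combining it with your 1.0 / bc reconstruction trick):

// Assuming the GDScript side stored id * 0.01 in UV.x (e.g. 0.03 for id 3),
// undo the barycentric scaling, multiply back up, and snap to a whole number
// so small interpolation errors can't break exact comparisons.
float id_three = round(v_id_three * (1.0 / bc.z) * 100.0);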

Butyrin answered 26/9, 2021 at 0:57 Comment(0)
