Dot Product and Luminance / FindMyiCone

All,

I have a basic question that I am struggling with here. When you look at the FindMyiCone sample code from WWDC 2010, you will see this:

static const uint8_t orangeColor[] = {255, 127, 0};
uint8_t referenceColor[3];

// Remove luminance
static inline void normalize( const uint8_t colorIn[], uint8_t colorOut[] ) {

// Dot product
int sum = 0;
for (int i = 0; i < 3; i++)
sum += colorIn[i] / 3;

for (int j = 0; j < 3; j++)
colorOut[j] = (float) ((colorIn[j] / (float) sum) * 255);
}

And then it is called:

normalize(orangeColor, referenceColor);

Running it in the debugger, I can see it converting BGRA (Red 255, Green 127, Blue 0) to (Red 0, Green 255, Blue 0). I have looked on the web and on SO for details on luminance and the dot product, but there is really no information.

1- Can someone guide me on what this function is doing?

2- Can you guide me to some helpful topics/primer online as well?

Thanks again, KMB

Lovejoy answered 10/10, 2012 at 17:31 Comment(0)

What they're trying to do is track a particular color across variations in brightness, so they're normalizing for the luminance of the color. I do something similar in the fragment shader I use in a color tracking example based on a GPU Gems paper from Apple, as well as in the ColorObjectTracking sample application in my GPUImage framework:

vec3 normalizeColor(vec3 color)
{
    // Divide each channel by the average of the three channels (the dot product
    // with vec3(1/3)), clamped to at least 0.3 so dark pixels don't blow up.
    return color / max(dot(color, vec3(1.0/3.0)), 0.3);
}

vec4 maskPixel(vec3 pixelColor, vec3 maskColor)
{
    float  d;
    vec4   calculatedColor;

    // Compute distance between current pixel color and reference color
    d = distance(normalizeColor(pixelColor), normalizeColor(maskColor));

    // If color difference is larger than threshold, return black.
    calculatedColor =  (d > threshold)  ?  vec4(0.0)  :  vec4(1.0);

    // Multiply color by texture
    return calculatedColor;
}

The above calculation takes the average of the three color components by multiplying each channel by 1/3 and then summing them (that's what the dot product does here). It then divides each color channel by this average to arrive at a normalized color.
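
As a rough CPU-side illustration of the same math (a sketch in plain C, using floats throughout so nothing is truncated to an integer or wrapped into a uint8_t; the function name is my own), normalizing the reference orange works out like this:

#include <stdio.h>

// Divide each channel by the average of the three channels
// (the average is the dot product with {1/3, 1/3, 1/3}).
static void normalizeColorF(const float in[3], float out[3])
{
    float average = (in[0] + in[1] + in[2]) / 3.0f;
    for (int i = 0; i < 3; i++)
        out[i] = in[i] / average;
}

int main(void)
{
    float orange[3] = {255.0f, 127.0f, 0.0f};
    float normalized[3];
    normalizeColorF(orange, normalized);
    // average = (255 + 127 + 0) / 3 ≈ 127.33, so normalized ≈ {2.00, 1.00, 0.00}
    printf("%.2f %.2f %.2f\n", normalized[0], normalized[1], normalized[2]);
    return 0;
}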

The distance between this normalized color and the target one is calculated, and if it is within a certain threshold the pixel is marked as being of that color.
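
A minimal sketch of that comparison (again plain C; colorDistance, matchesReference, and the threshold are my own names, and both inputs are assumed to be already-normalized colors):

#include <math.h>
#include <stdbool.h>

// Euclidean distance between two normalized colors.
static float colorDistance(const float a[3], const float b[3])
{
    float dr = a[0] - b[0], dg = a[1] - b[1], db = a[2] - b[2];
    return sqrtf(dr * dr + dg * dg + db * db);
}

// A pixel counts as "that color" if its normalized color is close enough
// to the normalized reference color.
static bool matchesReference(const float pixel[3], const float reference[3],
                             float threshold)
{
    return colorDistance(pixel, reference) <= threshold;
}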

This is just one way of determining proximity of one color to another. Another way is to convert the RGB values into Y, Cr, and Cb (Y, U, and V) components and then take the distance between just the chrominance portions (Cr and Cb):

 vec4 textureColor = texture2D(inputImageTexture, textureCoordinate);
 vec4 textureColor2 = texture2D(inputImageTexture2, textureCoordinate2);

 float maskY = 0.2989 * colorToReplace.r + 0.5866 * colorToReplace.g + 0.1145 * colorToReplace.b;
 float maskCr = 0.7132 * (colorToReplace.r - maskY);
 float maskCb = 0.5647 * (colorToReplace.b - maskY);

 float Y = 0.2989 * textureColor.r + 0.5866 * textureColor.g + 0.1145 * textureColor.b;
 float Cr = 0.7132 * (textureColor.r - Y);
 float Cb = 0.5647 * (textureColor.b - Y);

 float blendValue = 1.0 - smoothstep(thresholdSensitivity, thresholdSensitivity + smoothing, distance(vec2(Cr, Cb), vec2(maskCr, maskCb)));

This code is what I use in a chroma keying shader, and it's based on a similar calculation that Apple uses in one of their sample applications. Which one is best can depend on the particular situation you're facing.
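
For reference, the same Y/Cr/Cb split and chroma distance can also be written on the CPU; this is just a sketch mirroring the constants in the shader above (the Chroma struct and function names are mine, and the RGB values are assumed to be in the 0..1 range):

#include <math.h>

typedef struct { float cr, cb; } Chroma;

// Chrominance of an RGB color, using the same constants as the shader.
static Chroma chromaOf(float r, float g, float b)
{
    float y = 0.2989f * r + 0.5866f * g + 0.1145f * b;  // luma
    Chroma c = { 0.7132f * (r - y), 0.5647f * (b - y) };
    return c;
}

// Distance between the chrominance of a pixel and of the key color;
// brightness (Y) is deliberately left out of the comparison.
static float chromaDistance(Chroma a, Chroma b)
{
    float dcr = a.cr - b.cr, dcb = a.cb - b.cb;
    return sqrtf(dcr * dcr + dcb * dcb);
}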

Bellbird answered 11/10, 2012 at 20:16 Comment(1)
Brad, thanks. A lot of info here. I am digging into your ColorTracking app and your video and will come back with comments/questions. Thanks. – Lovejoy
