CIAreaHistogram inputScale factor

I'm building an application that uses the CIAreaHistogram Core Image filter. I use an inputCount value (number of buckets) of 10 for testing, and an inputScale value of 1.

I get the CIImage for the histogram itself, which I then run through a custom kernel (see end of post) to set the alpha values to 1 (since otherwise the alpha values from the histogram calculation are premultiplied), and then convert it to an NSBitmapImageRep.
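
In outline, the histogram setup looks something like this (a simplified sketch, not my exact code; sourceImage is a placeholder for the CIImage being analyzed):

// Sketch: build the 10-bucket histogram described above.
// `sourceImage` is a placeholder for the CIImage being analyzed.
CIFilter *histogramFilter = [CIFilter filterWithName:@"CIAreaHistogram"];
[histogramFilter setValue:sourceImage forKey:kCIInputImageKey];
[histogramFilter setValue:[CIVector vectorWithCGRect:sourceImage.extent]
                   forKey:kCIInputExtentKey];
[histogramFilter setValue:@10 forKey:@"inputCount"];
[histogramFilter setValue:@1.0 forKey:kCIInputScaleKey];
CIImage *histogramImage = histogramFilter.outputImage;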

I then scan through the image rep's buffer and print the RGB values (skipping the alpha values). However, when I do this, the sum of each channel's values across the 10 buckets does not necessarily come to 255.
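
The scanning itself is nothing fancy; roughly this (a sketch, assuming a 10x1, 8-bit RGBA bitmap rep; bitmapRep is the NSBitmapImageRep created from the kernel output, as shown further down):

// Sketch: walk the 10 histogram buckets in an 8-bit RGBA buffer and
// print each bucket's R, G and B components, skipping the alpha byte.
unsigned char *buf = [bitmapRep bitmapData];
NSInteger samplesPerPixel = [bitmapRep samplesPerPixel];   // 4 for RGBA
for (NSInteger i = 0; i < 10; i++) {
    unsigned char *pixel = buf + i * samplesPerPixel;
    NSLog(@"RGB: %d %d %d", pixel[0], pixel[1], pixel[2]);
}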

For example, with a fully black image, I apply the histogram, then the custom kernel, and get the following output:

RGB: 255 255 255
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0
RGB: 0 0 0

This is as I expect, since all pixels are black, so everything is in the first bucket. However, if I run the same algorithm with a color image, I get the following:

RGB: 98 76 81
RGB: 164 97 87
RGB: 136 161 69
RGB: 100 156 135
RGB: 80 85 185
RGB: 43 34 45
RGB: 31 19 8
RGB: 19 7 3
RGB: 12 5 2
RGB: 16 11 11

Sum the values for each of R, G, and B: none of the channels adds up to 255. This causes problems because I need to compare two of these histograms, and my algorithm expects each sum to be between 0 and 255. I could obviously scale these values, but I want to avoid that extra step for performance reasons.
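
(For completeness, the scaling step I would like to avoid is roughly this: a sketch, where redValues is a placeholder array holding the 10 raw red-channel bucket values.)

// Sketch: rescale one channel so its 10 bucket values sum to 255.
// `redValues` is a placeholder for the raw bucket values read above.
double scaledRed[10];
unsigned int sum = 0;
for (int i = 0; i < 10; i++)
    sum += redValues[i];
for (int i = 0; i < 10; i++)
    scaledRed[i] = (sum > 0) ? (redValues[i] * 255.0 / sum) : 0.0;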

I noticed something else interesting that might give a clue as to why this is happening. My first custom kernel simply sets the alpha value to 1. I then tried a second kernel (see end of post) that sets every pixel to red, so the green and blue components should clearly be zero. However, I get this result when checking the values from the bitmap rep:

RGB: 255 43 25

But I just set G and B to zero! This seems to be part of the problem and points to color management. Since I explicitly set the values in the kernel, there is only one block of code where this can be happening: the conversion from the filter's CIImage to an NSBitmapImageRep:

NSBitmapImageRep *bitmapRep = [[NSBitmapImageRep alloc] initWithCIImage:kernelOutput];
unsigned char *buf = [bitmapRep bitmapData];

Once I set the pixels to RGB 255 0 0 and then execute those lines and read the buffer, the RGB values come back as 255 43 25. I have also tried setting the color space of the original CGImageRef on which the entire workflow is based to kCGColorSpaceGenericRGB, thinking a color profile might be carrying through, but to no avail.
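
For reference, that attempt looked roughly like this (a sketch; originalCGImage is a placeholder for the CGImageRef the workflow starts from):

// Sketch of the attempt described above: re-tag the source CGImage with
// kCGColorSpaceGenericRGB before it enters the Core Image pipeline.
CGColorSpaceRef genericRGB = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
CGImageRef retagged = CGImageCreateCopyWithColorSpace(originalCGImage, genericRGB);
CIImage *sourceImage = [CIImage imageWithCGImage:retagged];
CGColorSpaceRelease(genericRGB);
CGImageRelease(retagged);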

Can anyone tell me why a CIFilter kernel would behave this way, and how I could solve it?

As mentioned before, here are copies of the CIFilter kernel functions I use. First, the one that sets alpha to 1:

kernel vec4 adjustHistogram(sampler src)
{
    // Pass the histogram pixel through unchanged, but force alpha to 1.0
    // so the RGB values are no longer affected by premultiplication.
    vec4 pix = sample(src, destCoord());
    pix.a = 1.0;
    return pix;
}

And next, the one that sets all pixels to RGB 255 0 0 but that comes out as 255 43 25 once it is converted to an NSBitmapImageRep:

kernel vec4 adjustHistogram(sampler src)
{
    // Debug version: sample the source, then overwrite every component
    // with opaque pure red (RGBA 1, 0, 0, 1).
    vec4 pix = sample(src, destCoord());
    pix.r = 1.0; pix.g = 0.0; pix.b = 0.0;
    pix.a = 1.0;
    return pix;
}
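
Both kernels are wrapped in a CIFilter subclass in the usual way; for context, the application code is roughly this (a simplified sketch, not my exact code; adjustKernel and inputImage stand in for the subclass's kernel and input image, with the kernel loaded once via +[CIKernel kernelsWithString:]):

// Sketch of how one of the kernels above is applied from a CIFilter
// subclass on OS X. `adjustKernel` and `inputImage` are placeholders.
- (CIImage *)outputImage
{
    CISampler *src = [CISampler samplerWithImage:inputImage];
    return [self apply:adjustKernel
             arguments:@[src]
               options:@{kCIApplyOptionDefinition : [src definition]}];
}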

Thanks in advance for your help.

Cicada asked 20/3, 2013 at 19:29

You only need one line of code to generate and display a histogram when using a custom Core Image filter (or wherever you create a new CIImage object or replace an existing one):

return [CIFilter filterWithName:@"CIHistogramDisplayFilter"
                  keysAndValues:kCIInputImageKey, self.inputImage,
                                @"inputHeight", @100.0,
                                @"inputHighLimit", @1.0,
                                @"inputLowLimit", @0.0,
                                nil].outputImage;
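
For example, fed from CIAreaHistogram the whole chain might look like this (a sketch with placeholder variable names):

// Sketch: chain CIAreaHistogram into CIHistogramDisplayFilter.
// `sourceImage` is a placeholder for the image being analyzed.
CIImage *histogramData =
    [CIFilter filterWithName:@"CIAreaHistogram" keysAndValues:
        kCIInputImageKey, sourceImage,
        kCIInputExtentKey, [CIVector vectorWithCGRect:sourceImage.extent],
        @"inputCount", @256,
        kCIInputScaleKey, @1.0, nil].outputImage;

CIImage *histogramDisplay =
    [CIFilter filterWithName:@"CIHistogramDisplayFilter" keysAndValues:
        kCIInputImageKey, histogramData,
        @"inputHeight", @100.0,
        @"inputHighLimit", @1.0,
        @"inputLowLimit", @0.0, nil].outputImage;
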
Helldiver answered 19/7, 2015 at 9:44
