Recently, I've been trying to set up a CIColorCube filter on a CIImage to create a custom effect. Here's what I have now:
uint8_t color_cube_data[8 * 4] = {
    0,   0,   0,   1,
    255, 0,   0,   1,
    0,   255, 0,   1,
    255, 255, 0,   1,
    0,   0,   255, 1,
    255, 0,   255, 1,
    0,   255, 255, 1,
    255, 255, 255, 1
};
NSData *cube_data = [NSData dataWithBytes:color_cube_data length:8 * 4 * sizeof(uint8_t)];
CIFilter *filter = [CIFilter filterWithName:@"CIColorCube"];
[filter setValue:beginImage forKey:kCIInputImageKey];
[filter setValue:@2 forKey:@"inputCubeDimension"];
[filter setValue:cube_data forKey:@"inputCubeData"];
outputImage = [filter outputImage];
I've watched the WWDC 2012 Core Image session and searched the web, but there are very few resources on this filter, and what I have still doesn't work: the code above just returns a black image.
The documentation in Apple's developer library says:
This filter applies a mapping from RGB space to new color values that are defined in inputCubeData. For each RGBA pixel in inputImage the filter uses the R, G, and B values to index into a three-dimensional texture represented by inputCubeData. inputCubeData contains floating point RGBA cells that contain linear premultiplied values. The data is organized into inputCubeDimension number of xy planes, with each plane of size inputCubeDimension by inputCubeDimension. Input pixel components R and G are used to index the data in x and y respectively, and B is used to index in z. In inputCubeData the R component varies fastest, followed by G, then B.
However, this makes no sense to me. How does my inputCubeData need to be formatted?
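If I take the description literally, it sounds like the cells have to be floating point values in the 0.0–1.0 range (rather than the 0–255 bytes I'm passing), laid out with R varying fastest, then G, then B. Here's a sketch of what I think a 2×2×2 identity cube would look like under that reading; I haven't been able to confirm that this is actually what the filter expects:

// Guessed layout: float RGBA cells, values 0.0–1.0, premultiplied alpha,
// with R varying fastest, then G, then B. This 2x2x2 cube would map every
// color to itself (an identity cube).
float cube_values[8 * 4] = {
    // B = 0 plane (R fastest, then G)
    0.0f, 0.0f, 0.0f, 1.0f,
    1.0f, 0.0f, 0.0f, 1.0f,
    0.0f, 1.0f, 0.0f, 1.0f,
    1.0f, 1.0f, 0.0f, 1.0f,
    // B = 1 plane
    0.0f, 0.0f, 1.0f, 1.0f,
    1.0f, 0.0f, 1.0f, 1.0f,
    0.0f, 1.0f, 1.0f, 1.0f,
    1.0f, 1.0f, 1.0f, 1.0f
};
NSData *cube_data = [NSData dataWithBytes:cube_values
                                   length:8 * 4 * sizeof(float)];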