I am using CIFilter with CIHueBlendMode to blend an image (foreground) with a solid red layer (background).
I am doing the exact same thing in Photoshop CS6 with the Hue blend mode (I copied the foreground image and filled the background layer with the same red).
Unfortunately, the results are very different (and the same applies when comparing CIColorBlendMode, CIDifferenceBlendMode, and CISaturationBlendMode with their Photoshop counterparts).
My question is: is it me? Am I doing something wrong here, or are Core Image blend modes and Photoshop blend modes altogether different things?
// Blending the input image with a red image
CIFilter* composite = [CIFilter filterWithName:@"CIHueBlendMode"];
[composite setValue:inputImage forKey:@"inputImage"];
[composite setValue:redImage forKey:@"inputBackgroundImage"];
CIImage *outputImage = [composite outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
imageView.image = [UIImage imageWithCGImage:cgimg];
CGImageRelease(cgimg);
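For context, the snippet above assumes that inputImage, redImage, and a CIContext named context already exist. A minimal sketch of that setup might look like this (the image name is just a placeholder, and imageWithColor:inRect: is the helper shown below):
// Hypothetical setup for the snippet above: a default CIContext, a foreground
// image loaded from the bundle, and a red background matching its extent.
CIContext *context = [CIContext contextWithOptions:nil];
CIImage *inputImage = [CIImage imageWithCGImage:[UIImage imageNamed:@"foreground.png"].CGImage];
CIImage *redImage = [self imageWithColor:[UIColor redColor] inRect:[inputImage extent]];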
// This is how I create the red image:
- (CIImage *)imageWithColor:(UIColor *)color inRect:(CGRect)rect
{
    UIGraphicsBeginImageContext(rect.size);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(ctx, [color CGColor]);
    CGContextFillRect(ctx, rect);
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return [[CIImage alloc] initWithCGImage:image.CGImage options:nil];
}
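As a sanity check, the same red background can also be built entirely in Core Image, without the UIGraphics round trip. This is only an alternative sketch, not necessarily related to the discrepancy (inputImage here is the foreground image from above):
// Alternative: create an infinite-extent solid red CIImage and crop it
// to the foreground image's extent.
CIColor *red = [CIColor colorWithRed:1.0 green:0.0 blue:0.0];
CIImage *redImage = [[CIImage imageWithColor:red] imageByCroppingToRect:[inputImage extent]];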