I have a UIImage and want to shift its saturation by about +10%. Are there standard methods or functions that can be used for this?
There's a Core Image filter for this: CIColorControls. Just set its inputSaturation to < 1.0 to desaturate or > 1.0 to increase saturation.
E.g. here's a method I've added in a category on UIImage to desaturate an image:
- (UIImage *)imageDesaturated {
    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *ciimage = [CIImage imageWithCGImage:self.CGImage];
    CIFilter *filter = [CIFilter filterWithName:@"CIColorControls"];
    [filter setValue:ciimage forKey:kCIInputImageKey];
    [filter setValue:@0.0f forKey:kCIInputSaturationKey];
    CIImage *result = [filter valueForKey:kCIOutputImageKey];
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return image;
}
I'd suggest using
    UIImage *image = [UIImage imageWithCGImage:cgImage scale:self.scale orientation:self.imageOrientation];
instead, as it ensures the scale and orientation are maintained from the original, which is especially important when dealing with retina devices. – Quixote

Starting with a View-based Application Template, create a new subclass of UIView like so:
// header file
@interface DesatView : UIView {
    UIImage *image;
    float desaturation;
}
@property (nonatomic, retain) UIImage *image;
@property (nonatomic) float desaturation;
@end
// implementation file
#import "DesatView.h"
@implementation DesatView
@synthesize image, desaturation;
- (void)setDesaturation:(float)desat
{
    desaturation = desat;
    [self setNeedsDisplay];
}
- (id)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        self.backgroundColor = [UIColor clearColor]; // else background is black
        desaturation = 0.0; // default is no effect
    }
    return self;
}
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);
    CGContextTranslateCTM(context, 0.0, self.bounds.size.height); // flip image right side up
    CGContextScaleCTM(context, 1.0, -1.0);
    CGContextDrawImage(context, rect, self.image.CGImage);
    CGContextSetBlendMode(context, kCGBlendModeSaturation);
    CGContextClipToMask(context, self.bounds, image.CGImage); // restricts drawing to within alpha channel
    CGContextSetRGBFillColor(context, 0.0, 0.0, 0.0, desaturation);
    CGContextFillRect(context, rect);
    CGContextRestoreGState(context); // restore state to reset blend mode
}
@end
Now in your view controller's viewDidLoad method, put the view on screen and set its desaturation like this:
- (void)viewDidLoad {
    [super viewDidLoad];
    DesatView *dv = [[DesatView alloc] initWithFrame:CGRectZero];
    dv.image = [UIImage imageNamed:@"someImage.png"];
    dv.frame = CGRectMake(0, 0, dv.image.size.width, dv.image.size.height);
    dv.center = CGPointMake(160, 240); // put it mid-screen
    dv.desaturation = 0.2; // desaturate by 20%
    [self.view addSubview:dv]; // put it on screen
}
Change the desaturation later like this:
dv.desaturation = 0.8; // desaturate by 80%
Obviously if you want to use it outside of a single method, you should make dv an ivar of the view controller. Hope this helps.
Swift 5
extension UIImage {
    func withSaturationAdjustment(byVal: CGFloat) -> UIImage {
        guard let cgImage = self.cgImage else { return self }
        guard let filter = CIFilter(name: "CIColorControls") else { return self }
        filter.setValue(CIImage(cgImage: cgImage), forKey: kCIInputImageKey)
        filter.setValue(byVal, forKey: kCIInputSaturationKey)
        guard let result = filter.value(forKey: kCIOutputImageKey) as? CIImage else { return self }
        guard let newCgImage = CIContext(options: nil).createCGImage(result, from: result.extent) else { return self }
        return UIImage(cgImage: newCgImage, scale: self.scale, orientation: imageOrientation) // preserve the original's scale
    }
}
Here is an implementation of Bessey's hack (put this code in a UIImage category). It ain't fast and it definitely shifts hues, but it sort of works.
+ (CGFloat)clamp:(CGFloat)pixel
{
    if (pixel > 255) return 255;
    else if (pixel < 0) return 0;
    return pixel;
}
- (UIImage *)saturation:(CGFloat)s
{
    // Note: assumes 32-bit RGBA pixel data.
    CGImageRef inImage = self.CGImage;
    CFDataRef ref = CGDataProviderCopyData(CGImageGetDataProvider(inImage));
    UInt8 *buf = (UInt8 *)CFDataGetBytePtr(ref);
    CFIndex length = CFDataGetLength(ref);
    for (CFIndex i = 0; i < length; i += 4)
    {
        int r = buf[i];
        int g = buf[i+1];
        int b = buf[i+2];
        CGFloat avg = (r + g + b) / 3.0;
        buf[i]   = [UIImage clamp:(r - avg) * s + avg];
        buf[i+1] = [UIImage clamp:(g - avg) * s + avg];
        buf[i+2] = [UIImage clamp:(b - avg) * s + avg];
    }
    CGContextRef ctx = CGBitmapContextCreate(buf,
                                             CGImageGetWidth(inImage),
                                             CGImageGetHeight(inImage),
                                             CGImageGetBitsPerComponent(inImage),
                                             CGImageGetBytesPerRow(inImage),
                                             CGImageGetColorSpace(inImage),
                                             CGImageGetAlphaInfo(inImage));
    CGImageRef img = CGBitmapContextCreateImage(ctx);
    UIImage *result = [UIImage imageWithCGImage:img];
    CFRelease(ref);
    CGContextRelease(ctx);
    CGImageRelease(img); // don't leak the intermediate CGImage
    return result;
}
Anyone have any ideas on how to improve this without doing a full HSV conversion? Or better yet, a true implementation for:
- (UIImage*) imageWithHueOffset:(CGFloat)h saturation:(CGFloat)s value:(CGFloat)v
Nothing quite that straightforward. The easiest solution is probably to make an in-memory CGContext with a known pixel format, draw the image into that context, then read/modify the pixels in the known-format buffer.
I don't think CG supports a color space with a separate saturation channel, so you'll have to either convert from RGB to HSV or HSL, or do the calculations directly in the RGB space.
One way to do the calculation directly in RGB might be something like this:
average = (R + G + B) / 3;
red_delta = (R - average) * 11 / 10;
green_delta = (G - average) * 11 / 10;
blue_delta = (B - average) * 11 / 10;
R = average + red_delta;
G = average + green_delta;
B = average + blue_delta;
// clip R,G,B to be in 0-255 range
This will move the channels that are away from the mean about 10% further away. This is almost like increasing the saturation, though it'll give hue shifts for some colors.
I've just tested Mike Pollard's method and it is correct. Here is the Swift 3 version:
let ciimage = CIImage(cgImage: self.givenImage.cgImage!)
let filter = CIFilter(name: "CIColorControls")
filter?.setValue(ciimage, forKey: kCIInputImageKey)
filter?.setValue(0.0, forKey: kCIInputSaturationKey)
let result = filter?.value(forKey: kCIOutputImageKey) as! CIImage
let cgimage = CIContext(options: nil).createCGImage(result, from: result.extent)
let image = UIImage(cgImage: cgimage!)
Since iOS 13.0 you can use the built-in type-safe API (no more hardcoded strings):
import CoreImage.CIFilterBuiltins

extension UIImage {
    /// Returns a desaturated copy of the image.
    func desaturated() -> UIImage? {
        let filter = CIFilter.colorControls()
        filter.inputImage = CIImage(image: self)
        filter.saturation = 0 // 0 means fully desaturated
        guard let outputCIImage = filter.outputImage else { return nil }
        return UIImage(ciImage: outputCIImage)
    }
}
cgImage, though inserting CFAutorelease(cgImage); before the return should fix that. You can also use kCIInputImageKey and kCIInputSaturationKey instead of @"inputImage" and @"inputSaturation", and @0.0f instead of [NSNumber numberWithFloat:0.0f]. – Pasadis