How to get the pixel color on touch?
I know this is a common question with a lot of answers, and I've tried several of them (although many are the same). Sadly, none of them has worked for me. Here is the code I've used so far.

-(void)getRGBAsFromImage:(UIImage*)image atX:(int)xx andY:(int)yy
{
    // First get the image into your data buffer
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = (unsigned char*) calloc(height * width * 4, sizeof(unsigned char));
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now your rawData contains the image data in the RGBA8888 pixel format.
    NSUInteger byteIndex = (bytesPerRow * yy) + (xx * bytesPerPixel);

    CGFloat red   = rawData[byteIndex]     / 255.0;
    CGFloat green = rawData[byteIndex + 1] / 255.0;
    CGFloat blue  = rawData[byteIndex + 2] / 255.0;
    CGFloat alpha = rawData[byteIndex + 3] / 255.0;
    NSLog(@"the value of red is %f", red);

    demoColor.tintColor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
    free(rawData);
}
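The RGBA8888 byte-offset arithmetic this snippet relies on (`bytesPerRow * y + x * bytesPerPixel`) can be sanity-checked on a tiny synthetic buffer. This is an illustrative sketch only; the `rgbaOffset` helper and the 2x2 buffer are made up, not code from the question:

```swift
import Foundation

// Hypothetical helper, not from the question: the byte offset of pixel
// (x, y) in a tightly packed RGBA8888 buffer, using the same formula as
// the Objective-C code above: bytesPerRow * y + x * bytesPerPixel.
func rgbaOffset(x: Int, y: Int, width: Int) -> Int {
    let bytesPerPixel = 4
    return (bytesPerPixel * width) * y + x * bytesPerPixel
}

// A 2x2 test "image": four distinct solid-color pixels.
let buffer: [UInt8] = [
    255, 0, 0, 255,    // (0,0) red
    0, 255, 0, 255,    // (1,0) green
    0, 0, 255, 255,    // (0,1) blue
    255, 255, 0, 255,  // (1,1) yellow
]
let off = rgbaOffset(x: 0, y: 1, width: 2)  // second row, first pixel
// buffer[off], buffer[off+1], buffer[off+2] are 0, 0, 255: the blue pixel.
```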

Here is another approach I've tried:

- (CGContextRef) createARGBBitmapContextFromImage:(CGImageRef) inImage {

    CGContextRef    context = NULL;
    CGColorSpaceRef colorSpace;
    void *          bitmapData;
    int             bitmapByteCount;
    int             bitmapBytesPerRow;

    // Get image width, height. We'll use the entire image.
    size_t pixelsWide = CGImageGetWidth(inImage);
    size_t pixelsHigh = CGImageGetHeight(inImage);

    // Declare the number of bytes per row. Each pixel in the bitmap in this
    // example is represented by 4 bytes; 8 bits each of red, green, blue, and
    // alpha.
    bitmapBytesPerRow   = (pixelsWide * 4);
    bitmapByteCount     = (bitmapBytesPerRow * pixelsHigh);

    // Use the generic RGB color space.
    //colorSpace = CGColorSpaceCreateWithName(kCGColorSpaceGenericRGB);
    colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
    {
        fprintf(stderr, "Error allocating color space\n");
        return NULL;
    }

    // Allocate memory for image data. This is the destination in memory
    // where any drawing to the bitmap context will be rendered.
    bitmapData = malloc( bitmapByteCount );
    if (bitmapData == NULL)
    {
        fprintf (stderr, "Memory not allocated!");
        CGColorSpaceRelease( colorSpace );
        return NULL;
    }

    // Create the bitmap context. We want pre-multiplied ARGB, 8-bits
    // per component. Regardless of what the source image format is
    // (CMYK, Grayscale, and so on) it will be converted over to the format
    // specified here by CGBitmapContextCreate.
    context = CGBitmapContextCreate (bitmapData,
                                     pixelsWide,
                                     pixelsHigh,
                                     8,      // bits per component
                                     bitmapBytesPerRow,
                                     colorSpace,
                                     kCGImageAlphaPremultipliedFirst);
    if (context == NULL)
    {
        free (bitmapData);
        fprintf (stderr, "Context not created!");
    }

    // Make sure and release colorspace before returning
    CGColorSpaceRelease( colorSpace );

    return context;
}


- (UIColor*) getPixelColorAtLocation:(CGPoint)point {
    UIColor* color = nil;
    //CGImageRef inImage = self.image.CGImage;
    CGImageRef inImage = [AppDelegate getInstance].capturedImage.CGImage;
    // Create an offscreen bitmap context to draw the image into. Format ARGB is 4 bytes per pixel: Alpha, Red, Green, Blue.
    CGContextRef cgctx = [self createARGBBitmapContextFromImage:inImage];
    if (cgctx == NULL) { return nil; /* error */ }

    size_t w = CGImageGetWidth(inImage);
    size_t h = CGImageGetHeight(inImage);
    CGRect rect = {{0,0},{w,h}}; 

    // Draw the image to the bitmap context. Once we draw, the memory
    // allocated for the context for rendering will then contain the
    // raw image data in the specified color space.
    CGContextDrawImage(cgctx, rect, inImage); 

    // Now we can get a pointer to the image data associated with the bitmap
    // context.
    unsigned char* data = CGBitmapContextGetData (cgctx);
    if (data != NULL) {
        //offset locates the pixel in the data from x,y.
        //4 for 4 bytes of data per pixel, w is width of one row of data.
        int offset = 4*((w*round(point.y))+round(point.x));
        int alpha =  data[offset];
        int red = data[offset+1];
        int green = data[offset+2];
        int blue = data[offset+3];
        NSLog(@"offset: %i colors: RGB A %i %i %i  %i",offset,red,green,blue,alpha);
        color = [UIColor colorWithRed:(red/255.0f) green:(green/255.0f) blue:(blue/255.0f) alpha:(alpha/255.0f)];
    }

    // When finished, release the context
    CGContextRelease(cgctx);
    // Free image data memory for the context
    if (data) { free(data); }

    return color;
}

But none of these has worked for me. Please help me get this working. Is there anything I'm missing?

I have two UIImageViews in my UI. The one in back contains the image from which I need to pick the color of the touched pixel, and the other UIImageView is used to paint over the back image with the picked color.
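One thing worth checking with a setup like this: the touch location is in the image view's point coordinate system, while the bitmap code indexes raw image pixels, so if the image's pixel size differs from the view's bounds the wrong pixel is sampled. A minimal sketch of the scaling, assuming the image is stretched to fill the view's bounds (the helper name is illustrative, not from the code above):

```swift
import Foundation  // CGPoint/CGSize come via Foundation here

// Illustrative helper (not part of the original code): map a touch point
// in view points to pixel coordinates in the image's bitmap, assuming the
// image is stretched to fill the view (scale-to-fill content mode).
func viewPointToPixel(_ point: CGPoint,
                      viewSize: CGSize,
                      imagePixelSize: CGSize) -> (x: Int, y: Int) {
    let px = point.x * imagePixelSize.width / viewSize.width
    let py = point.y * imagePixelSize.height / viewSize.height
    return (Int(px), Int(py))
}

// A 100x100-point view showing a 200x400-pixel image: the view's center
// maps to pixel (100, 200), not (50, 50).
let mapped = viewPointToPixel(CGPoint(x: 50, y: 50),
                              viewSize: CGSize(width: 100, height: 100),
                              imagePixelSize: CGSize(width: 200, height: 400))
```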

Any help will be very much appreciated.

Sergiosergipe answered 7/10, 2012 at 15:37 Comment(2)
"none of these has worked" isn't much of an explanation. What hasn't worked? Does it crash? Return the color of the wrong pixel? (Allwein)
Sorry for not being specific. It didn't crash; it returns the wrong pixel color, and I can't even tell which pixel it is returning the color of. (Sergiosergipe)

This is the one I've used, and it looks simpler than the methods you've tried.

In my custom view class, I have this:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [[event allTouches] anyObject];
    CGPoint loc = [touch locationInView:self];
    self.pickedColor = [self colorOfPoint:loc];
}

colorOfPoint is a method in a category on UIView, with this code:

#import "UIView+ColorOfPoint.h"
#import <QuartzCore/QuartzCore.h>

@implementation UIView (ColorOfPoint)

-(UIColor *) colorOfPoint:(CGPoint)point
{
    unsigned char pixel[4] = {0};
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pixel, 1, 1, 8, 4,
            colorSpace, (CGBitmapInfo)kCGImageAlphaPremultipliedLast);

    // Shift the context so the requested point lands at (0, 0), then
    // render the layer into the 1x1 bitmap.
    CGContextTranslateCTM(context, -point.x, -point.y);
    [self.layer renderInContext:context];

    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    return [UIColor colorWithRed:pixel[0]/255.0
                           green:pixel[1]/255.0
                            blue:pixel[2]/255.0
                           alpha:pixel[3]/255.0];
}

Don't forget to import the category into the custom view class and add the QuartzCore framework.


Trivial note for 2013: cast that last argument as (CGBitmapInfo) to avoid an implicit conversion warning: example here. Hope it helps.

Aerotherapeutics answered 7/10, 2012 at 16:36 Comment(1)
This does not take alpha into account: if alpha is ≠ 1.0, the color components are off because the bitmap context stores premultiplied values. (Nunhood)
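To expand on that comment: with a premultipliedLast context, each stored channel byte is the channel value scaled by alpha, so the straight color can be recovered by dividing by alpha. A sketch under that assumption; the `unpremultiply` helper is illustrative, not from any answer here:

```swift
// With premultiplied alpha, stored bytes are r*a/255, g*a/255, b*a/255.
// Dividing by alpha (scaled back to the 0-255 range) recovers the straight
// channel values; alpha == 0 carries no color information at all.
func unpremultiply(r: Int, g: Int, b: Int, a: Int) -> (r: Int, g: Int, b: Int) {
    guard a > 0 else { return (0, 0, 0) }
    return (min(255, r * 255 / a), min(255, g * 255 / a), min(255, b * 255 / a))
}

// A 50%-transparent mid-gray pixel stored premultiplied as (64, 64, 64, 128)
// un-premultiplies back to roughly (127, 127, 127).
let straight = unpremultiply(r: 64, g: 64, b: 64, a: 128)
```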

Swift 4, Xcode 9 - With a UIImageView Extension

This is a combination of all of the answers above, but packaged as an extension:

extension UIImageView {
    func getPixelColorAt(point:CGPoint) -> UIColor{
        
        let pixel = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: 4)
        let colorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        
        context!.translateBy(x: -point.x, y: -point.y)
        layer.render(in: context!)
        let color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0,
                                    green: CGFloat(pixel[1])/255.0,
                                    blue: CGFloat(pixel[2])/255.0,
                                    alpha: CGFloat(pixel[3])/255.0)
        
        pixel.deallocate(capacity: 4)
        return color
    }
}

How to use

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    let touch = touches.first
    if let point = touch?.location(in: view) {
        let color = myUIImageView.getPixelColorAt(point: point)
        print(color)
    }
}
Superintend answered 4/5, 2017 at 2:3 Comment(5)
What is layer.render(in: context!)? It says unresolved identifier "layer". (Dramatic)
Wow, thanks! The code works. But I keep wondering: is there no simpler way? Or is there no easy way to access pixels because we are not supposed to do that? (Banderillero)
@Illep, layer is a property of the UIImageView. You will get this error if you used "class" instead of "extension" at the very beginning. (Superintend)
Be aware you may want to call super.touchesBegan() in the override. See discussion at developer.apple.com/documentation/uikit/uiresponder/… (Burnight)
You are the best 😊🙏 @MarkMoeykens (Hereld)

Thanks to @Aggressor's post for the code above.

Swift 2.1

func getPixelColorAtPoint(point:CGPoint) -> UIColor{

   let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
   let colorSpace = CGColorSpaceCreateDeviceRGB()
   let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedLast.rawValue)
   let context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, bitmapInfo.rawValue)

   CGContextTranslateCTM(context, -point.x, -point.y)
   view.layer.renderInContext(context!)
   let color:UIColor = UIColor(red: CGFloat(pixel[0])/255.0, green: CGFloat(pixel[1])/255.0, blue: CGFloat(pixel[2])/255.0, alpha: CGFloat(pixel[3])/255.0)

   pixel.dealloc(4)
   return color
}

Swift 3, Xcode Version 8.2 (8C38) and Swift 4, Xcode Version 9.1 (9B55)

 func getPixelColorAtPoint(point: CGPoint, sourceView: UIView) -> UIColor {

    let pixel = UnsafeMutablePointer<CUnsignedChar>.allocate(capacity: 4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
    let context = CGContext(data: pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
    var color = UIColor.clear

    if let context = context {
        context.translateBy(x: -point.x, y: -point.y)
        sourceView.layer.render(in: context)

        color = UIColor(red: CGFloat(pixel[0])/255.0,
                        green: CGFloat(pixel[1])/255.0,
                        blue: CGFloat(pixel[2])/255.0,
                        alpha: CGFloat(pixel[3])/255.0)
    }

    // Deallocate unconditionally so the buffer isn't leaked when the
    // context can't be created.
    pixel.deallocate(capacity: 4)
    return color
}
Insalivate answered 25/12, 2015 at 8:16 Comment(4)
Your code returns the correct pixel value on the simulator, but on a device it returns wrong values. (Deltoro)
@Deltoro Thanks for the notice; I will fix it soon. (Insalivate)
Hello, I just tested the code above, and it seems correct on both the simulator and a device. Here is my test link. (Insalivate)
You are welcome, @Deltoro. (Insalivate)

Great answer, rdelmar; this helped me A LOT!

Here is how I did the above in Swift:

override func touchesBegan(touches: NSSet, withEvent event: UIEvent)
{
    let touch = event.allTouches()!.anyObject() as UITouch
    let loc = touch.locationInView(self)
    let color = getPixelColorAtPoint(loc)
    println(color)
}

// Returns the color data of the pixel at the touched point.
func getPixelColorAtPoint(point: CGPoint) -> UIColor
{
    let pixel = UnsafeMutablePointer<CUnsignedChar>.alloc(4)
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(CGImageAlphaInfo.PremultipliedLast.rawValue)
    let context = CGBitmapContextCreate(pixel, 1, 1, 8, 4, colorSpace, bitmapInfo)

    CGContextTranslateCTM(context, -point.x, -point.y)
    layer.renderInContext(context)
    let color = UIColor(red: CGFloat(pixel[0])/255.0, green: CGFloat(pixel[1])/255.0, blue: CGFloat(pixel[2])/255.0, alpha: CGFloat(pixel[3])/255.0)

    pixel.dealloc(4)
    return color
}
Mcminn answered 2/1, 2015 at 18:34 Comment(0)

First, I'd like to thank the author of this code; it helped me a lot with my game project, as I was looking for this function to build a pixel-perfect hitbox (excluding areas where the alpha is 0). Here's a little update for Swift 5:

// Returns the RGBA values of a pixel in the view.
func getPixelColor(atPosition: CGPoint) -> UIColor {

    var pixel: [CUnsignedChar] = [0, 0, 0, 0]
    let colorSpace = CGColorSpaceCreateDeviceRGB()
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
    let context = CGContext(data: &pixel, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)

    context!.translateBy(x: -atPosition.x, y: -atPosition.y)
    layer.render(in: context!)
    let color = UIColor(red: CGFloat(pixel[0])/255.0,
                        green: CGFloat(pixel[1])/255.0,
                        blue: CGFloat(pixel[2])/255.0,
                        alpha: CGFloat(pixel[3])/255.0)

    return color

}

I had some issues with pixel.dealloc(4), as in Swift 5 it seems you can no longer pass a capacity when deallocating. I removed the (4), but it showed some weird behavior (as if dealloc() didn't free the whole array), which is why I switched to a plain array here.
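If you do want to keep the manual pointer instead of the array, the Swift 4.1+ pattern pairs allocate(capacity:) with a plain deallocate(), ideally in a defer so it runs on every exit path. A small sketch under that assumption (the demo function and its values are illustrative only):

```swift
// Swift 4.1+ removed deallocate(capacity:); a bare deallocate() frees the
// whole block returned by allocate(capacity:).
func readAlphaDemo() -> UInt8 {
    let pixel = UnsafeMutablePointer<UInt8>.allocate(capacity: 4)
    defer { pixel.deallocate() }  // runs on every exit from this function

    pixel.initialize(repeating: 0, count: 4)
    pixel[3] = 255  // e.g. an opaque alpha byte written by a bitmap context
    return pixel[3] // copied out before the deferred deallocate runs
}
```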

I didn't do an extension of UIView as in my project I have my own subclass, but this can be easily done.

The way I implement the code:

// Determines whether the touch is accepted by the object (by default,
// transparent areas and hidden objects are excluded). Override if needed.
func isHit(atPosition position:CGPoint) -> Bool
{

    // If the object is not hidden (isHidden) and the touched area is one
    // that is actually drawn (non-transparent), return true.
    if (!self.isHidden && self.getPixelColor(atPosition: position).cgColor.alpha != 0) {return true}
    else {return false}

}

I hope this can help.

Radicand answered 22/6, 2019 at 21:31 Comment(0)
import UIKit

class ViewController: UIViewController {
    
    let imageV = UIImageView(frame: CGRect(x: 0, y: 0, width: 223, height: 265))
    override func viewDidLoad() {
        super.viewDidLoad()
        imageV.center = view.center
        imageV.image = UIImage(named: "color_image")
        view.addSubview(imageV)
        // Do any additional setup after loading the view.
        
        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(imageTapped(tapGestureRecognizer:)))
        imageV.isUserInteractionEnabled = true
        imageV.addGestureRecognizer(tapGestureRecognizer)
    }
    
    
    @objc func imageTapped(tapGestureRecognizer: UITapGestureRecognizer)
    {
        
        let cgpoint = tapGestureRecognizer.location(in: view)
        let color : UIColor = colorOfPoint(point: cgpoint)
        print("Picked Color is:",color)
        let new = UIView(frame: CGRect(x: 10, y: 10, width: 50, height: 50))
        new.backgroundColor = color
        view.addSubview(new)
    }
    
    func colorOfPoint(point:CGPoint) -> UIColor
    {
        let colorSpace:CGColorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.premultipliedLast.rawValue)
        
        var pixelData:[UInt8] = [0, 0, 0, 0]
        
        let context = CGContext(data: &pixelData, width: 1, height: 1, bitsPerComponent: 8, bytesPerRow: 4, space: colorSpace, bitmapInfo: bitmapInfo.rawValue)
        context!.translateBy(x: -point.x, y: -point.y);
        self.view.layer.render(in: context!)
        
        let red:CGFloat = CGFloat(pixelData[0])/CGFloat(255.0)
        let green:CGFloat = CGFloat(pixelData[1])/CGFloat(255.0)
        let blue:CGFloat = CGFloat(pixelData[2])/CGFloat(255.0)
        let alpha:CGFloat = CGFloat(pixelData[3])/CGFloat(255.0)
        
        let color:UIColor = UIColor(red: red, green: green, blue: blue, alpha: alpha)
        return color
    }
}
Carpentry answered 21/10, 2021 at 6:44 Comment(1)
While this code may solve the question, including an explanation of how and why it solves the problem would really help to improve the quality of your post, and probably result in more up-votes. Remember that you are answering the question for future readers, not just the person asking now. Please edit your answer to add an explanation and give an indication of what limitations and assumptions apply. (Easily)

© 2022 - 2024 — McMap. All rights reserved.