cocoa: Read pixel color of NSImage

I have an NSImage. I would like to read the NSColor for a pixel at some x and y. Xcode seems to think that there is a colorAtX:y: method on NSImage, but calling it causes a crash saying that no such method exists on NSImage. I have seen examples where you create an NSBitmapImageRep and call the same method on that, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep: the pixels on the NSBitmapImageRep are different for some reason.

There must be a simple way to do this. It cannot be this complicated.

Mungovan asked 15/3, 2012 at 18:36 Comment(3)
In what way are the "pixels on the NSBitmapImageRep" different? I have done it that way before and gotten good results. — Become
See Rob Keniger's answer below. The origin of NSImage's coordinate system is the bottom left, and NSBitmapImageRep's is the top left. When you convert from one to the other, the image coordinates are flipped vertically. — Mungovan
possible duplicate of Get pixels and colours from NSImage — Phobos

Without seeing your code it's difficult to know what's going wrong.

You can create an NSBitmapImageRep from the image using the initWithData: initializer, passing in the image's TIFFRepresentation.

You can then get the pixel value using the method colorAtX:y:, which is a method of NSBitmapImageRep, not NSImage:

NSBitmapImageRep *imageRep = [[NSBitmapImageRep alloc] initWithData:[yourImage TIFFRepresentation]];
NSSize imageSize = [yourImage size];

// colorAtX:y: takes integer pixel indices, with the origin at the
// top left of the bitmap, so flip the y coordinate.
NSInteger y = (NSInteger)(imageSize.height - 100.0);
NSColor *color = [imageRep colorAtX:100 y:y];
[imageRep release]; // not needed under ARC

Note that you must adjust the y value because the colorAtX:y: method uses a coordinate system whose origin is at the top left of the image, whereas the NSImage coordinate system starts at the bottom left.

Alternatively, if the pixel is visible on-screen then you can use the NSReadPixel() function to get the color of a pixel in the current coordinate system.
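
For reference, a minimal sketch of the NSReadPixel() approach — assuming the call is made inside a view's drawRect: (or between lockFocus/unlockFocus) so that a drawing context is current; the point is an arbitrary example:

// Inside drawRect: of an NSView — a drawing context must be current.
// (50, 50) is an example point in the current coordinate system.
NSColor *pixelColor = NSReadPixel(NSMakePoint(50.0, 50.0));
if (pixelColor) {
    NSLog(@"Pixel color: %@", pixelColor);
}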

Stoop answered 16/3, 2012 at 1:18 Comment(4)
Ok, so I am trying it this way, but something seems off. Is the coordinate system of an NSBitmapImageRep different from an NSImage's? Everything seems to be flipped vertically. It would seem that NSImage is drawn from the bottom left (like views are by default) and the NSBitmapImageRep draws from the top left (thus flipping the image). I'm not actually drawing to the screen, so I can't see. — Mungovan
It appears you are correct: the coordinates are flipped. I've added code to account for that to my answer. — Stoop
Thanks for your help, Rob. I ran into another roadblock while editing the image and found that I had to create a new NSImage instance from the altered NSBitmapImageRep. So my code ended up along the lines of: create an NSBitmapImageRep from the NSImage, read pixels, change pixels, create an NSImage from the altered NSBitmapImageRep, assign the new NSImage to the view (see the first sketch after these comments). Thanks for the help! — Mungovan
Note that you might want to use imageRep.pixelsWide and imageRep.pixelsHigh, as these are resolution-independent. The size used in this example doesn't necessarily correspond to the pixels you are trying to read (see the second sketch below). — Longdrawnout
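
Here is a minimal sketch of the read-modify-write round trip Mungovan describes, assuming ARC and placeholder names (sourceImage, imageView); error handling is omitted:

// 1. Create a bitmap rep from the source image.
NSBitmapImageRep *rep = [[NSBitmapImageRep alloc] initWithData:[sourceImage TIFFRepresentation]];

// 2. Read and change pixels. Coordinates are integer pixel indices,
//    with the origin at the top left of the bitmap.
NSColor *oldColor = [rep colorAtX:10 y:10];
[rep setColor:[NSColor redColor] atX:10 y:10];

// 3. Build a new NSImage from the altered rep and hand it to the view.
NSImage *result = [[NSImage alloc] initWithSize:rep.size];
[result addRepresentation:rep];
imageView.image = result;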
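
And a sketch of the resolution-independent lookup Longdrawnout suggests, mapping a point in the image's coordinate space to a pixel in the bitmap (reusing yourImage and imageRep from the answer above):

// The image size is in points; the rep's pixelsWide/pixelsHigh are in
// pixels, and on a Retina (2x) image the two differ by a factor of two.
NSSize imageSize = [yourImage size];
CGFloat pointX = 100.0, pointY = 100.0; // example point, origin bottom left
NSInteger px = (NSInteger)(pointX * imageRep.pixelsWide / imageSize.width);
NSInteger py = (NSInteger)((imageSize.height - pointY) * imageRep.pixelsHigh / imageSize.height);
NSColor *color = [imageRep colorAtX:px y:py];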

The colorAtX:y: method of NSBitmapImageRep does not seem to use the device color space, which may lead to color values that are slightly different from what you actually see. Use this code to get the correct color in the current device color space:

[yourImage lockFocus]; // yourImage is just your NSImage variable
NSColor *pixelColor = NSReadPixel(NSMakePoint(1, 1)); // Or another point
[yourImage unlockFocus];
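
If you would rather stay with colorAtX:y:, one alternative (my suggestion, not part of the original answer) is to convert the returned color to the device color space explicitly:

NSColor *rawColor = [imageRep colorAtX:100 y:y];
// Convert from the bitmap's color space to the device color space;
// colorUsingColorSpace: returns nil if the conversion is not possible.
NSColor *deviceColor = [rawColor colorUsingColorSpace:[NSColorSpace deviceRGBColorSpace]];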
Necessarian answered 23/1, 2018 at 11:15 Comment(2)
Could you give a little more info? What does "not always lead to the exact correct color" mean? — Fears
@walteronassis: I changed my answer a bit. — Necessarian
