I have an NSImage and I would like to read the NSColor for a pixel at some x and y. Xcode seems to think there is a colorAtX:y: method on NSImage, but calling it crashes, saying that no such method exists on NSImage. I have seen examples where you create an NSBitmapImageRep and call the same method on that, but I have not been able to successfully convert my NSImage to an NSBitmapImageRep: the pixel values I read from the NSBitmapImageRep come out different for some reason.
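Here is roughly the conversion I am attempting, as a Swift sketch. The helper name `pixelColor` and the TIFF round-trip are my own guesses at the approach, not something I know to be correct:

```swift
import AppKit

// Sketch: convert an NSImage to an NSBitmapImageRep via its TIFF data,
// then sample a pixel. `pixelColor` is a hypothetical helper name.
func pixelColor(in image: NSImage, x: Int, y: Int) -> NSColor? {
    // colorAt(x:y:) lives on NSBitmapImageRep, not NSImage, so build a rep first.
    guard let tiff = image.tiffRepresentation,
          let rep = NSBitmapImageRep(data: tiff) else {
        return nil
    }
    return rep.colorAt(x: x, y: y)
}
```

This is the conversion that gives me unexpected pixel values; I am not sure whether the TIFF round-trip itself is what changes them.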
There must be a simple way to do this. It cannot be this complicated.