I'm trying to read pixels out of the screen buffer. I'm creating a CGImageRef with CGDisplayCreateImage, but the values returned by CGImageGetWidth and CGImageGetBytesPerRow don't make sense together: dividing the bytes per row by the bytes per pixel gives me 1376 pixels per row, but the width of the image is 1366.
What's going on here? Is there some kind of padding at the end of each row? How do I read the pixel data safely and get correct results?
Edit: The minimal code needed to reproduce this is the following:
#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

int main(int argc, const char * argv[])
{
    @autoreleasepool {
        CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());

        size_t width = CGImageGetWidth(image);
        size_t bpr = CGImageGetBytesPerRow(image);
        size_t bpp = CGImageGetBitsPerPixel(image);
        size_t bpc = CGImageGetBitsPerComponent(image);
        size_t bytes_per_pixel = bpp / bpc;

        // Prints 1376 1366 on my machine.
        NSLog(@"%zu %zu", bpr / bytes_per_pixel, width);

        CGImageRelease(image);
    }
    return 0;
}
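In case it helps, this is roughly how I'm reading individual pixels right now. My assumption is that the right approach is to use the bytes-per-row value as the row stride and simply ignore the trailing padding bytes, but I'd like to confirm that. The read_pixel helper, the (100, 100) coordinate, and the BGRA component order are just my guesses, not something I've verified:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

// Reads one pixel out of the image's backing data, stepping between rows
// by bytes-per-row rather than by width * bytes-per-pixel.
static void read_pixel(CGImageRef image, size_t x, size_t y)
{
    CGDataProviderRef provider = CGImageGetDataProvider(image);
    CFDataRef data = CGDataProviderCopyData(provider);
    const UInt8 *base = CFDataGetBytePtr(data);

    size_t bpr = CGImageGetBytesPerRow(image);
    size_t bytes_per_pixel = CGImageGetBitsPerPixel(image) / 8;

    // Using bpr as the stride, which (I assume) includes the padding that
    // accounts for the 1376 vs. 1366 discrepancy.
    const UInt8 *pixel = base + y * bpr + x * bytes_per_pixel;

    // I'm guessing the component order is BGRA here; that's part of what
    // I'd like to confirm.
    NSLog(@"(%zu, %zu): b=%u g=%u r=%u a=%u",
          x, y,
          (unsigned)pixel[0], (unsigned)pixel[1],
          (unsigned)pixel[2], (unsigned)pixel[3]);

    CFRelease(data);
}

int main(int argc, const char * argv[])
{
    @autoreleasepool {
        CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());
        read_pixel(image, 100, 100);
        CGImageRelease(image);
    }
    return 0;
}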