CGImageRef width doesn't agree with bytes-per-row

I'm trying to read pixels out of the screen buffer. I'm creating a CGImageRef with CGDisplayCreateImage, but the values returned by CGImageGetWidth and CGImageGetBytesPerRow don't agree: dividing the bytes per row by the bytes per pixel gives me 1376 pixels per row, but the width of the image is 1366.

What's going on here? Is there some kind of padding in the image? How do I read the data I'm getting out of it safely and get correct results?

Edit: The minimal code needed to reproduce this is the following:

#import <Foundation/Foundation.h>
#import <ApplicationServices/ApplicationServices.h>

int main(int argc, const char * argv[])
{

    @autoreleasepool {
        CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());

        size_t width = CGImageGetWidth(image);
        size_t bpr = CGImageGetBytesPerRow(image);
        size_t bpp = CGImageGetBitsPerPixel(image);
        size_t bpc = CGImageGetBitsPerComponent(image);
        size_t bytes_per_pixel = bpp / bpc;

        NSLog(@"%li %li", bpr/bytes_per_pixel, width);

        CGImageRelease(image);

    }
    return 0;
}
Rations answered 7/9, 2014 at 1:50 Comment(3)
I'm just getting the image from the display buffer.Rations
What are the image dimensions supposed to be?Scalene
Really the only thing that matters here is that the width of the data is different from the width of the content: I shouldn't get 1376 pixels per row while the width of the image is 1366.Rations

The bytes per row (also called the “stride”) can be larger than the width of the image. The extra bytes at the end of each row are simply ignored. The bytes for the pixel at (x, y) start at offset y * bytesPerRow + x * bytesPerPixel.
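For example, here's a minimal sketch of reading the pixel at (x, y) while honoring the stride (the coordinates and variable names are illustrative, and the component layout is assumed to be 32-bit BGRA as is typical for display captures; check CGImageGetBitmapInfo and CGImageGetBitsPerComponent before interpreting the bytes):

CGImageRef image = CGDisplayCreateImage(CGMainDisplayID());
CFDataRef data = CGDataProviderCopyData(CGImageGetDataProvider(image));
const uint8_t *bytes = CFDataGetBytePtr(data);

size_t bytesPerRow = CGImageGetBytesPerRow(image);
size_t bytesPerPixel = CGImageGetBitsPerPixel(image) / 8;

size_t x = 100, y = 200; // illustrative coordinates
const uint8_t *pixel = bytes + y * bytesPerRow + x * bytesPerPixel;
// pixel[0 .. bytesPerPixel-1] are that pixel's components; the padding
// bytes after width * bytesPerPixel in each row are simply never read.

CFRelease(data);
CGImageRelease(image);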

Notice that 1376 is exactly divisible by 32 (and all smaller powers of 2), while 1366 is not. The CPUs in modern Macs have instructions that operate on 16 or 32 or 64 bytes at a time, so the CGImage algorithms can be more efficient if the image's stride is a multiple of 16 or 32 or 64. CGDisplayCreateImage was written by someone who knows this. (Same for CGBitmapContextCreate / CGContext.init if you pass 0 for bytesPerRow.)
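You can see the same rounding directly with a small sketch (the dimensions and bitmap parameters here are just an example): create a bitmap context with bytesPerRow = 0 and ask what stride Core Graphics actually chose.

CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(NULL, 1366, 768, 8, 0, space,
        (CGBitmapInfo)kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Little);
// With bytesPerRow = 0, Core Graphics picks an aligned stride, so this
// typically prints something larger than 1366 * 4 = 5464.
NSLog(@"%zu", CGBitmapContextGetBytesPerRow(ctx));
CGContextRelease(ctx);
CGColorSpaceRelease(space);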

Dynamometer answered 7/9, 2014 at 2:26 Comment(2)
Thank you, this is exactly what I needed, the explanation of why it does it is also very welcome.Rations
As usual, Rob has stupendous answers.Genethlialogy
