Render a CVPixelBuffer to an NSView (macOS)
I have a CVPixelBuffer that I'm trying to efficiently draw on screen.

The inefficient way, converting it into an NSImage, works but is very slow, dropping about 40% of my frames.

Therefore, I've tried rendering it on-screen using CIContext's drawImage:inRect:fromRect:. The CIContext was initialized with an NSOpenGLContext whose view was set to my view controller's view. When I have a new image, I call the drawImage method, which doesn't spit out any errors... but doesn't display anything on screen either (it did log errors when my contexts were not correctly set up).

I've tried to find an example of how this is done on macOS, but everything seems to be for iOS nowadays.

EDIT:

Here's some of the code I am using; I've left out irrelevant sections.

In viewDidLoad I initialize the GL and CI contexts:

NSOpenGLPixelFormatAttribute pixelFormatAttr[] = {
    NSOpenGLPFAAllRenderers, 0   // use the NSOpenGL* constant, not the CGL kCGLPFA* one
};
NSOpenGLPixelFormat *glPixelFormat = [[NSOpenGLPixelFormat alloc] initWithAttributes:pixelFormatAttr];
NSOpenGLContext *glContext = [[NSOpenGLContext alloc] initWithFormat:glPixelFormat shareContext:nil];
glContext.view = self.view;

self.ciContext = [CIContext contextWithCGLContext:glContext.CGLContextObj pixelFormat:glPixelFormat.CGLPixelFormatObj colorSpace:nil options:nil];

Then, when a new frame is ready, I do:

dispatch_async(dispatch_get_main_queue(), ^{
    [vc.ciContext drawImage:ciImage inRect:vc.view.bounds fromRect:ciImage.extent];
    vc.isRendering = NO;
});

I am not sure I'm calling draw in the right place, but I can't seem to find out where this is supposed to go.
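For context, a sketch of the alternative structure I'm experimenting with: moving the draw into an NSOpenGLView subclass's drawRect:. The class and property names here are illustrative, not working code from my project:

```objc
#import <Cocoa/Cocoa.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGL/gl.h>

// Sketch: an NSOpenGLView subclass that draws the most recent CIImage
// in drawRect:. FrameGLView and frameImage are made-up names.
@interface FrameGLView : NSOpenGLView
@property (atomic, strong) CIImage *frameImage;
@property (nonatomic, strong) CIContext *ciContext;
@end

@implementation FrameGLView

- (void)drawRect:(NSRect)dirtyRect {
    [self.openGLContext makeCurrentContext];

    // Lazily create the CIContext from this view's own GL context.
    if (self.ciContext == nil) {
        self.ciContext = [CIContext contextWithCGLContext:self.openGLContext.CGLContextObj
                                              pixelFormat:self.pixelFormat.CGLPixelFormatObj
                                               colorSpace:nil
                                                  options:nil];
    }

    CIImage *image = self.frameImage;
    if (image) {
        // drawImage:inRect:fromRect: targets the GL surface in pixels,
        // so convert the view's bounds (points) to backing pixels; on a
        // Retina display this avoids a quarter-size, blurry result.
        NSRect pixelRect = [self convertRectToBacking:self.bounds];
        [self.ciContext drawImage:image inRect:pixelRect fromRect:image.extent];
    }

    glFlush();
}

@end
```

When a new frame arrives, the idea would be to set frameImage and mark the view as needing display on the main thread.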

Beardless answered 19/9, 2017 at 12:48 Comment(5)
I'm not very skilled with OS X, but I think you must make the GL context current (use [glContext makeCurrentContext];), and don't forget to swap buffers ([glContext flushBuffer];) after your rendering. - Mettle
There's logic to that, but it doesn't work (I tried it already). I did have a little breakthrough by moving all of that code inside an NSOpenGLView and calling the drawImage: method in its drawRect: method, with a glFlush() at the end. However, the image was displayed in the top-right quadrant of the view and was really blurry. The positioning reminds me of how the GL renderer on iOS works, so I do have an idea of how to fix that. - Beardless
It seems to me you're mixing OpenGL and OpenGL ES (that CIContext). There is plenty of info at the Apple Developer site. - Mettle
I got my info from Apple's documentation. The gist of it is that CIContext's drawImage: only works with contexts made from a CGLContext or a CGContext. glFlush() is a standard OpenGL call, which admittedly didn't have a place in my setup per se... Anyway, if I don't manage to find a simple way of rendering this through a CIContext, I'll go about making a pure OpenGL implementation and be done with it. - Beardless
CGLContext is Cocoa; AGLContext is Carbon. - Beardless

If the CVPixelBuffer has the kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey attribute, the backing IOSurface (retrieved via CVPixelBufferGetIOSurface) can be passed directly to the contents property of a CALayer.

This is probably the most efficient way to display a CVPixelBuffer.
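A minimal sketch of that approach. The width, height, and view names are assumptions; the key points are the pixel buffer attribute at creation time and the IOSurface-to-layer handoff per frame:

```objc
#import <Cocoa/Cocoa.h>
#import <CoreVideo/CoreVideo.h>
#import <QuartzCore/QuartzCore.h>

// 1. Create (or request from a pool) pixel buffers whose IOSurface
//    backing is Core Animation compatible. width/height are placeholders.
NSDictionary *attrs = @{
    (id)kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey : @YES
};
CVPixelBufferRef pixelBuffer = NULL;
CVPixelBufferCreate(kCFAllocatorDefault,
                    width, height,
                    kCVPixelFormatType_32BGRA,
                    (__bridge CFDictionaryRef)attrs,
                    &pixelBuffer);

// 2. Per frame, on the main thread, hand the backing IOSurface to a
//    layer-backed view's layer. No copy, no GL context needed.
IOSurfaceRef surface = CVPixelBufferGetIOSurface(pixelBuffer);
if (surface) {
    self.view.wantsLayer = YES;
    self.view.layer.contents = (__bridge id)surface;
}
```

If the frames come from a capture or decode API that already produces IOSurface-backed buffers, only step 2 is needed; the attribute just guarantees the backing surface exists and is CA compatible.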

Combustible answered 22/1, 2019 at 3:19 Comment(1)
That's interesting, I'll have a look as soon as possible to validate whether this works. This is quite old, but I remember finding a solution by going fully OpenGL (writing a renderer with a custom shader and all that); it worked flawlessly and very fast, but there was a lot of code to it. - Beardless
