How do I update a CALayer with a CVPixelBuffer/IOSurface?
I have an IOSurface-backed CVPixelBuffer that is getting updated from an outside source at 30fps. I want to render a preview of the image data in an NSView -- what's the best way for me to do that?

I can directly set the .contents of a CALayer on the view, but that only updates the first time my view updates (or if, say, I resize the view). I've been poring over the docs but I can't figure out the correct invocation of needsDisplay on the layer or view to let the view infrastructure know to refresh itself, especially when updates are coming from outside the view.

Ideally I'd just bind the IOSurface to my layer and any changes I make to it would be propagated, but I'm not sure if that's possible.

class VideoPreviewController: NSViewController, VideoFeedConsumer {
    let customLayer : CALayer = CALayer()
    
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do view setup here.
        print("Loaded our video preview")
        
        view.layer?.addSublayer(customLayer)
        customLayer.frame = view.frame
        
        // register our view with the browser service
        VideoFeedBrowser.instance.registerConsumer(self)
    }
    
    override func viewWillDisappear() {
        // deregister our view from the video feed
        VideoFeedBrowser.instance.deregisterConsumer(self)

        super.viewWillDisappear()
    }
    
    // This callback gets called at 30fps whenever the pixelbuffer is updated
    @objc func updateFrame(pixelBuffer: CVPixelBuffer) {

        guard let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() else {
            print("pixelbuffer isn't IOsurface backed! noooooo!")
            return;
        }

        // Try and tell the view to redraw itself with new contents?
        // These methods don't work
        //self.view.setNeedsDisplay(self.view.visibleRect)
        //self.customLayer.setNeedsDisplay()
        self.customLayer.contents = surface

    }
    
}

Here's my attempt at a scaling version that's NSView- rather than NSViewController-based, which also doesn't update correctly (or scale correctly, for that matter):

class VideoPreviewThumbnail: NSView, VideoFeedConsumer {
   

    required init?(coder decoder: NSCoder) {
        super.init(coder: decoder)
        self.wantsLayer = true
        
        // register our view with the browser service
        VideoFeedBrowser.instance.registerConsumer(self)
    }
    
    override init(frame frameRect: NSRect) {
        super.init(frame: frameRect)
        self.wantsLayer = true
        
        // register our view with the browser service
        VideoFeedBrowser.instance.registerConsumer(self)
    }
    
    deinit{
        VideoFeedBrowser.instance.deregisterConsumer(self)
    }
    
    override func updateLayer() {
        // Do I need to put something here?
        print("update layer")
    }
    
    @objc
    func updateFrame(pixelBuffer: CVPixelBuffer) {
        guard let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() else {
            print("pixelbuffer isn't IOsurface backed! noooooo!")
            return;
        }
        self.layer?.contents = surface
        self.layer?.transform = CATransform3DMakeScale(
            self.frame.width / CGFloat(CVPixelBufferGetWidth(pixelBuffer)),
            self.frame.height / CGFloat(CVPixelBufferGetHeight(pixelBuffer)),
            CGFloat(1))
    }

}

What am I missing?

Darrondarrow answered 16/11, 2020 at 19:24
Maybe I'm wrong, but I think you are updating your NSView from a background thread. (I suppose the callback to updateFrame arrives on a background thread.)

If so, when you want to update the NSView, convert your pixelBuffer to whatever you need (an NSImage?) and then dispatch the assignment onto the main thread.

Pseudocode (I don't work with CVPixelBuffer often, so I'm not sure this is the right way to convert to an NSImage):

let ciImage = CIImage(cvImageBuffer: pixelBuffer)
let context = CIContext(options: nil)

let width = CVPixelBufferGetWidth(pixelBuffer)
let height = CVPixelBufferGetHeight(pixelBuffer)

// createCGImage returns an optional, so unwrap it before building the NSImage
guard let cgImage = context.createCGImage(ciImage, from: CGRect(x: 0, y: 0, width: width, height: height)) else { return }

let nsImage = NSImage(cgImage: cgImage, size: CGSize(width: width, height: height))

DispatchQueue.main.async {
    // assign the NSImage to your NSView (e.g. an NSImageView) here
}

Another catch: I did some tests, and it seems that you cannot assign an IOSurface directly to the contents of a CALayer.

I tried with this:

    let textureImageWidth = 1024
    let textureImageHeight = 1024

    let macPixelFormatString = "ARGB"
    var macPixelFormat: UInt32 = 0
    for c in macPixelFormatString.utf8.reversed() {
       macPixelFormat *= 256
       macPixelFormat += UInt32(c)
    }

    let ioSurface = IOSurfaceCreate([kIOSurfaceWidth: textureImageWidth,
                    kIOSurfaceHeight: textureImageHeight,
                    kIOSurfaceBytesPerElement: 4,
                    kIOSurfaceBytesPerRow: textureImageWidth * 4,
                    kIOSurfaceAllocSize: textureImageWidth * textureImageHeight * 4,
                    kIOSurfacePixelFormat: macPixelFormat] as CFDictionary)!

    IOSurfaceLock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    let test = CIImage(ioSurface: ioSurface)
    IOSurfaceUnlock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    
    v1?.layer?.contents = ioSurface

Where v1 is my view. No effect.

Even with a CIImage there is no effect (only the last few lines change):

    IOSurfaceLock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    let test = CIImage(ioSurface: ioSurface)
    IOSurfaceUnlock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    
    v1?.layer?.contents = test

If I create a CGImage, it works:

    IOSurfaceLock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    let test = CIImage(ioSurface: ioSurface)
    IOSurfaceUnlock(ioSurface, IOSurfaceLockOptions.readOnly, nil)
    
    let context = CIContext.init()
    let img = context.createCGImage(test, from: test.extent)
    v1?.layer?.contents = img
Kareykari answered 4/12, 2020 at 11:43
That's a good thought. I've tried running all this from the main thread, with no difference. The other point is that you can directly assign an IOSurface to the .contents of a CALayer (which works); it's just the updating that doesn't happen. – Darrondarrow

Can you try adding a CATransaction.flush() after the .contents = ...? Always on the main thread. I was looking through some of my old projects (Objective-C) and found it there... maybe it helps. – Kareykari

@Darrondarrow I did some tests and added more details to the answer. I think you cannot assign the IOSurface directly to the CALayer contents; in my case it doesn't work even the first time, and I always have to go through a CGImage. Look at the "another catch" section in my answer for example code. – Kareykari

Did you try defining a PixelBuffer with a Core Animation-compatible IOSurface? That's what worked for me. Try var bufferAttributes = [kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey: true] as CFDictionary; let err = CVPixelBufferCreate(kCFAllocatorDefault, width, height, kCVPixelFormatType_32BGRA, bufferAttributes, &pb) See russbishop.net/cross-process-rendering – Darrondarrow
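Spelled out, the approach from that last comment looks roughly like this. This is a sketch, not a verified implementation: `width`, `height`, and the `layer` the surface is assigned to are assumed to exist elsewhere in your code.

```swift
// Create a CVPixelBuffer whose backing IOSurface is flagged as
// Core Animation compatible, so assigning it to layer.contents
// is picked up by the render server.
var pb: CVPixelBuffer?
let bufferAttributes = [
    kCVPixelBufferIOSurfaceCoreAnimationCompatibilityKey: true
] as CFDictionary

let err = CVPixelBufferCreate(kCFAllocatorDefault,
                              width, height,
                              kCVPixelFormatType_32BGRA,
                              bufferAttributes,
                              &pb)

if err == kCVReturnSuccess,
   let pixelBuffer = pb,
   let surface = CVPixelBufferGetIOSurface(pixelBuffer)?.takeUnretainedValue() {
    layer.contents = surface
}
```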
I encountered this problem myself, and the solution is to double buffer the IOSurface source: use two IOSurface objects instead of one, render to the current surface, set that surface as the layer contents, then on the next rendering pass render to the alternate (back/front) surface and swap.

It would appear that setting the CALayer.contents twice to the same CVPixelBufferRef has no effect. However, if you alternate between two IOSurfaceRefs it works wonderfully.

It may also be possible to invalidate the layer contents by setting it to nil and then resetting it. I did not try that case, as I am using the double-buffer technique.
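A minimal sketch of that double-buffering idea; the type and method names here are illustrative, and the `render` closure stands in for whatever actually draws the incoming frame into a surface:

```swift
final class SurfaceSwapper {
    // Two IOSurface-backed buffers: render into one while the other
    // is attached to the layer, and alternate every frame.
    private let surfaces: [IOSurface]
    private var index = 0

    init(front: IOSurface, back: IOSurface) {
        surfaces = [front, back]
    }

    func present(on layer: CALayer, render: (IOSurface) -> Void) {
        let current = surfaces[index]
        render(current)              // draw the new frame into the current surface
        layer.contents = current     // contents changes each call, so CA sees an update
        index = (index + 1) % surfaces.count
    }
}
```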

Rugby answered 4/12, 2021 at 5:32
If you have some IBActions that update it, create an observed variable with a didSet block and change its value whenever the IBAction is triggered. Remember to put the code you want to run on each update inside that block.

I'd suggest making the variable an Int, defaulting it to 0, and adding 1 to it on every update.

As for showing the image data in an NSView: you can use an NSImageView (which is an NSView subclass), and that does the job.
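A minimal sketch of that observed-variable idea; the class and property names are illustrative:

```swift
class PreviewView: NSImageView {
    // Illustrative: bump this counter whenever a new frame arrives;
    // the didSet block is where the per-update work goes.
    var frameCount: Int = 0 {
        didSet {
            needsDisplay = true   // or assign the new image here
        }
    }
}
```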

Checked answered 30/11, 2020 at 4:27
You need to convert the pixel buffer to a CGImage and assign that to the layer of the main view. Please try this code:

@objc
func updateFrame(pixelBuffer: CVPixelBuffer) {
    // Lock the buffer while we read its base address
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        print("pixelbuffer has no base address! noooooo!")
        return
    }
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    // Wrap the pixel data in a bitmap context and snapshot it as a CGImage
    guard let cgContext = CGContext(data: baseAddress,
                                    width: width,
                                    height: height,
                                    bitsPerComponent: 8,
                                    bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer),
                                    space: colorSpace,
                                    bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue),
          let cgImage = cgContext.makeImage() else { return }

    self.layer?.contents = cgImage
}
Hobble answered 5/12, 2020 at 15:30
This is similar to a solution I have working that uses VTCreateCGImageFromPixelBuffer, but it's very CPU-intensive. You can directly assign an IOSurface to CALayer.contents and it renders; it's just the marking-things-dirty part that I'm having a hard time with. – Darrondarrow

© 2022 - 2025 — McMap. All rights reserved.