Core Image - rendering a transparent image onto a CMSampleBufferRef results in a black box around it

I'm trying to add a watermark/logo to a video that I'm recording using AVFoundation's AVCaptureVideoDataOutput. My class is set as the sampleBufferDelegate and receives the CMSampleBufferRefs. I already apply some effects to each CMSampleBufferRef's CVPixelBuffer and pass it back to the AVAssetWriter.

The logo in the top-left corner comes from a transparent PNG. The problem I'm having is that the transparent parts of the UIImage turn black once written to the video. Does anyone have an idea of what I'm doing wrong or might be forgetting?
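
(For context, the pipeline works on BGRA pixel buffers; the exact capture settings below are assumptions on my part, but the output is configured along these lines:)

// Assumed setup: AVCaptureVideoDataOutput delivering BGRA frames to the delegate.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                   @(kCVPixelFormatType_32BGRA) };
[videoOutput setSampleBufferDelegate:self
                               queue:dispatch_queue_create("videoQueue", DISPATCH_QUEUE_SERIAL)];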

Code snippets below:

// somewhere in the init of the class:
_eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
_ciContext = [CIContext contextWithEAGLContext:_eaglContext
                                       options:@{ kCIContextWorkingColorSpace : [NSNull null] }];

// sampleBufferDelegate method:
- (void) captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

....

// Note: loading the logo on every frame is wasteful; shown inline for brevity.
UIImage *logoImage = [UIImage imageNamed:@"logo.png"];
CIImage *renderImage = [[CIImage alloc] initWithCGImage:logoImage.CGImage];
CGColorSpaceRef cSpace = CGColorSpaceCreateDeviceRGB();

// Render the logo into the frame's pixel buffer at the image's own extent.
[_ciContext render:renderImage
   toCVPixelBuffer:pixelBuffer
            bounds:[renderImage extent]
        colorSpace:cSpace];

CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
CGColorSpaceRelease(cSpace);

....
}

It looks like the CIContext does not draw the CIImage's alpha. Any ideas?
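
(Side note: render:toCVPixelBuffer:bounds:colorSpace: appears to overwrite the destination pixels inside the bounds rather than blend over them, which would explain the black box. One untested idea, sketched below, is to composite the logo over the frame in Core Image first, so only opaque pixels are written back:)

// Untested sketch: composite logo over the camera frame before rendering,
// so the pixels written back to the buffer are fully opaque.
CIImage *frameImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

CIFilter *overFilter = [CIFilter filterWithName:@"CISourceOverCompositing"];
[overFilter setValue:renderImage forKey:kCIInputImageKey];          // logo on top
[overFilter setValue:frameImage forKey:kCIInputBackgroundImageKey]; // camera frame below
CIImage *composited = [overFilter outputImage];

[_ciContext render:composited
   toCVPixelBuffer:pixelBuffer
            bounds:[frameImage extent]
        colorSpace:cSpace];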

Horme answered 4/7, 2014 at 14:14

For developers who've encountered the same issue:

It appears that anything rendered on the GPU and written to the video ends up as a black hole in the video. Instead, I removed the code above, created a CGContextRef backed by the pixel buffer, just as you would when editing images, and drew into that context.

Code:

....
CVPixelBufferLockBaseAddress(pixelBuffer, 0);

// Wrap the pixel buffer's memory in a bitmap context so Core Graphics
// draws directly into the frame (BGRA, premultiplied alpha).
CGColorSpaceRef cSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                             CVPixelBufferGetWidth(pixelBuffer),
                                             CVPixelBufferGetHeight(pixelBuffer),
                                             8,
                                             CVPixelBufferGetBytesPerRow(pixelBuffer),
                                             cSpace,
                                             (CGBitmapInfo)
                                             kCGBitmapByteOrder32Little |
                                             kCGImageAlphaPremultipliedFirst);

CGRect renderBounds = ...
CGContextDrawImage(context, renderBounds, [overlayImage CGImage]);

CGContextRelease(context);
CGColorSpaceRelease(cSpace);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
....

And of course, the global EAGLContext and the CIContext are no longer needed.
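
For those asking for a fuller example: below is a minimal sketch of how this sits in the delegate callback. The overlayImage property, the draw rect, and the writer hand-off are placeholders, and it assumes the output delivers BGRA buffers:

// Minimal sketch (untested): watermarking each frame in place.
// Assumes AVCaptureVideoDataOutput is configured for kCVPixelFormatType_32BGRA.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
 didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection {

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    CGColorSpaceRef cSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(CVPixelBufferGetBaseAddress(pixelBuffer),
                                                 CVPixelBufferGetWidth(pixelBuffer),
                                                 CVPixelBufferGetHeight(pixelBuffer),
                                                 8,
                                                 CVPixelBufferGetBytesPerRow(pixelBuffer),
                                                 cSpace,
                                                 (CGBitmapInfo)
                                                 kCGBitmapByteOrder32Little |
                                                 kCGImageAlphaPremultipliedFirst);

    // The context wraps the pixel buffer's own memory, so drawing modifies the
    // frame in place; nothing has to be copied back into the sample buffer.
    // Note: Core Graphics' origin is bottom-left, so y is measured from the bottom.
    CGContextDrawImage(context, CGRectMake(20, 20, 120, 60),
                       [self.overlayImage CGImage]); // overlayImage: a cached UIImage

    CGContextRelease(context);
    CGColorSpaceRelease(cSpace);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    // Pass the (now watermarked) sampleBuffer on to the writer, e.g.:
    // if (self.assetWriterInput.isReadyForMoreMediaData) {
    //     [self.assetWriterInput appendSampleBuffer:sampleBuffer];
    // }
}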

Horme answered 20/8, 2014 at 12:45
By any chance, can you link to some sample code? I pretty much understand what is going on, but it's a bit over my head. I'm trying to watermark a video while recording, so this is exactly what I need. – Haddington
@Haddington This code goes in the captureOutput:didOutputSampleBuffer:fromConnection: method of AVCaptureVideoDataOutput's sampleBufferDelegate. Documentation: developer.apple.com/Library/ios/documentation/AVFoundation/… – Horme
It didn't work for me. I'll clean up the example I created and share it. – Haddington
How do you get this then back into the sample buffer? – Acceptance
