Encoding H.264 Compression Session with CGDisplayStream
I'm trying to create an H.264 Compression Session with the data from my screen. I've created a CGDisplayStreamRef instance like so:

displayStream = CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Call encoding session here
});

Below is how I currently have the encoding function setup:

- (void)encode:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Use a monotonically increasing frame number as the presentation timestamp.
    CMTime presentationTimeStamp = CMTimeMake(frameID++, 1000);

    VTEncodeInfoFlags flags;
    OSStatus statusCode = VTCompressionSessionEncodeFrame(EncodingSession,
                                                          imageBuffer,
                                                          presentationTimeStamp,
                                                          kCMTimeInvalid,
                                                          NULL, NULL, &flags);
    if (statusCode != noErr) {
        NSLog(@"H264: VTCompressionSessionEncodeFrame failed with %d", (int)statusCode);

        // Tear down the session on failure.
        VTCompressionSessionInvalidate(EncodingSession);
        CFRelease(EncodingSession);
        EncodingSession = NULL;
        return;
    }
    NSLog(@"H264: VTCompressionSessionEncodeFrame Success");
}
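
For context, this encode method assumes an existing VTCompressionSession stored in EncodingSession. A minimal sketch of how such a session might be created is below; the didCompressH264 output callback name and the width/height parameters are placeholders I've introduced, not part of the original question:

- (void)setupEncoderWithWidth:(int32_t)width height:(int32_t)height {
    // Create an H.264 compression session with the default encoder.
    OSStatus status = VTCompressionSessionCreate(NULL, width, height,
                                                 kCMVideoCodecType_H264,
                                                 NULL, NULL, NULL,
                                                 didCompressH264,        // hypothetical output callback
                                                 (__bridge void *)self,  // callback refcon
                                                 &EncodingSession);
    if (status != noErr) {
        NSLog(@"H264: VTCompressionSessionCreate failed with %d", (int)status);
        return;
    }
    // Real-time encoding is appropriate for live screen capture.
    VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
}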

I'm trying to understand how I can convert the data from my screen into a CMSampleBufferRef so I can properly call my encode function. So far, I haven't been able to determine whether this is possible, or whether it's even the right approach for what I'm trying to do. Does anyone have any suggestions?

EDIT: I've gotten my IOSurface converted to a CMBlockBuffer, but I haven't yet figured out how to convert that to a CMSampleBufferRef:

void *mem = IOSurfaceGetBaseAddress(frameSurface);
size_t bytesPerRow = IOSurfaceGetBytesPerRow(frameSurface);
size_t height = IOSurfaceGetHeight(frameSurface);
size_t totalBytes = bytesPerRow * height;

CMBlockBufferRef blockBuffer;

// Wrap the surface memory without copying it. The structure allocator must be
// a real allocator (NULL means the default), while kCFAllocatorNull as the
// block allocator tells Core Media not to free memory it doesn't own.
CMBlockBufferCreateWithMemoryBlock(NULL, mem, totalBytes, kCFAllocatorNull, NULL, 0, totalBytes, 0, &blockBuffer);
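
One caveat not in the original post: CPU access to an IOSurface's base address should be bracketed with IOSurfaceLock/IOSurfaceUnlock so the pixel data stays coherent while it's read. A minimal sketch:

uint32_t seed;
IOSurfaceLock(frameSurface, kIOSurfaceLockReadOnly, &seed);

// ... read from IOSurfaceGetBaseAddress(frameSurface) here ...

IOSurfaceUnlock(frameSurface, kIOSurfaceLockReadOnly, &seed);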

EDIT 2

Some more progress:

CMSampleBufferRef sampleBuffer;

// No format description or timing info is attached yet, so the timing and
// size entry counts are zero. The encoder will still need a format
// description to interpret the raw pixels.
OSStatus sampleStatus = CMSampleBufferCreate(
                             NULL, blockBuffer, TRUE, NULL, NULL,
                             NULL, 1, 0, NULL,
                             0, NULL, &sampleBuffer);

[self encode:sampleBuffer];
Ettaettari answered 10/3, 2017 at 14:57

I may be a bit late, but this could nevertheless be helpful for others:

CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Only complete frames carry a surface; skip idle and incomplete updates.
    if (status != kCGDisplayStreamFrameStatusFrameComplete) {
        return;
    }

    // The created pixel buffer retains the surface object.
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);

    // Create the video-type-specific description for the pixel buffer.
    CMVideoFormatDescriptionRef videoFormatDescription;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoFormatDescription);

    // All the necessary parts for creating a `CMSampleBuffer` are ready.
    // The timing info must be initialized; a real presentation timestamp
    // can be derived from `displayTime` (see the note below).
    CMSampleBufferRef sampleBuffer;
    CMSampleTimingInfo timingInfo = { kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid };
    CMSampleBufferCreateReadyWithImageBuffer(NULL, pixelBuffer, videoFormatDescription, &timingInfo, &sampleBuffer);

    // Do the stuff

    // Release the resources to let the frame surface be reused in the queue.
    // `kCGDisplayStreamQueueDepth` is responsible for the size of the queue.
    CFRelease(sampleBuffer);
    CFRelease(videoFormatDescription);
    CFRelease(pixelBuffer);
});
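
Note that the CMSampleTimingInfo above is only stubbed out with kCMTimeInvalid. CGDisplayStream reports displayTime in mach_absolute_time() units, so a real presentation timestamp can be derived from it. A sketch of that conversion, assuming a nanosecond timescale (a choice, not a requirement):

#include <mach/mach_time.h>

static CMTime CMTimeFromDisplayTime(uint64_t displayTime) {
    // Query the timebase once to convert mach time units to nanoseconds.
    static mach_timebase_info_data_t timebase;
    if (timebase.denom == 0) {
        mach_timebase_info(&timebase);
    }
    uint64_t nanos = displayTime * timebase.numer / timebase.denom;

    // Express the timestamp on a nanosecond timescale.
    return CMTimeMake((int64_t)nanos, 1000000000);
}

With that in place, timingInfo.presentationTimeStamp = CMTimeFromDisplayTime(displayTime) can be set before creating the sample buffer.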
Cougar answered 17/5, 2017 at 15:00
