Playing video and retrieving pixel buffer with iOS 5

What I would like to do is play a video (either from a local file or from a remote URL) together with its audio track, and retrieve the pixel buffer of each frame of the video to draw it to an OpenGL texture.

Here is the code I use on iOS 6 (it works fine):

Init the video

- (void) readMovie:(NSURL *)url {
    NSLog(@"Playing video %@", url);

    AVURLAsset * asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:
     ^{
         dispatch_async(dispatch_get_main_queue(),
                        ^{
                            NSError* error = nil;
                            AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
                            if (status == AVKeyValueStatusLoaded) {
                                NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
                                AVPlayerItemVideoOutput* output = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
                                AVPlayerItem* playerItem = [AVPlayerItem playerItemWithAsset:asset];
                                [playerItem addOutput:output];
                                AVPlayer* player = [AVPlayer playerWithPlayerItem:playerItem];

                                [self setPlayer:player];
                                [self setPlayerItem:playerItem];
                                [self setOutput:output];

                                // Observe only this player item, not every item in the app
                                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(bufferingVideo:) name:AVPlayerItemPlaybackStalledNotification object:playerItem];
                                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoEnded:) name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
                                [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(videoFailed:) name:AVPlayerItemFailedToPlayToEndTimeNotification object:playerItem];

                                [[self player] addObserver:self forKeyPath:@"rate" options:0 context:nil];
                                [[self player] addObserver:self forKeyPath:@"status" options:0 context:NULL];

                                [player play];
                            } else {
                                NSLog(@"%@ Failed to load the tracks.", self);
                            }
                        });
     }];
}

Read the video buffer (in an update function called each frame)

- (void) readNextMovieFrame {

    CMTime outputItemTime = [[self playerItem] currentTime];

    float interval = [self maxTimeLoaded];

    CMTime t = [[self playerItem] currentTime];
    CMTime d = [[self playerItem] duration];
    NSLog(@"Video : %f/%f (loaded : %f) - speed : %f", (float)t.value / (float)t.timescale, (float)d.value / (float)d.timescale, interval, [self player].rate);

    [videoBar updateProgress:(interval / CMTimeGetSeconds(d))];
    [videoBar updateSlider:(CMTimeGetSeconds(t) / CMTimeGetSeconds(d))];

    if ([[self output] hasNewPixelBufferForItemTime:outputItemTime]) {
        CVPixelBufferRef buffer = [[self output] copyPixelBufferForItemTime:outputItemTime itemTimeForDisplay:nil];

        // Lock the image buffer
        CVPixelBufferLockBaseAddress(buffer, 0);

        // Get information about the image (note: this assumes the pixel buffer
        // has no row padding, i.e. CVPixelBufferGetBytesPerRow(buffer) == width * 4)
        uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(buffer);
        size_t width = CVPixelBufferGetWidth(buffer);
        size_t height = CVPixelBufferGetHeight(buffer);

        // Fill the texture
        glBindTexture(GL_TEXTURE_2D, texture);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_BGRA_EXT, GL_UNSIGNED_BYTE, baseAddress);

        // Unlock and release the image buffer
        CVPixelBufferUnlockBaseAddress(buffer, 0);
        CVBufferRelease(buffer);

    }
}

This code works fine on iOS 6, and I would like it to run on iOS 5 as well, but AVPlayerItemVideoOutput is not available on iOS 5. I can still play the video, but I don't know how to retrieve the pixel buffer for each frame.

Do you have an idea of what I could use instead of AVPlayerItemVideoOutput to retrieve the pixel buffer of each frame of the video? (It must work with both local and remote videos, and I also want to play the audio track.)
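For reference, here is a minimal sketch of one alternative that exists on iOS 5, AVAssetReader with an AVAssetReaderTrackOutput. Important caveats: AVAssetReader only works with file-based (local) assets, not remote streams; it does not play audio by itself (audio would have to be played separately and kept in sync); and it delivers frames as fast as you pull them, so playback pacing must be driven from the sample buffers' presentation timestamps. The `setReader:`/`setTrackOutput:` accessors below are hypothetical properties, not part of the original code.

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch only: reads decoded BGRA frames from a LOCAL asset with AVAssetReader.
- (void)startReading:(AVURLAsset *)asset {
    NSError *error = nil;
    AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];

    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    NSDictionary *settings = @{ (id)kCVPixelBufferPixelFormatTypeKey :
                                    [NSNumber numberWithInt:kCVPixelFormatType_32BGRA] };
    AVAssetReaderTrackOutput *trackOutput =
        [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:settings];
    [reader addOutput:trackOutput];
    [reader startReading];

    [self setReader:reader];           // hypothetical property
    [self setTrackOutput:trackOutput]; // hypothetical property
}

// Called from the per-frame update; pacing against the clock is omitted here.
- (void)readNextFrame {
    CMSampleBufferRef sampleBuffer = [[self trackOutput] copyNextSampleBuffer];
    if (sampleBuffer == NULL) {
        return; // end of file, or [[self reader] status] == AVAssetReaderStatusFailed
    }

    // When this frame should be shown, for syncing with audio / wall clock
    CMTime presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    (void)presentationTime;

    CVImageBufferRef buffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(buffer, 0);
    // ... upload to the OpenGL texture exactly as in readNextMovieFrame ...
    CVPixelBufferUnlockBaseAddress(buffer, 0);
    CFRelease(sampleBuffer);
}
```

Because of the local-files-only restriction, a remote video would still need a different path (for example, downloading it first), so this is a partial workaround rather than a full replacement for AVPlayerItemVideoOutput.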

Thank you very much for your help!

Autumnautumnal answered 1/6, 2013 at 13:14 Comment(1)
Did you find a solution? I have a similar problem; my code is almost the same, but I want to convert the CVPixelBufferRef to a UIImage and it's not working. – Arterio

© 2022 - 2024 — McMap. All rights reserved.