iOS: AVPlayer - getting a snapshot of the current frame of a video

I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.

I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.

My video is an m3u8 file hosted on the internet; it plays normally in an AVPlayerLayer without any problems.

What have I tried:

  1. AVAssetImageGenerator. It does not work: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos.
  2. Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then realized that it does not render these kinds of "special" layers. Then I found a method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck: I still got a UI snapshot with a blank black area where the video is shown (see the sketch after this list).
  3. AVPlayerItemVideoOutput. I added a video output to my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
  4. AVAssetReader. I was thinking of trying it, but decided not to lose time after finding a related question here.
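
For reference, attempt 2 roughly looks like this in Swift; a minimal sketch, assuming `playerView` is the view backed by my AVPlayerLayer (the name is just illustrative):

import UIKit

// Snapshot a view hierarchy. For AVPlayer content the video area comes back
// as a black rectangle, which is exactly the problem described above.
func snapshotPlayerView(_ playerView: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(playerView.bounds.size, false, 0)
    defer { UIGraphicsEndImageContext() }
    _ = playerView.drawHierarchy(in: playerView.bounds, afterScreenUpdates: true)
    return UIGraphicsGetImageFromCurrentImageContext()
}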

So is there really no way to get a snapshot of something that I can already see on the screen? I can't believe it.

Tasimeter answered 17/9, 2016 at 20:33 Comment(0)

AVPlayerItemVideoOutput works fine for me with an m3u8. Maybe it's because I don't consult hasNewPixelBufferForItemTime: and simply call copyPixelBufferForItemTime:? This code produces a CVPixelBuffer instead of a UIImage, but there are answers that describe how to convert one into the other.

This answer is mostly cribbed from here.

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()

@property (nonatomic) AVPlayer *player;
@property (nonatomic) AVPlayerItem *playerItem;
@property (nonatomic) AVPlayerItemVideoOutput *playerOutput;

@end

@implementation ViewController
- (void)setupPlayerWithLoadedAsset:(AVAsset *)asset {
    NSDictionary* settings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    self.playerOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:settings];
    self.playerItem = [AVPlayerItem playerItemWithAsset:asset];
    [self.playerItem addOutput:self.playerOutput];
    self.player = [AVPlayer playerWithPlayerItem:self.playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.frame;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}

- (IBAction)grabFrame {
    CVPixelBufferRef buffer = [self.playerOutput copyPixelBufferForItemTime:[self.playerItem currentTime] itemTimeForDisplay:nil];
    NSLog(@"The image: %@", buffer);
    // The "copy" in copyPixelBufferForItemTime means we own the buffer and must release it.
    if (buffer) {
        CVBufferRelease(buffer);
    }
}

- (void)viewDidLoad {
    [super viewDidLoad];


    NSURL *someUrl = [NSURL URLWithString:@"http://qthttp.apple.com.edgesuite.net/1010qwoeiuryfg/sl.m3u8"];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:someUrl options:nil];

    [asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^{

        NSError* error = nil;
        AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&error];
        if (status == AVKeyValueStatusLoaded)
        {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self setupPlayerWithLoadedAsset:asset];
            });
        }
        else
        {
            NSLog(@"%@ Failed to load the tracks.", self);
        }
    }];
}

@end
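
This code deliberately stops at the CVPixelBuffer. As a minimal sketch of the missing conversion step (not part of the original answer), here is one way to turn that buffer into a UIImage in Swift via Core Image; the convenience initializer is an assumption, not an existing UIKit API:

import UIKit
import CoreImage
import CoreVideo

extension UIImage {
    // Hypothetical convenience initializer: wrap the pixel buffer in a CIImage,
    // render it to a CGImage, and build a UIImage from that.
    convenience init?(pixelBuffer: CVPixelBuffer) {
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
            return nil
        }
        self.init(cgImage: cgImage)
    }
}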
Apical answered 18/9, 2016 at 0:39 Comment(5)
I will try it now and let you know. A short question: should I setup the AVPlayerItemVideoOutput object right from the beginning? My code is playing a video without the output added and then, whenever I need a snapshot, I quickly create a AVPlayerItemVideoOutput object, add it to the player item and try to read the pixel buffer. I also tried adding output a bit earlier - whenever my special snapshot gesture was starting the touches but not yet recognized. Is this important?Tasimeter
I think you must set up the AVPlayerItemVideoOutput from the beginning, probably before you start playback.Apical
Thanks for your solution, I just checked and it works! The trick was to add AVPlayerItemVideoOutput before I start to play, as you said. Seems a bit inefficient to have a video output added the whole time just for one screenshot somewhere in the future, which in most of the cases will not even be taken, but at least it works!Tasimeter
You're welcome. I guess you're right - attaching an ARGB AVPlayerItemVideoOutput to what may very well be a YUV flow could be expensive. I'd never thought of that.Apical
Did this solution work for you with FairPlay protected HLS? I've tried copyPixelBufferForItemTime and it works great with unprotected streams, but once you use FairPlay it returns NULL. #42840331Melcher

AVAssetImageGenerator is the best way to snapshot a video; this method asynchronously returns a UIImage:

import AVFoundation

// ...

var player:AVPlayer? = // ...

func screenshot(handler:@escaping ((UIImage)->Void)) {
    guard let player = player,
        let asset = player.currentItem?.asset else {
            return
    }
    
    let imageGenerator = AVAssetImageGenerator(asset: asset)
    imageGenerator.appliesPreferredTrackTransform = true
    let times = [NSValue(time: player.currentTime())]
    
    imageGenerator.generateCGImagesAsynchronously(forTimes: times) { _, image, _, _, _ in
        if let img = image {
            handler(UIImage(cgImage: img))
        }
    }
}

(It's Swift 4.2)
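
For example, it could be called like this; a minimal usage sketch, assuming the surrounding view controller has an `imageView` outlet (an assumed name) and that `player` is already playing:

screenshot { image in
    // The completion handler runs off the main queue, so hop back before touching UIKit.
    DispatchQueue.main.async {
        self.imageView.image = image   // `imageView` is an assumed outlet
    }
}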

Hillell answered 11/11, 2016 at 15:25 Comment(4)
@AxelGuilmin This captures only the AVPlayer. What if I want to take a screenshot of both the AVPlayer and a UIView?Helaine
I don't think my answer would be the right approach to capture a UIView. I did not test it, but this answer seems better: https://mcmap.net/q/99108/-how-to-capture-uiview-to-uiimage-without-loss-of-quality-on-retina-displayHillell
@AxelGuilmin Thank you for your reply. This is my problem: stackoverflow.com/questions/42085479/…Helaine
@Anessence were you able to find an answer that captures from livestreams?Highsmith

Swift version of Rhythmic Fistman's solution, which still works as expected in Xcode 15 and iOS 17:

private var playerOutput: AVPlayerItemVideoOutput?
(...)
self.playerOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
(...)
self.playerItem?.add(self.playerOutput!)
(...)
if let playerItem = self?.playerItem,
   let imageBuffer = self?.playerOutput?.copyPixelBuffer(forItemTime: playerItem.currentTime(), itemTimeForDisplay: nil),
   let image = UIImage(pixelBuffer: imageBuffer) {
  // do what you want with the `image`
} else {
  print("Failed to grab frame")
}
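
Note that UIImage(pixelBuffer:) above is not a UIKit initializer; the snippet assumes a small convenience extension that turns the CVPixelBuffer into a UIImage, for example one built on CIImage like the sketch under the first answer.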
Masonry answered 4/11, 2023 at 20:47 Comment(0)
