I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file located on the internet; it plays normally in the AVPlayerLayer without any problems.
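For context, this is roughly how the playback is set up — a simplified sketch rather than my exact code; the class name PlayerViewController and the URL are placeholders, and the player property is just how I keep a reference around:

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface PlayerViewController : UIViewController
@property (nonatomic, strong) AVPlayer *player;
@end

@implementation PlayerViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // "streamURL" is a placeholder for my real m3u8 address.
    NSURL *streamURL = [NSURL URLWithString:@"https://example.com/stream.m3u8"];
    AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:streamURL];
    self.player = [AVPlayer playerWithPlayerItem:playerItem];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];   // this part works fine
}

@end
```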
What I have tried:
- AVAssetImageGenerator. It is not working: the method copyCGImageAtTime:actualTime:error: returns a null image ref. According to the answer here, AVAssetImageGenerator doesn't work for streaming videos. (A rough sketch of this and the next two attempts follows the list.)
- Taking a snapshot of the player view. I first tried renderInContext: on the AVPlayerLayer, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, drawViewHierarchyInRect:afterScreenUpdates:, which should be able to render the special layers as well, but no luck: I still got a UI snapshot with a blank black area where the video is shown.
- AVPlayerItemVideoOutput. I added a video output for my AVPlayerItem; however, whenever I call hasNewPixelBufferForItemTime: it returns NO. I guess the problem is again the streaming video, and I am not alone with this problem.
- AVAssetReader. I was thinking of trying it, but decided not to lose time after finding a related question here.
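For reference, this is roughly what the first three attempts look like in code — a simplified sketch, not my exact implementation; the helper names are made up, and in the third one the AVPlayerItemVideoOutput is assumed to have been created with 32BGRA pixel buffer attributes and already added to the player item:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Attempt 1: AVAssetImageGenerator — imageRef comes back NULL for the m3u8 stream.
static UIImage *FrameViaImageGenerator(AVAsset *asset, CMTime time) {
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;
    NSError *error = nil;
    CMTime actualTime;
    CGImageRef imageRef = [generator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (imageRef == NULL) {
        return nil;   // this is what I get for the stream
    }
    UIImage *frame = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return frame;
}

// Attempt 2: view snapshot — the video area stays black in the result.
static UIImage *FrameViaViewSnapshot(UIView *playerView) {
    UIGraphicsBeginImageContextWithOptions(playerView.bounds.size, NO, 0.0);
    [playerView drawViewHierarchyInRect:playerView.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}

// Attempt 3: AVPlayerItemVideoOutput — hasNewPixelBufferForItemTime: keeps returning NO.
static UIImage *FrameViaVideoOutput(AVPlayerItem *playerItem, AVPlayerItemVideoOutput *output) {
    CMTime time = [playerItem currentTime];
    if (![output hasNewPixelBufferForItemTime:time]) {
        return nil;   // this is where I always end up with the HLS stream
    }
    CVPixelBufferRef buffer = [output copyPixelBufferForItemTime:time itemTimeForDisplay:NULL];
    if (buffer == NULL) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(buffer),
                             CVPixelBufferGetHeight(buffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    CVPixelBufferRelease(buffer);
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return frame;
}
```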
So isn't there any way to get a snapshot of something that I am already seeing right now on the screen? I can't believe this.
Comment: Do you mean I should add the AVPlayerItemVideoOutput object right from the beginning? My code plays the video without the output added, and then, whenever I need a snapshot, I quickly create an AVPlayerItemVideoOutput object, add it to the player item and try to read the pixel buffer. I also tried adding the output a bit earlier, whenever my snapshot gesture was starting the touches but was not yet recognized. Is this important? – Tasimeter