I have spent the whole day going through a lot of SO answers, Apple references, documentation, etc., but with no success.
I want a simple thing: I am playing a video using AVPlayer and I want to pause it and get the current frame as a UIImage. That's it.
My video is an m3u8 file (an HLS stream) located on the internet; it plays normally in the AVPlayerLayer without any problems.
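For context, my playback setup is essentially the following (simplified; `self.player`, `self.playerView`, and the URL are placeholders for my actual code):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Roughly my setup; the URL stands in for my real m3u8 stream.
- (void)setUpPlayer {
    NSURL *url = [NSURL URLWithString:@"https://example.com/stream/playlist.m3u8"];
    self.player = [AVPlayer playerWithURL:url];
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.playerView.bounds;
    [self.playerView.layer addSublayer:playerLayer];
    [self.player play];
}
```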
What I have tried:
- `AVAssetImageGenerator`. It is not working: the method `copyCGImageAtTime:actualTime:error:` returns a NULL image ref. According to the answer here, `AVAssetImageGenerator` doesn't work for streaming videos. (See the first sketch below.)
- Taking a snapshot of the player view. I first tried `renderInContext:` on the `AVPlayerLayer`, but then I realized that it does not render this kind of "special" layer. Then I found a new method introduced in iOS 7, `drawViewHierarchyInRect:afterScreenUpdates:`, which should be able to render the special layers too, but no luck: I still got the UI snapshot with a blank black area where the video is shown. (See the second sketch below.)
- `AVPlayerItemVideoOutput`. I added a video output to my `AVPlayerItem`; however, whenever I call `hasNewPixelBufferForItemTime:` it returns `NO`. I guess the problem is again the streaming video, and I am not alone with this problem. (See the third sketch below.)
- `AVAssetReader`. I was thinking of trying it, but decided not to lose time after finding a related question here.
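Here is roughly what my `AVAssetImageGenerator` attempt looked like (simplified; `self.player` is my AVPlayer):

```objc
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

// Sketch of attempt 1: grab a frame with AVAssetImageGenerator.
- (UIImage *)frameWithImageGenerator {
    AVAsset *asset = self.player.currentItem.asset;
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;

    NSError *error = nil;
    CMTime actualTime;
    CGImageRef imageRef = [generator copyCGImageAtTime:self.player.currentTime
                                            actualTime:&actualTime
                                                 error:&error];
    if (!imageRef) {
        // This is what happens with my m3u8 stream: NULL image ref, non-nil error.
        NSLog(@"Image generation failed: %@", error);
        return nil;
    }
    UIImage *frame = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return frame;
}
```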
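And the snapshot attempt (simplified; `self.playerView` is the UIView hosting my `AVPlayerLayer`):

```objc
#import <UIKit/UIKit.h>

// Sketch of attempt 2: snapshot the view hierarchy containing the player layer.
- (UIImage *)snapshotOfPlayerView {
    UIGraphicsBeginImageContextWithOptions(self.playerView.bounds.size, NO, 0.0);
    // The iOS 7 API that is supposed to capture "special" layers as well.
    [self.playerView drawViewHierarchyInRect:self.playerView.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // The result contains the rest of the UI, but the video area is black.
    return snapshot;
}
```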
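Finally, the `AVPlayerItemVideoOutput` attempt (simplified; `self.videoOutput` is the output I attached to the item before playback started, and the CIImage/CIContext conversion is just how I would turn the pixel buffer into a UIImage if I ever got one):

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Sketch of attempt 3: pull the current frame from an AVPlayerItemVideoOutput.
// Setup done earlier, before playback:
//   self.videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:
//       @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) }];
//   [self.player.currentItem addOutput:self.videoOutput];
- (UIImage *)frameFromVideoOutput {
    CMTime time = self.player.currentItem.currentTime;
    if (![self.videoOutput hasNewPixelBufferForItemTime:time]) {
        // This is where I get stuck: for my stream this always returns NO.
        return nil;
    }
    CVPixelBufferRef buffer = [self.videoOutput copyPixelBufferForItemTime:time
                                                        itemTimeForDisplay:NULL];
    if (!buffer) {
        return nil;
    }
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGRect rect = CGRectMake(0, 0,
                             CVPixelBufferGetWidth(buffer),
                             CVPixelBufferGetHeight(buffer));
    CGImageRef cgImage = [context createCGImage:ciImage fromRect:rect];
    CVPixelBufferRelease(buffer);
    if (!cgImage) {
        return nil;
    }
    UIImage *frame = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return frame;
}
```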
So isn't there any way to get a snapshot of something that I am already seeing on the screen right now? I can't believe it.