How can I perform hardware-accelerated H.264 encoding and decoding for streaming? [closed]
I am able to get the RGBA frame data from the camera, and I want to encode it in H.264 format. I've used FFmpeg to encode and decode H.264 video, but at a frame size of 640x480 it's too slow for my needs.

I'd like to use hardware acceleration to speed up the encoding and decoding, so how would I do that?

Also, I need to be able to stream the encoded video across the network and decode it on the other end. How can this be done?

Codon answered 25/4, 2012 at 9:56 Comment(5)
There is a library that allows you to encode raw frames to H.264: foxitsolutions.com/iphone_h264_sdk.html. This library uses the hardware encoder and gives you separate H.264 frames. Please see this question and answer: #7980342Applaud
The question's title is "..video decoding", but the actual question is about encoding, right? Maybe you should edit the title?Applaud
Maybe a duplicate: #25197669Flatboat
The fact that this question was closed is... insaneSausauce
For anyone googling here a decade later: (1) to access the hardware encoder/decoder on an iPhone, you use VTDecompressionSession, and the end result is a CVPixelBuffer. (2) The difficult part is building up the needed format before sending it to VTDecompressionSession. (3) To do that, see all of WWDC session 513. (4) It seems that these days it's possible to use FFmpeg - actually build it as a library to include in the Xcode project - which then gives a lot of easy usage of VTDecompressionSession, saving you the vast amount of work of discombobulating streams.Sausauce
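
Following up on that last comment, here is a minimal Swift sketch of the format-building step it calls the difficult part. This is an assumption about a typical setup, not code from WWDC session 513: the sps and pps arrays are raw NAL unit payloads you have already parsed out of the stream (no start codes), and the function name is a placeholder.

    import VideoToolbox

    // Build the CMVideoFormatDescription that VTDecompressionSession needs
    // from the stream's H.264 SPS/PPS parameter sets, then create a session
    // that hands back CVPixelBuffers.
    func makeDecoderSession(sps: [UInt8], pps: [UInt8]) -> VTDecompressionSession? {
        var format: CMVideoFormatDescription?
        let status = sps.withUnsafeBufferPointer { spsBuf -> OSStatus in
            pps.withUnsafeBufferPointer { ppsBuf -> OSStatus in
                let pointers: [UnsafePointer<UInt8>] = [spsBuf.baseAddress!, ppsBuf.baseAddress!]
                return CMVideoFormatDescriptionCreateFromH264ParameterSets(
                    allocator: kCFAllocatorDefault,
                    parameterSetCount: pointers.count,
                    parameterSetPointers: pointers,
                    parameterSetSizes: [sps.count, pps.count],
                    nalUnitHeaderLength: 4, // AVCC 4-byte length prefixes, not Annex B start codes
                    formatDescriptionOut: &format)
            }
        }
        guard status == noErr, let format = format else { return nil }

        // Ask for BGRA output so the decoded CVPixelBuffers are easy to display.
        let attributes = [kCVPixelBufferPixelFormatTypeKey as String:
                              kCVPixelFormatType_32BGRA] as CFDictionary
        var session: VTDecompressionSession?
        VTDecompressionSessionCreate(allocator: kCFAllocatorDefault,
                                     formatDescription: format,
                                     decoderSpecification: nil,
                                     imageBufferAttributes: attributes,
                                     outputCallback: nil, // use the outputHandler variant of DecodeFrame
                                     decompressionSessionOut: &session)
        return session
    }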

If you want to do hardware-accelerated video encoding and decoding of H.264 video on iOS, the only way to go is AVFoundation. Don't use third-party libraries for the encoding or decoding, because they are all currently CPU-bound, and are much slower than what you get from AVFoundation. The one reason to use a third-party encoder or decoder would be if you are working with a format that iOS doesn't support by default.

For hardware-accelerated decoding, you'll want to use an AVAssetReader instance (or one of the player classes for pure playback). With an AVAssetReader, I regularly get 2X or higher playback speeds for reading H.264-encoded video, so the iOS devices use some pretty good hardware acceleration for that.
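
To make that concrete, the reading path looks roughly like the following. This is a sketch under assumptions: the URL points to a local H.264 movie file, and the frame handling is a stub.

    import AVFoundation

    // Decode every frame of a local H.264 movie; the track is decoded in
    // hardware and handed back as BGRA pixel buffers.
    func decodeAllFrames(from url: URL) throws {
        let asset = AVURLAsset(url: url)
        guard let track = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        let output = AVAssetReaderTrackOutput(
            track: track,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String:
                                 kCVPixelFormatType_32BGRA])
        reader.add(output)
        guard reader.startReading() else { return }

        // copyNextSampleBuffer() returns frames as fast as the decoder can
        // produce them, which is where the 2X-or-better speeds come from.
        while let sample = output.copyNextSampleBuffer() {
            if let pixelBuffer = CMSampleBufferGetImageBuffer(sample) {
                _ = pixelBuffer // process the decoded BGRA frame here
            }
        }
    }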

Similarly, for accelerated encoding, you'll use an AVAssetWriter. There are some tricks to getting AVAssetWriter to encode at the best speed (feeding in BGRA frames, using a pixel buffer pool, using the iOS 5.0 texture caches if reading from OpenGL ES), which I describe in detail within this answer.
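
As a rough illustration of those tricks, a writer setup might look like this; the output URL and dimensions are placeholders, and the full details of the fastest path are in the answer linked above.

    import AVFoundation

    // Minimal AVAssetWriter setup for hardware H.264 encoding, taking BGRA
    // frames through a pixel buffer adaptor with a reusable buffer pool.
    func makeH264Writer(outputURL: URL, width: Int, height: Int) throws
            -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: width,
            AVVideoHeightKey: height
        ])
        input.expectsMediaDataInRealTime = true // live camera frames

        // Feeding BGRA and reusing pooled buffers keeps the input on the
        // encoder's fast path.
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
                kCVPixelBufferWidthKey as String: width,
                kCVPixelBufferHeightKey as String: height
            ])
        writer.add(input)
        return (writer, adaptor)
    }

After writer.startWriting() and writer.startSession(atSourceTime:), each frame goes in through adaptor.append(_:withPresentationTime:), ideally using buffers vended by adaptor.pixelBufferPool.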

If you want to see some code that uses the fastest paths I've found for accelerated encoding and decoding, you can look at my open source GPUImage framework, which, unlike the one linked by Anastasia, is totally free to use.

Mathur answered 26/4, 2012 at 14:57 Comment(7)
Thank you for your source code; I had a rough look at it, but I did not find the functions or parameters for getting at the encoded or decoded data. I need to get the encoded data and send it over the network, then decode it on the terminal equipment at the other end. If you already have an interface or function for this, I did not find it; I hope to get your help. Thank you very much.Codon
Your lib is great, but it is for applying filters and processing raw video; it doesn't help with the actual question, which is hardware encoding. If the user wants to send compressed data over the network, then neither of the above APIs (asset reader and asset writer) will help.Dextrocular
@Codon - That might be a detail you want to add to the question, because you did not specify that you wanted access to the raw input and output streams for transfer across the network. My answer applies to encoding and decoding of files, where AVAssetReader and AVAssetWriter are clearly the way to go. Still, you'll use AVFoundation to do your encoding here, as described by mohsenr: https://mcmap.net/q/138053/-streaming-video-from-an-iphone .Mathur
@Raghu - The hardware encoding and decoding are handled by AVFoundation and classes like AVAssetWriter and AVAssetReader. The need to stream the resulting video over the network was originally unspecified by king, so I assumed they were asking about encoding and decoding to local H.264 files. For streaming, you'll need to capture the recorded file output and send chunks over the network to be reconstituted at the other end (see the sketch after these comments). This is how Steve did it here: https://mcmap.net/q/138053/-streaming-video-from-an-iphoneMathur
@Brad Steve's way won't really work for real-time live streaming. The commercial guys are doing something different. I am not sure what it is, but it is working in real time.Dextrocular
No, after a lot of research I actually think this is exactly how the commercial guys do it.Junitajunius
Say, @BradLarson! Is AVAssetReader suitable for H.264 arriving via a stream? Or is it really only for reading files? Thanks!Sausauce
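
To illustrate the chunking approach Brad describes a few comments up, here is a rough sketch under big assumptions: sendChunk is a placeholder for your own network send, and note that a plain QuickTime movie's header is only finalized when recording stops, so the receiver needs the same kind of workarounds as the linked answer before it can decode.

    import Foundation

    // Poll the movie file that AVAssetWriter is appending to, and forward
    // any new bytes; the receiver reassembles them in order.
    func streamNewBytes(from fileURL: URL,
                        sendChunk: @escaping (Data) -> Void) throws -> Timer {
        let handle = try FileHandle(forReadingFrom: fileURL)
        var offset: UInt64 = 0
        return Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { _ in
            handle.seek(toFileOffset: offset)
            let chunk = handle.readDataToEndOfFile()
            guard !chunk.isEmpty else { return }
            offset += UInt64(chunk.count)
            sendChunk(chunk)
        }
    }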
