How to use hardware accelerated video decoding on Android?

I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol.

Searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android:

  1. Use ffmpeg with libstagefright (overview of libstagefright), or use libstagefright in the OS directly, like here.
  2. Use OpenMAX on a specific hardware platform, like here for Samsung devices and here for the Qualcomm Snapdragon series.
  3. Some people mentioned PVPlayer.

Some people "say" libstagefright is the only way while Qualcomm guys have made success obviously.

Currently I am not sure which of these approaches would work; I am a little confused. If they all work, I would certainly prefer a hardware-independent method.

I have tested the H/W acceleration of a few video players on a Galaxy Tab 7.7 (Android 3.2, Exynos): VLC, MoboPlayer, RockPlayer, and VPlayer. RockPlayer and MoboPlayer work fine, VLC doesn't work, and VPlayer seems to have a rendering bug that hurts its performance.

Anyway, I did an 'operation' on RockPlayer and deleted all of its .so libs in data\data\com.redirecting\rockplayer: software decoding now crashes, while hw decoding still works fine! I wonder how they did that. It suggests to me that hw acceleration could be independent of the hardware platform.

Can someone nail down this problem, or provide any reference with additional information or better details?

Colunga answered 4/7, 2012 at 2:51 Comment(6)
I am a bit confused! Do you want direct access (without the Android media APIs) to the H/W accelerated decoder to decode your bit-streams? Because all modern phone SoCs decode H.264 using H/W acceleration.Bently
@OakBytes, I want to implement H/W accelerated decoding however it's done. Right now I only know how to decode the stream with ffmpeg software decoding. By H/W acceleration I mean a level of performance around 1080p@30fps, while software decoding is much weaker. I have avoided describing software decoding as "using the CPU" because the H/W acceleration module is also part of the CPU. What do you mean by all modern phones already using H/W acceleration?Colunga
When the Gallery media player is used to play H.264 clips, all recent Android phones use a H/W accelerated H.264 decoder. I guess you plan to use the H.264 decoder to decode a raw H.264 bitstream and get the decoded output, rather than play a file containing H.264 video and some audio.Bently
@OakBytes You are right, that's exactly what I want: just raw bitstreams, no mkv or mp4 containers. Sorry I didn't make it clearer. I want to invoke H/W decoding at the NAL or frame level on a raw bitstream, rather than setting up a media player for files.Colunga
@Holyglenn - have you succeeded with your project? Maybe you have found some new information on the subject?Datolite
libstagefright is dead in ffmpeg #9833003Chorizo

To answer the above question, let me introduce a few concepts related to Android.

OpenMAX Android uses OpenMAX as its codec interface. Hence all native codecs (hardware accelerated or otherwise) provide an OpenMAX interface. This interface is used by Stagefright (the player framework) to decode media using the codec.

NDK Android allows Java applications to interact with underlying C/C++ native libraries using the NDK. This requires using JNI (Java Native Interface).
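
For illustration, here is a minimal sketch of what the Java side of such a JNI bridge could look like. The library name ("nativedecoder") and the native method are hypothetical placeholders, not from any real project; the actual decoding work would live in the C/C++ implementation behind them.

    // Hypothetical Java-side JNI bridge; the library name and method signature
    // are illustrative placeholders only.
    public class NativeDecoderBridge {

        static {
            // Loads libnativedecoder.so, which you would build with the NDK and
            // implement in C/C++ (e.g. against the platform's native decoder).
            System.loadLibrary("nativedecoder");
        }

        // Implemented in native code via JNI; takes one H.264 NAL unit and
        // returns a decoded frame (or null until the decoder produces output).
        public native byte[] decodeNalUnit(byte[] nalUnit);
    }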

Now, coming to your question: how to tap into the native decoder to decode a raw video bitstream?

In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code to interact directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and how to map the OMX component to the application using the NDK.

In 4.1 (Jelly Bean), Android provides access to hardware accelerated decoders at the application level through Java APIs (the MediaCodec class). More details about the new APIs: http://developer.android.com/about/versions/android-4.1.html#Multimedia
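
A minimal sketch of how feeding raw H.264 NAL units into MediaCodec could look (the resolution, the NalSource interface and the timestamps are placeholders for your own protocol; error handling is omitted):

    import android.media.MediaCodec;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    // Sketch: push raw H.264 NAL units into a (typically hardware) decoder via
    // MediaCodec (API 16+) and render straight to a Surface.
    // Note: the decoder needs SPS/PPS before the first frame, either in-band in
    // the stream or via "csd-0"/"csd-1" buffers on the MediaFormat.
    public class RawH264Decoder {
        public void decode(Surface outputSurface, NalSource source) throws Exception {
            MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1920, 1080); // placeholder size
            MediaCodec codec = MediaCodec.createDecoderByType("video/avc");
            codec.configure(format, outputSurface, null, 0);  // decode onto the Surface
            codec.start();

            ByteBuffer[] inputBuffers = codec.getInputBuffers();
            MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
            long ptsUs = 0;

            byte[] nal;
            while ((nal = source.getNextNalUnit()) != null) {   // your protocol supplies NALs
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    inputBuffers[inIndex].clear();
                    inputBuffers[inIndex].put(nal);
                    codec.queueInputBuffer(inIndex, 0, nal.length, ptsUs, 0);
                    ptsUs += 33333;  // ~30 fps timestamps
                }
                int outIndex = codec.dequeueOutputBuffer(info, 10000);
                if (outIndex >= 0) {
                    codec.releaseOutputBuffer(outIndex, true);  // true = render to the Surface
                }
            }
            codec.stop();
            codec.release();
        }

        // Placeholder for wherever your custom protocol delivers NAL units.
        public interface NalSource {
            byte[] getNextNalUnit();
        }
    }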

Bently answered 7/7, 2012 at 22:19 Comment(5)
Thank you for the clarification of the concepts and for the answer. I have looked through the references, only to find it all too true that I must dig into the OS framework. This would definitely work.Colunga
However, as the operation on RockPlayer suggests (I deleted all the .so libraries and hardware decoding still works while software decoding fails), there could be some simpler way to do this on Android 4.0 and below. As for my raw bitstream decoding, I might have to figure out the whole OMX thing anyway. Can you give the Java API in Jelly Bean?Colunga
@Holygenn I have added a link to the media APIs in Jelly Bean. In the case of RockPlayer, does it directly display video using the hardware accelerated decoders, or does it give you the output buffers? The former is easier than the latter with hardware accelerated decoders.Bently
RockPlayer is closed source; only its ffmpeg config is open, hence I am not sure which it uses. My guess is that in RockPlayer, after demuxing, the raw video is fed into a switch that lets the user choose between the 3rd-party sw decoder and the hw system decoder, as its menu suggests, and is then processed and displayed. I did the RockPlayer experiment under Honeycomb 3.2. So it seems there could be a way to tap the system codecs without NDK/JNI; not saying the NDK is too much trouble, just exploring a possibility.Colunga
As I understand it, the Jelly Bean MediaCodec provides access to the system codecs (SW/HW) for raw bitstreams. That seems to be exactly what I need: decode (with hw) and display a raw video stream. However, out of curiosity, I am even more eager to know how I can achieve HW accelerated decoding without Jelly Bean. Thank you very much for your help so far.Colunga

Use ExoPlayer (github).

It's a Google-sponsored open-source project that replaces the platform's MediaPlayer. Every component in the pipeline is extensible, from the sample source (how the H.264 frames are extracted from your custom protocol) to the rendering (to a Surface, SurfaceTexture, etc.).

It includes a nice demo app showing usage.
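
As a rough illustration of how little glue code basic playback takes (the demo app covers the extensible parts), here is a minimal sketch using the current androidx.media3 packaging of ExoPlayer. The URI is a placeholder, and older com.google.android.exoplayer2 releases use different class names, so treat this as a sketch rather than version-exact code:

    import android.content.Context;
    import android.view.SurfaceView;
    import androidx.media3.common.MediaItem;
    import androidx.media3.exoplayer.ExoPlayer;

    // Minimal ExoPlayer playback sketch (androidx.media3 packaging).
    // For a custom protocol you would plug in your own MediaSource / DataSource
    // instead of a plain URI, as the ExoPlayer docs and demo app describe.
    public class SimplePlayback {
        public ExoPlayer start(Context context, SurfaceView surfaceView) {
            ExoPlayer player = new ExoPlayer.Builder(context).build();
            player.setVideoSurfaceView(surfaceView);                                 // render target
            player.setMediaItem(MediaItem.fromUri("https://example.com/test.mp4"));  // placeholder URI
            player.prepare();
            player.play();
            return player;
        }
    }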

Fizz answered 11/5, 2016 at 22:23 Comment(0)

You might want to try MediaExtractor and MediaCodec (they are also available in the NDK as AMediaExtractor and AMediaCodec; see the sample for playing .mp4 here: native-codec).
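
A rough Java-layer equivalent of that sample could look like the following; the file path is a placeholder, and the output-draining side is elided since it follows the same dequeueOutputBuffer/releaseOutputBuffer pattern as the MediaCodec sketch earlier on this page:

    import android.media.MediaCodec;
    import android.media.MediaExtractor;
    import android.media.MediaFormat;
    import android.view.Surface;
    import java.nio.ByteBuffer;

    // Sketch: MediaExtractor demuxes an .mp4 and feeds its video track into a
    // MediaCodec decoder rendering onto a Surface. Error handling and A/V sync
    // are omitted for brevity.
    public class ExtractorDecode {
        public void decodeVideoTrack(String path, Surface surface) throws Exception {
            MediaExtractor extractor = new MediaExtractor();
            extractor.setDataSource(path);

            // Find and select the first video track, then create a matching decoder.
            MediaCodec codec = null;
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime != null && mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    codec = MediaCodec.createDecoderByType(mime);
                    codec.configure(format, surface, null, 0);
                    break;
                }
            }
            if (codec == null) return;  // no video track found
            codec.start();

            // Feed demuxed samples into the decoder until the track is exhausted.
            boolean done = false;
            while (!done) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buf = codec.getInputBuffers()[inIndex];
                    int size = extractor.readSampleData(buf, 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        done = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
                // ...drain output buffers here (dequeueOutputBuffer / releaseOutputBuffer)...
            }
            codec.stop();
            codec.release();
            extractor.release();
        }
    }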

Proven answered 13/3, 2017 at 12:11 Comment(0)
