
I need hardware-accelerated H.264 decoding for a research project, to test a self-defined protocol.

From searching the web, I have found a few ways to perform hardware-accelerated video decoding on Android:

  1. Use ffmpeg's libstagefright backend (overview of libstagefright), or use libstagefright in the OS directly, like here.
  2. Use OpenMAX on a specific hardware platform, like here for Samsung devices and here for the Qualcomm Snapdragon series.
  3. Some people mentioned PVPlayer.

Some people "say" libstagefright is the only way, while the Qualcomm guys have obviously had success.

Currently I am not sure which of these would work, and I am a little confused. If they all could work, I would certainly prefer a hardware-independent method.

I have tested the H/W acceleration of a few video players on a Galaxy Tab 7.7 (Android 3.2, Exynos): VLC, Mobo, Rock, and VPlayer. Rock and Mobo work fine, VLC doesn't work, and VPlayer seems to have a rendering bug that costs it performance.

Anyway, I did an 'operation' on RockPlayer and deleted all of its .so libs in /data/data/com.redirecting/rockplayer: software decoding now crashes, while H/W decoding still works fine! I wonder how they did that. It appears to me that H/W acceleration could be independent of the hardware platform.

Can someone nail this problem down, or provide a reference with additional information or better details?

Glenn Yu
  • I am a bit confused! Do you want direct access (without the Android media APIs) to the H/W accelerated decoder to decode your bitstreams? Because all modern phone SoCs decode H.264 using H/W acceleration. – Oak Bytes Jul 05 '12 at 12:38
  • @OakBytes, I want to implement H/W accelerated decoding, however it's done. Right now I only know how to decode the stream with ffmpeg software decoding. By H/W acceleration I mean a level of performance like 1080p@30fps, while software decoding is much weaker. I have avoided referring to software decoding as 'using the CPU' because the H/W acceleration module is also part of the CPU. What do you mean by all modern phones already using H/W acceleration? – Glenn Yu Jul 07 '12 at 06:27
  • When the Gallery media player is used to play H.264 clips, all recent Android phones use a H/W accelerated H.264 decoder. I guess you plan to use an H.264 decoder to decode a raw H.264 bitstream and get the decoded output, rather than play a file containing H.264 video and some audio. – Oak Bytes Jul 07 '12 at 07:03
  • @OakBytes You are right. That's exactly what I wanted: just raw bitstreams, no mkv or mp4 containers. Sorry I didn't make it clearer. I want to invoke H/W decoding at the NAL or frame level over a raw bitstream, rather than setting up a media player for files. – Glenn Yu Jul 07 '12 at 07:18
  • @Holyglenn - have you succeeded with your project? Maybe you have found some new information on the subject? – HitOdessit Sep 17 '12 at 08:19
  • libstagefright is dead in ffmpeg http://stackoverflow.com/questions/9832503/android-include-native-stagefright-features-in-my-own-project – Ciro Santilli新疆棉花TRUMP BAN BAD Feb 22 '16 at 20:52

3 Answers


To answer the above question, let me introduce a few concepts related to Android.

OpenMAX: Android uses OpenMAX as its codec interface. Hence, all native codecs (hardware accelerated or otherwise) expose an OpenMAX interface. This interface is used by StageFright (the player framework) to decode media using the codec.

NDK: Android allows Java applications to interact with underlying C/C++ native libraries using the NDK. This requires using JNI (the Java Native Interface).

Now, coming to your question: how to tap the native decoder to decode a raw video bitstream?

In Android 4.0 and below, Android did not provide access to the underlying video decoders at the Java layer. You would need to write native code to interact directly with the OMX decoder. Though this is possible, it is not trivial, as it requires knowledge of how OMX works and of how to map OMX to the application using the NDK.

In 4.1 (Jelly Bean), Android provides access to hardware-accelerated decoders at the application level through Java APIs, notably MediaCodec. More details about the new APIs are at http://developer.android.com/about/versions/android-4.1.html#Multimedia
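For the raw-bitstream case discussed in the comments, the MediaCodec path looks roughly like the sketch below. Take it as an illustrative sketch only, not tested code: the class name RawH264Decoder is made up, and it assumes an Annex-B H.264 stream that carries SPS/PPS in-band (otherwise they must be supplied via the csd-0/csd-1 keys of the MediaFormat).

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;

// Hypothetical sketch: feed raw H.264 access units to the platform decoder
// (typically hardware accelerated) and render the output to a Surface.
public class RawH264Decoder {
    private MediaCodec codec;

    public void start(Surface surface, int width, int height) throws Exception {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
        // Assumes SPS/PPS arrive in-band; otherwise set them explicitly:
        // format.setByteBuffer("csd-0", sps); format.setByteBuffer("csd-1", pps);
        codec = MediaCodec.createDecoderByType("video/avc");
        codec.configure(format, surface, null, 0);
        codec.start();
    }

    // One access unit (the NAL units of one frame) pulled from the custom protocol.
    public void decode(byte[] accessUnit, long presentationTimeUs) {
        int inIndex = codec.dequeueInputBuffer(10000);
        if (inIndex >= 0) {
            ByteBuffer in = codec.getInputBuffers()[inIndex];
            in.clear();
            in.put(accessUnit);
            codec.queueInputBuffer(inIndex, 0, accessUnit.length, presentationTimeUs, 0);
        }
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        int outIndex = codec.dequeueOutputBuffer(info, 10000);
        if (outIndex >= 0) {
            // 'true' renders the decoded frame to the configured Surface.
            codec.releaseOutputBuffer(outIndex, true);
        }
    }

    public void release() {
        codec.stop();
        codec.release();
    }
}
```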

Oak Bytes
  • Thank you for your clarification of the concepts, and for your answer. I have looked through the references, only to find it all too true that I must dig into the OS framework. This would definitely work. – Glenn Yu Jul 08 '12 at 03:40
  • However, as the RockPlayer operation suggests (I deleted all the .so libraries, and hardware decoding still works while software decoding fails), there could be some simpler way to do this in Android 4.0 and below. As for my raw bitstream decoding, I might have to figure out the whole OMX thing anyway. Can you give the Java API in Jelly Bean? – Glenn Yu Jul 08 '12 at 03:44
  • @Holyglenn I have added a link to the media APIs in Jelly Bean. In the case of RockPlayer, does it directly display video using the hardware-accelerated decoders, or does it provide you with output buffers? The former is easier than the latter with hardware-accelerated decoders. – Oak Bytes Jul 08 '12 at 06:14
  • 2
    RockPlayer is close sourced, only the config of ffmpeg is open hence I am not sure which it used. My guess is in Rockplayer after demuxing, the raw video is fed into a switch, letting the user choose from sw 3rd party decoder or hw system decoder as its menu suggested, then processed and displayed. I did Rockplayer experiment under Honeycomb 3.2. So it seems there would be a way to tap system codecs without NDK/JNI, not saying NDK is too much trouble but exploring a posibility. – Glenn Yu Jul 08 '12 at 07:53
  • 1
    As I understand it, Jelly bean MediaCodec provides system codecs(SW/HW) for raw bitstream. That seems to be exactly what I need -- I need to decode(with hw) and display raw video stream. However as curiosity drives it, I am even more eager to know how I can achieve HW accelerated decoding without Jelly Bean. Thank you very much for your help so far. – Glenn Yu Jul 08 '12 at 07:57

Use ExoPlayer (github).

It's a Google-sponsored open-source project that replaces the platform's MediaPlayer. Every component in the pipeline is extensible, from the sample source (how the H.264 frames are extracted from your custom protocol) to the rendering (to a Surface, SurfaceTexture, etc.).

It includes a nice demo app showing usage.
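ExoPlayer's API has changed considerably between releases, so take this only as a hedged sketch against the ExoPlayer 2.12+ API (not necessarily the version current when this answer was written); PlayerSetup is a hypothetical helper class:

```java
import android.content.Context;
import android.view.SurfaceView;

import com.google.android.exoplayer2.MediaItem;
import com.google.android.exoplayer2.SimpleExoPlayer;

public final class PlayerSetup {
    // Builds a player rendering to the given SurfaceView. The default video
    // renderer sits on top of MediaCodec, so decoding is hardware accelerated
    // wherever the platform provides a hardware codec.
    public static SimpleExoPlayer create(Context context, SurfaceView view, String uri) {
        SimpleExoPlayer player = new SimpleExoPlayer.Builder(context).build();
        player.setVideoSurfaceView(view);
        player.setMediaItem(MediaItem.fromUri(uri));
        player.prepare();
        player.play();
        return player;
    }
}
```

For the custom-protocol case, the piece to replace would be the media source/extractor layer rather than the renderer.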

Peter Tran

You might want to try MediaExtractor and MediaCodec. (They are also available in the NDK as AMediaExtractor and AMediaCodec; see the native-codec sample for playing an .mp4.)
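As a rough, untested sketch of the Java flavor of that pattern (the NDK sample follows the same dequeue/queue loop with AMediaExtractor/AMediaCodec); SimplePlayer and playVideoTrack are hypothetical names:

```java
import java.nio.ByteBuffer;

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

public final class SimplePlayer {
    // Demux the first video track of a local file and decode it to a Surface.
    public static void playVideoTrack(String path, Surface surface) throws Exception {
        MediaExtractor extractor = new MediaExtractor();
        extractor.setDataSource(path);

        // Select the first video track.
        MediaFormat format = null;
        for (int i = 0; i < extractor.getTrackCount(); i++) {
            MediaFormat f = extractor.getTrackFormat(i);
            if (f.getString(MediaFormat.KEY_MIME).startsWith("video/")) {
                extractor.selectTrack(i);
                format = f;
                break;
            }
        }
        if (format == null) throw new IllegalArgumentException("no video track in " + path);

        MediaCodec codec = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
        codec.configure(format, surface, null, 0);
        codec.start();

        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        boolean inputDone = false;
        while (true) {
            if (!inputDone) {
                int inIndex = codec.dequeueInputBuffer(10000);
                if (inIndex >= 0) {
                    ByteBuffer buffer = codec.getInputBuffers()[inIndex];
                    int size = extractor.readSampleData(buffer, 0);
                    if (size < 0) {
                        codec.queueInputBuffer(inIndex, 0, 0, 0,
                                MediaCodec.BUFFER_FLAG_END_OF_STREAM);
                        inputDone = true;
                    } else {
                        codec.queueInputBuffer(inIndex, 0, size, extractor.getSampleTime(), 0);
                        extractor.advance();
                    }
                }
            }
            int outIndex = codec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                codec.releaseOutputBuffer(outIndex, true); // render to the Surface
                if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) break;
            }
        }
        codec.stop();
        codec.release();
        extractor.release();
    }
}
```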

GregoryK