Android Video Player Using NDK, OpenGL ES, and FFmpeg

OK, so here is what I have so far. I have built FFmpeg on Android and am able to use it fine. I can load a video into FFmpeg after passing the chosen filename from the Java side. To save on performance I am writing the video player in the NDK rather than passing frames from FFmpeg to Java through JNI. I want to send frames from the video to an OpenGL surface, but I am having trouble figuring out how to get each frame of video and render it onto that surface. I have been stuck on this for a couple of weeks now with no luck. Hopefully someone can point me in the right direction.
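For reference, here is roughly the shape of the decode path I have so far (just a sketch: I'm on an older FFmpeg build that still ships sws_scale, error handling is stripped, and upload_frame is a placeholder for whatever the OpenGL side ends up looking like):

```c
#include <stdint.h>
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

/* Placeholder hook: hand one RGB565 frame to the OpenGL side. */
extern void upload_frame(const uint8_t *pixels, int w, int h);

/* Open the file passed down from Java, decode video packets, and convert each
   frame to RGB565 with sws_scale.  Old (2011-era) FFmpeg API throughout. */
static void play_file(const char *path)
{
    av_register_all();

    AVFormatContext *fmt = NULL;
    if (av_open_input_file(&fmt, path, NULL, 0, NULL) != 0)
        return;
    if (av_find_stream_info(fmt) < 0)
        return;

    /* Find the first video stream. */
    int videoStream = -1;
    for (unsigned i = 0; i < fmt->nb_streams; i++) {
        if (fmt->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO) {
            videoStream = i;
            break;
        }
    }
    if (videoStream < 0)
        return;

    AVCodecContext *ctx = fmt->streams[videoStream]->codec;
    AVCodec *codec = avcodec_find_decoder(ctx->codec_id);
    if (!codec || avcodec_open(ctx, codec) < 0)
        return;

    AVFrame *frame = avcodec_alloc_frame();
    AVFrame *rgb   = avcodec_alloc_frame();
    uint8_t *buffer = av_malloc(avpicture_get_size(PIX_FMT_RGB565,
                                                   ctx->width, ctx->height));
    avpicture_fill((AVPicture *)rgb, buffer, PIX_FMT_RGB565,
                   ctx->width, ctx->height);

    /* Convert whatever the decoder outputs (usually YUV420P) to RGB565. */
    struct SwsContext *sws = sws_getContext(ctx->width, ctx->height, ctx->pix_fmt,
                                            ctx->width, ctx->height, PIX_FMT_RGB565,
                                            SWS_BILINEAR, NULL, NULL, NULL);

    AVPacket packet;
    int got_frame = 0;
    while (av_read_frame(fmt, &packet) >= 0) {
        if (packet.stream_index == videoStream) {
            avcodec_decode_video2(ctx, frame, &got_frame, &packet);
            if (got_frame) {
                sws_scale(sws, (const uint8_t * const *)frame->data,
                          frame->linesize, 0, ctx->height,
                          rgb->data, rgb->linesize);
                upload_frame(rgb->data[0], ctx->width, ctx->height);
            }
        }
        av_free_packet(&packet);
    }
}
```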

Thanks!

Elmer answered 13/1, 2011 at 2:21 Comment(3)
Hi Kieran, I'm trying to convert formats and play audio or video using FFmpeg and the Android NDK. I can't find any resources or guidance for this on the web. After a long search I found this link: github.com/havlenapetr/FFMpeg, but I'm unable to compile and run that example. Please help me, or give me a reference link on how I can play audio/video using FFmpeg on Android. Thanks – Opener
Praveenb, I had a difficult time getting FFmpeg compiled and usable on Android as well. I had to play around with the configure script a bit. I'm also not using the newest version of FFmpeg; I downloaded a slightly older version that still makes use of the sws_scale functions to get it to work. I can do my best to help you get it working, just let me know where you are right now and what kinds of errors you have come across. It shouldn't be too hard to get the static libraries to build. – Elmer
Hi, I'm struggling to build FFmpeg for Android. Could you tell me how you managed to build the static libraries for Android? Is there a git repository or something similar I could start with? And what about that configure script – what needs to be set up? – Clairclairaudience

One way that springs to mind is to draw the pixels of your frame into a texture and then render that texture using OpenGL.

I wrote a blog post a while back on how to go about this, primarily for old-skool pixel-based video games, but it also applies to your situation. The post is Android Native Coding in C, and I set up a GitHub repository with an example. Using this technique I have been able to get 60 FPS, even on first-generation hardware.

EDIT regarding glTexImage2D vs glTexSubImage2D for this approach.

Calling glTexImage2D will allocate video memory for your texture and copy the pixels you pass it into that memory (if you don't pass NULL). Calling glTexSubImage2D will update the pixels you specify in an already-allocated texture.

If you update all of the texture then there's little difference between calling one or the other; in fact, glTexImage2D is usually faster. But if you only update part of the texture, glTexSubImage2D wins out on speed.

You have to use power-of-2 texture sizes, so covering the screen on hi-res devices requires a 1024x512 texture, and a 512x512 texture on medium-resolution devices. The texture is larger than the screen area (hi-res is 800x480-ish), which means you only need to update part of it, so glTexSubImage2D is the way to go.
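A minimal sketch of that allocate-once / update-per-frame pattern, assuming the decoded frames arrive already converted to 16-bit RGB565 (the texture size and function names here are only illustrative):

```c
#include <stdint.h>
#include <GLES/gl.h>

#define TEX_W 1024   /* power-of-2 size that covers an 800x480-ish screen */
#define TEX_H 512

static GLuint tex;

/* Call once after the GL context is ready: glTexImage2D with a NULL pointer
   allocates the texture storage without uploading any pixels. */
void init_video_texture(void)
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_W, TEX_H, 0,
                 GL_RGB, GL_UNSIGNED_SHORT_5_6_5, NULL);
}

/* Call once per decoded frame: glTexSubImage2D overwrites only the
   frame-sized region of the already-allocated texture. */
void upload_frame(const uint8_t *pixels, int frame_w, int frame_h)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frame_w, frame_h,
                    GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}
```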

Tincal answered 13/1, 2011 at 8:16 Comment(17)
Thanks for your response! You know, it's funny, I came across your blog a while back and used it as a base for my app, haha. I hadn't been able to find it again until now, so thank you for letting me know! Right now I finally have something appearing on the screen, but the pixels seem to be totally messed up. I am not sure if I am creating my rectangular texture correctly or if it is a problem converting to the correct RGB pixel format in FFmpeg. I wish there were more OpenGL ES documentation, as it is pretty easy to do this with quads in OpenGL. Any tips? – Elmer
You can use a quad with a texture on it. Just use glOrthof and set your vertex and texture coordinates, all as you would in regular OpenGL (see the sketch below). "Messed up pixels" is a vague description: are they completely indecipherable? Too green or too red? Maybe they're sheared to the left? My advice: get the code working on the PC using the subset of OpenGL that overlaps with GLES, then compile the exact same code to Android. It's what I usually do. – Tincal
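A minimal GLES 1.x sketch of that quad setup, assuming a padded power-of-2 texture like the one in the answer; the cropped texture coordinates are usually the fiddly part:

```c
#include <GLES/gl.h>

/* Draw the frame_w x frame_h frame stored in the top-left corner of a padded
   power-of-2 texture (tex_w x tex_h, e.g. 1024x512) as a screen-filling quad,
   using the GLES 1.x fixed-function pipeline. */
void draw_frame(GLuint tex, int tex_w, int tex_h,
                int screen_w, int screen_h, int frame_w, int frame_h)
{
    /* Texture coordinates crop out just the frame region of the padded texture;
       the t coordinates are arranged so the first uploaded row (the top of the
       video image) ends up at the top of the screen. */
    const GLfloat u = (GLfloat)frame_w / tex_w;
    const GLfloat v = (GLfloat)frame_h / tex_h;

    const GLfloat vertices[]  = { 0, 0,  screen_w, 0,  0, screen_h,  screen_w, screen_h };
    const GLfloat texcoords[] = { 0, v,  u, v,         0, 0,         u, 0 };

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrthof(0, (GLfloat)screen_w, 0, (GLfloat)screen_h, -1, 1);  /* pixel coords */
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, vertices);
    glTexCoordPointer(2, GL_FLOAT, 0, texcoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}
```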
Ok, thank you for the information. Sorry, what I meant by messed up pixels is that, yes, they are indecipherable. In FFmpeg you have to use the sws_scale function to convert the pixel format to RGB, but as for the OpenGL side, I am still learning how to use it and I'm not quite sure whether I converted to the right pixel format or whether I set up my rectangle's vertices and texture coordinates correctly. Also, since I am looping through each frame, should I be using glTexSubImage2D() or continue using glTexImage2D() to replace the current texture with the new frame's texture data? – Elmer
@Kieran: I've added a note about tex vs texsub. – Tincal
Thanks richq, all this information is really helpful. I'll be sure to post back what I ended up doing in case anyone else needs some help with this. – Elmer
I finally did get video frames to display on screen using FFmpeg and OpenGL ES... If anyone is interested in how I got it to work, let me know and I'll do my best to help. It ended up being a matter of using the correct pixel format conversion in FFmpeg and mapping it to the correct pixel format in OpenGL, as well as using the correct texture coordinates. One thing, though: .3gp decoding on the phone is very slow compared to, say, .mp4 decoding. I was only able to get ~11 fps at best on my original Droid, though the Droid X yielded ~22 fps... still working on improving that. – Elmer
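For anyone hitting the "indecipherable pixels" problem, one pairing of formats that lines up (an assumption for illustration, not necessarily the exact formats used here) is PIX_FMT_RGB565 on the FFmpeg side with GL_RGB / GL_UNSIGNED_SHORT_5_6_5 on the OpenGL side:

```c
#include <GLES/gl.h>
#include <libswscale/swscale.h>

/* FFmpeg side: convert decoded frames to 16-bit RGB565 (2 bytes per pixel). */
struct SwsContext *make_rgb565_converter(int w, int h, enum PixelFormat src_fmt)
{
    return sws_getContext(w, h, src_fmt, w, h, PIX_FMT_RGB565,
                          SWS_BILINEAR, NULL, NULL, NULL);
}

/* OpenGL side: upload with the matching format/type pair.  A mismatch here
   (e.g. RGB24 data uploaded as 5_6_5) gives garbage pixels on screen. */
void upload_rgb565(GLuint tex, int w, int h, const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                    GL_RGB, GL_UNSIGNED_SHORT_5_6_5, pixels);
}
```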
@Kieran nice to see you got further along. I had a comment on my blog post saying that the texture buffer method itself can be quite slow on some phones (e.g. the Nexus One); might be something to check too. – Tincal
@Android007: I tried using the native Android bitmap API and it resulted in the same fps, but I didn't have a problem with any black frames... have you made sure that the loop that runs through the packets is definitely correct? It could be that you are somehow trying to display parts of a packet that aren't supposed to be used. Also, do the frames that are displayed on screen look correct, and does there seem to be any jump in the video between displayed frames? Please post back any results you come up with! – Elmer
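For reference, a rough sketch of that native bitmap path (the jnigraphics API from android/bitmap.h; the Bitmap itself would be created on the Java side in RGB_565 and passed down through JNI; link with -ljnigraphics):

```c
#include <jni.h>
#include <stdint.h>
#include <string.h>
#include <android/bitmap.h>

/* Copy one decoded RGB565 frame into a Java Bitmap (ANDROID_BITMAP_FORMAT_RGB_565). */
void blit_frame_to_bitmap(JNIEnv *env, jobject bitmap,
                          const uint8_t *rgb565, int frame_w, int frame_h)
{
    AndroidBitmapInfo info;
    void *dst;

    if (AndroidBitmap_getInfo(env, bitmap, &info) < 0)
        return;
    if (info.format != ANDROID_BITMAP_FORMAT_RGB_565)
        return;
    if (AndroidBitmap_lockPixels(env, bitmap, &dst) < 0)
        return;

    /* Copy row by row: the bitmap stride may differ from the frame width,
       and the bitmap may be smaller than the frame. */
    int copy_w = frame_w < (int)info.width ? frame_w : (int)info.width;
    for (int y = 0; y < frame_h && y < (int)info.height; y++)
        memcpy((uint8_t *)dst + y * info.stride,
               rgb565 + y * frame_w * 2,
               (size_t)copy_w * 2);

    AndroidBitmap_unlockPixels(env, bitmap);
}
```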
My guess about the crashing is that you are most likely not freeing up memory on the native side. Be sure to look closely at everything being allocated in your loop and then free that memory using FFmpeg's av_free function... also, does the stack trace give any insight as to which side the crash is coming from? – Elmer
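A sketch of the teardown that pairs with the old-API allocations used above (names assume the decode sketch in the question); av_free_packet(&packet) still has to run on every loop iteration as well:

```c
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

/* Free everything allocated for decoding/conversion when playback ends. */
static void cleanup(AVFormatContext *fmt, AVCodecContext *ctx,
                    struct SwsContext *sws, AVFrame *frame, AVFrame *rgb,
                    uint8_t *buffer)
{
    sws_freeContext(sws);
    av_free(buffer);     /* the RGB buffer from av_malloc */
    av_free(rgb);        /* frames from avcodec_alloc_frame are released with av_free */
    av_free(frame);
    avcodec_close(ctx);
    av_close_input_file(fmt);
}
```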
@Kieran I'd be very interested in this subject, especially the pixel conversion and the OpenGL output. Do you have any intention of posting some source code somewhere? I would really appreciate that, as I'm having trouble understanding all the details - I'm just used to a bit of Android programming and haven't touched FFmpeg or OpenGL yet. – Joviality
@Joviality Sure, I can post some examples. Just let me know what you need help with and I will do my best. It is all really pretty straightforward once you figure it out; it was just hard to get it all working at first. – Elmer
@Kieran That'd be awesome. I'm starting implementation next week or so; maybe we can just talk off the record? – Joviality
@Kieran That is so nice of you. I'll still have to see whether we actually need that functionality; it would be a waste of resources otherwise :) – Joviality
Hello guys! I've asked a question about whether OpenGL, the native bitmap API, or something else is best for video. So far I have had two pretty contradictory answers, and that's why I opened a 100-point bounty, which closes in 12h. Could you please take a look? Here it is: #5667013 – Unbar
@Kieran This is exactly the question I was going to ask! As I can see, you managed to get it all working? Can you please provide some examples? I've sent you an email! Hope it's still the same address :) – Bayreuth
Hi DiscGolfer, could you please provide some sample source code for rendering video frames with FFmpeg on OpenGL ES? – Luzern
Sounds weird that glTexImage is faster than glTexSubImage. – Lenient
