Modify ExtractMpegFramesTest example to render decoded output on screen

I'm trying to modify ExtractMpegFramesTest to do the rendering on screen and still use glReadPixels to extract the frames.

I copied the relevant code for extracting the frames from ExtractMpegFramesTest (the CodecOutputSurface and STextureRender classes) and the frame extraction works as expected when rendered off screen.

I have a TextureView with a SurfaceTextureListener; when onSurfaceTextureAvailable fires, I take the SurfaceTexture and start the decoding process. I pass this SurfaceTexture to CodecOutputSurface, but it doesn't work.

I'm not sure if this is relevant, but onSurfaceTextureAvailable and the SurfaceTexture arrive on the main thread, while all the decoding (including the CodecOutputSurface constructor call) happens on a different thread.
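
Roughly, the listener wiring looks like this (a sketch; mTextureView and startDecoding() are placeholders, not my actual names):

mTextureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        // Delivered on the main thread; decoding runs on its own thread.
        new Thread(() -> startDecoding(st), "decoder").start();
    }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int width, int height) { }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        return true; // let the TextureView release the SurfaceTexture
    }

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) { }
});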

I tried to work with suggestions from here and here, but I couldn't get it to work.

I see this in the logs:

E/BufferQueueProducer: [SurfaceTexture-0-11068-20] connect(P): already connected (cur=1 req=3)
I/MediaCodec: native window already connected. Assuming no change of surface
E/MediaCodec: configure failed with err 0xffffffea, resetting...

I made these modifications to the ExtractMpegFramesTest eglSetup() method:

private void eglSetup() {
    mEGLDisplay = EGL14.eglGetDisplay(EGL14.EGL_DEFAULT_DISPLAY);
    if (mEGLDisplay == EGL14.EGL_NO_DISPLAY) {
        throw new RuntimeException("unable to get EGL14 display");
    }
    int[] version = new int[2];
    if (!EGL14.eglInitialize(mEGLDisplay, version, 0, version, 1)) {
        mEGLDisplay = null;
        throw new RuntimeException("unable to initialize EGL14");
    }

    int[] attribList = {
                    EGL14.EGL_RED_SIZE, 8,
                    EGL14.EGL_GREEN_SIZE, 8,
                    EGL14.EGL_BLUE_SIZE, 8,
                    EGL14.EGL_ALPHA_SIZE, 8,
                    EGL14.EGL_RENDERABLE_TYPE, EGL14.EGL_OPENGL_ES2_BIT,
                    EGL14.EGL_SURFACE_TYPE, EGL14.EGL_WINDOW_BIT, // tell it to use a window
                    EGL14.EGL_NONE
    };
    EGLConfig[] configs = new EGLConfig[1];
    int[] numConfigs = new int[1];
    if (!EGL14.eglChooseConfig(mEGLDisplay, attribList, 0, configs, 0, configs.length,
                    numConfigs, 0)) {
        throw new RuntimeException("unable to find RGB888+recordable ES2 EGL config");
    }

    // Configure context for OpenGL ES 2.0.
    int[] attrib_list = {
                    EGL14.EGL_CONTEXT_CLIENT_VERSION, 2,
                    EGL14.EGL_NONE
    };

    mEGLContext = EGL14.eglCreateContext(mEGLDisplay, configs[0], EGL14.EGL_NO_CONTEXT,
                    attrib_list, 0);
    checkEglError("eglCreateContext");
    if (mEGLContext == null) {
        throw new RuntimeException("null context");
    }

    int[] surfaceAttribs = {
                    // EGL_SINGLE_BUFFER is only a hint here; window surfaces are
                    // normally back-buffered and implementations may ignore it.
                    EGL14.EGL_RENDER_BUFFER, EGL14.EGL_SINGLE_BUFFER,
                    EGL14.EGL_NONE
    };

    mSurfaceTexture.setOnFrameAvailableListener(this);

    mSurface = new Surface(mSurfaceTexture);


    mPixelBuf = ByteBuffer.allocateDirect(mWidth * mHeight * 4);
    mPixelBuf.order(ByteOrder.LITTLE_ENDIAN);

    mEGLSurface = EGL14.eglCreateWindowSurface(mEGLDisplay, configs[0], mSurface,
                    surfaceAttribs, 0); // create window surface instead of eglCreatePbufferSurface
    checkEglError("eglCreateWindowSurface");
    if (mEGLSurface == null) {
        throw new RuntimeException("surface was null");
    }
}

And to the ExtractMpegFramesTest setup() method:

private void setup() {
    mTextureRender = new STextureRender();
    mTextureRender.surfaceCreated();

    if (VERBOSE) Log.d(TAG, "textureID=" + mTextureRender.getTextureId());
}

Thanks

Remotion asked 22/4, 2018 at 14:48

If I correctly understand what you're trying to do, you'd want to decode each frame to a SurfaceTexture, which gives you a GLES "external" texture with the data in it. You could then render that to the TextureView, calling glReadPixels() just before eglSwapBuffers().

You can't read data back once it has been sent to a screen Surface, as the consumer of the data lives in a different process. The efficient video path just passes the "external" texture to the Surface, but that won't work here. Ideally you would clone the external texture reference, forwarding one copy to the display Surface and using the other for rendering to an off-screen buffer that you can pull pixels from. (The Camera2 API can do multi-output tricks like this, but I don't know if it's exposed in MediaCodec. I haven't looked in a while though.)
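
Schematically, the per-frame loop would then look something like this (a sketch using the field names from ExtractMpegFramesTest, untested):

mSurfaceTexture.updateTexImage();                 // latch the newest decoded frame
mTextureRender.drawFrame(mSurfaceTexture, false); // draw the "external" texture

mPixelBuf.rewind();
GLES20.glReadPixels(0, 0, mWidth, mHeight, GLES20.GL_RGBA,
        GLES20.GL_UNSIGNED_BYTE, mPixelBuf);      // read back BEFORE the swap

EGL14.eglSwapBuffers(mEGLDisplay, mEGLSurface);   // then send the frame on-screen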

Kendrickkendricks answered 23/4, 2018 at 17:58
Hi, thank you for answering. I'm getting a stream from an external camera, which I decode with MediaCodec. When I pass a Surface (constructed from the SurfaceTexture received from TextureView.SurfaceTextureListener) to MediaCodec's configure(), everything works fine: the stream is decoded and displayed in the TextureView. When I use the frame extraction test, I can extract the frames from the stream and save them, but I can't display them. When I try to give CodecOutputSurface the SurfaceTexture received from TextureView.SurfaceTextureListener (instead of the one created by the renderer), it fails. – Remotion
It fails when I try to configure MediaCodec with the same SurfaceTexture provided to CodecOutputSurface. I get the "already connected" logs and then java.lang.IllegalArgumentException at android.media.MediaCodec.native_configure(Native Method) at android.media.MediaCodec.configure(MediaCodec.java:1791). What I'm trying to do is decode the camera stream and simultaneously display it on screen and extract the frames. – Remotion
A Surface is the "consumer" side of a producer-consumer pair. You can't attach two consumers to one producer. The SurfaceTexture (a.k.a. GLConsumer) is consuming the video frames, converting them to an "external" GLES texture. You need to render the external texture with GLES. In Grafika, see "Texture from Camera" for an example. (The video player example doesn't apply because it's using the TextureView as the consumer, rather than a separate SurfaceTexture.) Alternate approach: you may be able to get away with calling TextureView's getBitmap() function to get frame data. – Kendrickkendricks
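
A minimal sketch of the getBitmap() route mentioned in the previous comment; it must run on the UI thread that owns the TextureView, and mTextureView is an assumed field name:

Bitmap frame = mTextureView.getBitmap(); // copies the currently displayed frame
if (frame != null) {
    // inspect or save the pixels here
}
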
Hi, thank you for your help. I've managed to display the stream with "Texture from Camera" from Grafika, but it's all distorted: imgur.com/K70Dctz. When I decode to the output Surface provided by TextureView (no GLES), it displays the stream fine: imgur.com/1TUyGGM. Can you maybe point me to where to look for the cause of the problem? – Remotion
That looks like decoder junk: macroblocking, lots of green, something valid-looking at the bottom. Since your video decoding works with a different output path, I'd guess your texture source is bad, pointing at the wrong place. Can you switch from rendering to the screen texture to rendering to an off-screen texture and see if that looks correct? (Should be similar code, both doing GLES rendering, just with different destinations, and it sounds like you had the off-screen path working earlier; see the sketch below.) – Kendrickkendricks
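
A sketch of that off-screen destination, using a texture-backed FBO (width and height are assumed to match the video):

int[] fbo = new int[1];
int[] tex = new int[1];
GLES20.glGenFramebuffers(1, fbo, 0);
GLES20.glGenTextures(1, tex, 0);

GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tex[0]);
GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA, width, height, 0,
        GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
        GLES20.GL_TEXTURE_2D, tex[0], 0);
// ... draw the external texture and call glReadPixels() here, as before ...
GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); // back to the window surface
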
Off-screen rendering is better, but still not as good as rendering to the SurfaceTexture provided by TextureView (which works pretty smoothly). Some example captured frames (rendered off screen using sample code from ExtractMpegFramesTest): 1: imgur.com/Qv9U55d, 2: imgur.com/kS5YRSW, 3: imgur.com/LlrTJHM. Since the difference is the GLES part, I wanted to render on screen (with GLES) to try some modifications to the GLES code and get better frames, but I don't know where to get started. Also, the frames are much worse rendered on screen, as you saw from the attached images. – Remotion
Those images are pretty damaged. I'm not sure what would cause that. Does the original ExtractMpegFramesTest example work correctly on your device? If so, you will probably need to start with that or some Grafika code and slowly migrate to your implementation to figure out where things are going wrong. Are you doing everything on a single thread? (And again, if TextureView#getBitmap() does what you need, consider just using that.) – Kendrickkendricks
ExtractMpegFramesTest works just fine when tested with an mp4 asset file, but on the live decoded stream I get the results you saw in the images I posted (in my last comment). All the decoding and rendering is done on the same thread (not the main thread). I'll try to go back to the ExtractMpegFramesTest code and see what I messed up when changing it to work with a live stream. Thank you for all your help. – Remotion
Hello, I know this is quite a few years late, but could you share how you managed to extract frames from an external camera / webcam? I am trying to get frames from an RTSP stream. – Cantata
