MediaCodec H264 Encoder not working on Snapdragon 800 devices

I have written an H.264 stream encoder using Android's MediaCodec API. I tested it on about ten different devices with different processors and it worked on all of them, except on Snapdragon 800-powered ones (Google Nexus 5 and Sony Xperia Z1). On those devices I get the SPS and PPS and the first keyframe, but after that mEncoder.dequeueOutputBuffer(mBufferInfo, 0) only returns MediaCodec.INFO_TRY_AGAIN_LATER. I have already experimented with different timeouts, bitrates, resolutions and other configuration options, to no avail. The result is always the same.

I use the following code to initialise the encoder:

        mBufferInfo = new MediaCodec.BufferInfo();
        mEncoder = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 768000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mEncoderColorFormat);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        mEncoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

where the selected color format is:

        MediaCodecInfo.CodecCapabilities capabilities = mCodecInfo.getCapabilitiesForType(MIME_TYPE);
        for (int i = 0; i < capabilities.colorFormats.length && selectedColorFormat == 0; i++)
        {
            int format = capabilities.colorFormats[i];
            switch (format) {
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_QCOM_FormatYUV420SemiPlanar:
                    selectedColorFormat = format;
                    break;
                default:
                    LogHandler.e(LOG_TAG, "Unsupported color format " + format);
                    break;
            }
        }

And I get the data by doing:

        ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();
        ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();

        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            // fill inputBuffers[inputBufferIndex] with valid data
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, 0, 0);
            LogHandler.e(LOG_TAG, "Queue Buffer in " + inputBufferIndex);
        }

        while(true)
        {
            int outputBufferIndex = mEncoder.dequeueOutputBuffer(mBufferInfo, 0);
            if (outputBufferIndex >= 0)
            {
                Log.d(LOG_TAG, "Queue Buffer out " + outputBufferIndex);
                ByteBuffer buffer = outputBuffers[outputBufferIndex];
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                {
                    // Config Bytes means SPS and PPS
                    Log.d(LOG_TAG, "Got config bytes");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0)
                {
                    // Marks a Keyframe
                    Log.d(LOG_TAG, "Got Sync Frame");
                }

                if (mBufferInfo.size != 0)
                {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    buffer.position(mBufferInfo.offset);
                    buffer.limit(mBufferInfo.offset + mBufferInfo.size);

                    int nalUnitLength = 0;
                    while((nalUnitLength = parseNextNalUnit(buffer)) != 0)
                    {
                        switch(mVideoData[0] & 0x1f) // NAL unit type is the low 5 bits
                        {
                            // SPS
                            case 0x07:
                            {
                                Log.d(LOG_TAG, "Got SPS");
                                break;
                            }

                            // PPS
                            case 0x08:
                            {
                                Log.d(LOG_TAG, "Got PPS");
                                break;
                            }

                            // Key Frame
                            case 0x05:
                            {
                                Log.d(LOG_TAG, "Got Keyframe");
                            }

                            //$FALL-THROUGH$
                            default:
                            {
                                // Process Data
                                break;
                            }
                        }
                    }
                }

                mEncoder.releaseOutputBuffer(outputBufferIndex, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                {
                    // Stream is marked as done,
                    // break out of while
                    Log.d(LOG_TAG, "Marked EOS");
                    break;
                }
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
            {
                outputBuffers = mEncoder.getOutputBuffers();
                Log.d(LOG_TAG, "Output Buffer changed " + outputBuffers);
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
            {
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(LOG_TAG, "Media Format Changed " + newFormat);
            }
            else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER)
            {
                // No Data, break out
                break;
            }
            else
            {
                // Unexpected State, ignore it
                Log.d(LOG_TAG, "Unexpected State " + outputBufferIndex);
            }
        }

Thanks for your help!

Flowery asked 9/12, 2013 at 16:16. Comments (6):
Salem: How many input frames are queued up at the point when the output stalls? (I want to make sure it's not simply starving for input.) Is there anything suspicious-looking in logcat? (Codecs tend to spray Log.e, which can make it hard to tell.) What color format is being selected? (Is it the QCOM format?) Is the size of your "raw frame" exactly the same as the capacity of the input buffer? (If not... why not?)
Flowery: @Salem It does not matter how long I let it run; it always seems to have 5 frames in the input buffers. Its output upon creation is: I/OMXClient(11245): Using client-side OMX mux. I/ACodec(11245): setupVideoEncoder succeeded. The color format selected is in both cases MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar. (If I query all formats it only has two: the aforementioned one and one with the constant 2130708361, which crashes if selected.) The raw frame and the input buffer are not the same size (the raw frame size is always smaller, and the input buffer capacity is always 282624).
Salem: Five frames is typical -- sounds like it's not processing input, hence no output. I assume you're calling encoder.start()? YUV420SemiPlanar is good; 2130708361 is only used for Surface input. The size of a YUV420 buffer should be width * height * 1.5, or 460800 bytes, so I'm a little confused about your buffer size (see the size sketch after this thread). Do you see your "Media Format Changed" message in the log file, and if so, what does it say?
Flowery: @Salem Yes, I am calling encoder.start(). The resolution was different for the traces in my earlier comment. The rawFrame size is width * height * 1.5, but the size of the buffer is always a little more than that. I did not get the Format Changed message.
Salem: I'm not sure what else to try -- the code pasted above looks fine. Your best approach from here might be to take something similar that is known to work (say, the buffer-to-buffer EncodeDecodeTest code), get that working in your app, and then gradually alter it to look like your implementation.
Flowery: @Salem Okay, I will try that and get back to you with my results. Thanks again for your help.
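
For reference, a minimal sketch of the YUV 4:2:0 size arithmetic discussed in the thread above; the helper name is illustrative, not from the original code:

        // A YUV 4:2:0 frame holds one luma byte per pixel plus two chroma planes
        // at quarter resolution, i.e. 1.5 bytes per pixel in total.
        public static int yuv420FrameSize(int width, int height) {
            return width * height * 3 / 2; // e.g. 640 * 480 * 3 / 2 = 460800 bytes
        }

Note that the capacity of a MediaCodec input buffer may legitimately be larger than this; only the frame-sized portion has to be filled before calling queueInputBuffer.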
Answer (23 votes):

You need to set the presentationTimeUs parameter in your call to queueInputBuffer. Most encoders ignore this parameter, and you can encode for streaming without issues; the encoder used in Snapdragon 800 devices doesn't.

This parameter represents the recording time of your frame, and it therefore needs to increase by the number of microseconds between the frame you want to encode and the previous frame.

If the parameter is set to the same value as in the previous frame, the encoder drops the frame. If it is set to too small a value (e.g. 100000 on a 30 FPS recording), the quality of the encoded frames drops.
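
For illustration, here is a minimal sketch of the fix applied to the input side of the code in the question, assuming a fixed frame rate; the FRAME_RATE constant and mFrameIndex counter are hypothetical names, not part of the original code:

        private static final int FRAME_RATE = 30; // should match KEY_FRAME_RATE
        private long mFrameIndex = 0;             // counts queued input frames

        // Inside the encode loop:
        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            // Frame N is presented N / FRAME_RATE seconds into the stream, so the
            // timestamp strictly increases (by 33333 us per frame at 30 FPS).
            long presentationTimeUs = mFrameIndex * 1000000L / FRAME_RATE;
            mFrameIndex++;
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length,
                    presentationTimeUs, 0);
        }

This is the same N * 1000 * 1000 / FPS formula suggested for non-realtime encoding in the comments on the answer below.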

Typographer answered 12/12, 2013 at 10:27. Comments (3):
Salem: Huh. The presentation time stamp isn't part of the H.264 elementary stream, so my expectation was that the value was simply passed through. I added a new FAQ entry (bigflake.com/mediacodec/#q8).
Boz: Could you please provide an example of how to set the presentation time?
Answer (0 votes):

        // Wall-clock milliseconds elapsed since encoding started, converted to microseconds
        encodeCodec.queueInputBuffer(inputBufferIndex, 0, input.length, (System.currentTimeMillis() - startMs) * 1000, 0);

Again answered 15/1, 2014 at 9:13. Comments (3):
Salem: Using the current time is fine if you're receiving real-time input (e.g. from the camera), but it will work poorly if you're working with other sources (e.g. transcoding video at faster than real time). I'd also recommend against System.currentTimeMillis(), as it is subject to sudden jumps (forward and backward). The monotonic System.nanoTime() is a better source; see the sketch after this thread.
Fatally: Even in the case of a transcode, the timestamps of the source content are applicable (based on the source content's fps). The encoder needs to know the timestamps to be able to manage its rate control. So, if you have configured the frame rate with mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, FPS), it is advisable to generate the timestamps as N * 1000 * 1000 / FPS for non-realtime encoding.
Agglomeration: @Salem, would I need to set the presentation time for an asynchronous MediaCodec encoder instance? ffmpeg cannot see the presentation times for video frames when I decode the h264 output. Please advise.
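
Following the comment about clock sources, here is a minimal sketch of a monotonic timestamp for real-time input; startNs is a hypothetical value captured once before the first frame is queued:

        long startNs = System.nanoTime(); // captured once, when encoding starts

        // Per frame: monotonic elapsed time, converted from nanoseconds to microseconds.
        long presentationTimeUs = (System.nanoTime() - startNs) / 1000;
        mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length,
                presentationTimeUs, 0);

Unlike System.currentTimeMillis(), System.nanoTime() cannot jump when the wall clock is adjusted, so the resulting timestamps are guaranteed not to move backwards.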
