Android MediaCodec: encoding fails because there are no sync frames for the video track

I am planning to convert a video file to another video file with different bitrate, fps, etc.

Basically I follow the examples at http://bigflake.com/mediacodec

However, the log shows an error saying there are no sync frames for the video track:

submitted frame 5 to dec, size=47398
no output from encoder available
decoder output format changed: {height=1080, what=1869968451, color-format=2141391875, slice-height=1088, crop-left=0, width=1920, crop-bottom=1079, crop-top=0, mime=video/raw, stride=1920, crop-right=1919}
no output from encoder available
surface decoder given buffer 0 (size=3137536)
awaiting frame
E/MPEG4Writer(3464): There are no sync frames for video track
W/MPEG4Writer(3464): 0-duration samples found: 1
Stopping Video track

Then the program exits.

I searched online. fadden says "Make sure you are passing all of the MediaCodec.BufferInfo values through to the MediaMuxer -- that's where the flags are. The sync frames will have the BUFFER_FLAG_SYNC_FRAME flag set."

However, the example from http://bigflake.com/mediacodec uses:

outputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL,10); 

This seems to mean the encoder determines which frames get marked as key frames.

There seems to be little related information online about this issue. I wish bigflake.com had more examples covering problems developers are interested in (such as changing the format parameters of an existing video file).

==[Update]== Here is some of the code I use:

MediaFormat outputFormat = MediaFormat.createVideoFormat(MIME_TYPE, mWidth, mHeight);
outputFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT,
        MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
outputFormat.setInteger(MediaFormat.KEY_BIT_RATE,5000000);
outputFormat.setInteger(MediaFormat.KEY_FRAME_RATE,30);
// KEY_I_FRAME_INTERVAL is in seconds: request a sync (I) frame every 5 seconds
outputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL,5);
encoder = MediaCodec.createEncoderByType(MIME_TYPE);
encoder.configure(outputFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
inputSurface = new InputSurface(encoder.createInputSurface());
inputSurface.makeCurrent();
encoder.start();
...
try {
    mMuxer = new MediaMuxer(ouVideoFileName, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
} catch (IOException ioe) {
    throw new RuntimeException("MediaMuxer creation failed", ioe);
}
...
// now that we have the Magic Goodies, start the muxer
mTrackIndex = mMuxer.addTrack(newFormat);
mMuxer.start();
mMuxerStarted = true;
...
mMuxer.writeSampleData(mTrackIndex, encodedData, info_encoder);

So where am I failing to pass parameters to mMuxer? It seems I have passed everything required.
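
For reference, here is a minimal sketch of the encoder drain loop as I understand it from the bigflake EncodeAndMuxTest pattern. It reuses the variable names from my snippet above (encoder, info_encoder, mMuxer, mTrackIndex, TIMEOUT_USEC), so treat it as an outline rather than my exact code:

ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();
while (true) {
    int encoderStatus = encoder.dequeueOutputBuffer(info_encoder, TIMEOUT_USEC);
    if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
        break;      // no output available yet
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
        encoderOutputBuffers = encoder.getOutputBuffers();
    } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
        // happens once, before the first encoded buffer; this is where the muxer track is added
        MediaFormat newFormat = encoder.getOutputFormat();
        mTrackIndex = mMuxer.addTrack(newFormat);
        mMuxer.start();
        mMuxerStarted = true;
    } else if (encoderStatus >= 0) {
        ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
        if ((info_encoder.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
            // codec config data (SPS/PPS) is already conveyed via addTrack(); don't write it as a sample
            info_encoder.size = 0;
        }
        if (info_encoder.size != 0) {
            encodedData.position(info_encoder.offset);
            encodedData.limit(info_encoder.offset + info_encoder.size);
            // pass the same BufferInfo through: it carries the sync-frame flag the muxer needs
            mMuxer.writeSampleData(mTrackIndex, encodedData, info_encoder);
        }
        encoder.releaseOutputBuffer(encoderStatus, false);
        if ((info_encoder.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            break;      // end of stream reached
        }
    }
}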

==[Update 2]== In:

int encoderStatus = encoder.dequeueOutputBuffer(info_encoder, TIMEOUT_USEC);

I log info_encoder.flags: for frames 0 to 5 the flags are 0, i.e. none of them is marked as a key frame. The input video file is a short .mp4 recorded by the device, and it plays correctly. After frame 5, MPEG4Writer complains "There are no sync frames for video track".
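
For completeness, this is roughly how I log those flags (a small sketch; TAG is just a placeholder, and BUFFER_FLAG_SYNC_FRAME is the name of the key-frame flag on API 19):

boolean isCodecConfig = (info_encoder.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0;
boolean isSyncFrame = (info_encoder.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0;
Log.d(TAG, "flags=" + info_encoder.flags
        + " codecConfig=" + isCodecConfig + " syncFrame=" + isSyncFrame);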

==[Update 3]== BTW, I find that the encoding parts of DecodeEditEncodeTest.java and EncodeDecodeTest.java are different. The encoding code in EncodeDecodeTest.java calls encoder.dequeueInputBuffer, while DecodeEditEncodeTest.java has no lines related to encoder.dequeueInputBuffer at all. Do you think this could be a problem? And why are the two examples different in the first place?
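
To illustrate the difference I mean, here is a rough sketch based on my reading of the two tests (yuvFrameData and the timestamps are placeholders; the InputSurface helper methods are the ones from the bigflake code):

// EncodeDecodeTest style: the app feeds raw YUV buffers to the encoder
int inputIndex = encoder.dequeueInputBuffer(TIMEOUT_USEC);
if (inputIndex >= 0) {
    ByteBuffer inputBuf = encoder.getInputBuffers()[inputIndex];
    inputBuf.clear();
    inputBuf.put(yuvFrameData);   // placeholder byte[] holding one raw frame
    encoder.queueInputBuffer(inputIndex, 0, yuvFrameData.length, presentationTimeUs, 0);
}

// DecodeEditEncodeTest style: frames are fed by rendering into
// encoder.createInputSurface() with GLES, so there is no dequeueInputBuffer() at all
inputSurface.makeCurrent();
// ... draw the decoded (and edited) frame here ...
inputSurface.setPresentationTime(presentationTimeNs);
inputSurface.swapBuffers();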

==[Update 4]== I copied the code into the class ExtractMpegFramesTest. Then, in my main activity, I have a button. Clicking the button calls:

// test:
ExtractMpegFramesTest mTest = new ExtractMpegFramesTest();
try {
    mTest.testExtractMpegFrames();
} catch (Throwable e1) {
    // TODO Auto-generated catch block
    e1.printStackTrace();
}

Error:

E/ACodec(11342): [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
java.lang.RuntimeException: frame wait timed out
ExtractMpegFramesTest$CodecOutputSurface.awaitNewImage(ExtractMpegFramesTest.java:496)
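
Based on the comments below, one thing I plan to try is kicking the whole call off the UI thread, so that the main Looper stays free (the frame-available callback seems to be delivered there). A rough sketch of what the button handler might do instead:

new Thread(new Runnable() {
    @Override
    public void run() {
        ExtractMpegFramesTest mTest = new ExtractMpegFramesTest();
        try {
            // runs the whole extract/decode loop off the main thread
            mTest.testExtractMpegFrames();
        } catch (Throwable t) {
            Log.e("ExtractTest", "testExtractMpegFrames failed", t);
        }
    }
}, "ExtractMpegFramesThread").start();
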
Hoshi, 3/3/2014 at 5:20 (17 comments)
The encoder decides where to put the sync frames based on the parameters you set. You need to make sure that you pass them to MediaMuxer -- it should be a simple matter of taking whatever you get from the decoder and handing it to writeSampleData(). - Carolynncarolynne
...also, if you're getting your FPS change by dropping frames, make sure you're not dropping the sync frames. For example, the first frame to come out of the AVC encoder (following one or more CODEC_CONFIG chunks) should be a sync frame. - Carolynncarolynne
@fadden: Thanks. I have done this: outputFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL,10); I also changed the value to different numbers, such as less than 5. All other code follows the example. Let me put more code in my original post. - Hoshi
The parameters just determine how often they occur. The error message indicates that none are appearing, which should be impossible, since the very first frame to come out of the encoder should be a sync frame. (The decoder can't do anything if the stream just has a bunch of P/B frames and no I frames, which is why it's complaining.) - Carolynncarolynne
I debugged the code. In int encoderStatus = encoder.dequeueOutputBuffer(info_encoder, TIMEOUT_USEC); I log info_encoder.flags: for frames 0 to 5 the flags are 0. They are not key-frame flags. The input video file is a short .mp4 recorded by the device, and it plays correctly. - Hoshi
BTW, I find that the encoding parts of DecodeEditEncodeTest.java and EncodeDecodeTest.java are different. The encoding code in EncodeDecodeTest.java calls encoder.dequeueInputBuffer, while DecodeEditEncodeTest.java has no lines related to encoder.dequeueInputBuffer at all. Do you think this could be a problem? And why are the two examples different in the first place? - Hoshi
One example uses input from YUV buffers, the other uses input from a Surface. In EncodeDecodeTest, you'll find that there's no encoder.dequeueInputBuffer() in doEncodeDecodeVideoFromSurfaceToSurface(). It might be useful to see the full decode / encode loop. You can paste large code chunks on gist.github.com. - Carolynncarolynne
@fadden, thanks for your explanation of the difference. OK, I will post it on GitHub. Right now I am trying to output the data to a raw file (avoiding the muxer); it fails too! Then I tried to run the original code from the example "ExtractMpegFramesTest". It fails too! Error information: "E/ACodec(11342): [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648". So something must be wrong elsewhere. See my new update in the original post. - Hoshi
Ignore the storeMetaDataInBuffers complaint, it appears even on successful runs. Your problem is "frame wait timed out", which would happen if you didn't retain the ExtractMpegFramesWrapper (which runs everything on a separate thread). FWIW, similar question: #21751290 - Carolynncarolynne
@fadden: You said "which would happen if you didn't retain the ExtractMpegFramesWrapper". I create an object of ExtractMpegFramesTest and start it in the main activity of the Android app. Is that not enough? Then how should it be done? Any example? // I have read the link you mentioned. It points out where the problem is, but I do not know how to solve it. Let me work on it more... Thank you very much! - Hoshi
Let us continue this discussion in chat. - Hoshi
Re "frame wait timed out": as far as I remember, you need to start the processing in a separate thread. - Cote
@Marlon, thanks. There is an ExtractMpegFramesWrapper, which serves as a function to start extractMpegFrames in a separate thread. Actually the class ExtractMpegFramesWrapper already demonstrates how to use it (there is a test entry point: testExtractMpegFrames). - Hoshi
Does the testExtractMpegFrames test work? Not from your application, but simply as a test? - Cote
@Marlon, thanks. Did you mean the CTS testExtractMpegFrames test? I didn't do that. I just checked the CTS manual, which says "Do a factory data reset on the device (Settings > Storage > Factory data reset). Warning: This will erase all user data from the device." I am hesitant to do that. BTW, my device: Nexus 4, Android 4.4.2. I assume it should be compatible. Is there any other, better way to test testExtractMpegFrames? - Hoshi
ExtractMpegFramesTest is an ordinary test class that extends AndroidTestCase. Do you have any experience running tests on Android? There is no need to do a factory data reset or any other device config change to run a test. Just create a test project with the correct settings, put the test there plus the required video, and run it on the device. - Cote
#20879046 - a short discussion about running media tests. - Cote
