I have a screen-recording app that uses a MediaCodec encoder to encode the video frames. Here's how I create the video encoder:
videoCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
I then try to determine the best bitrate mode that this encoder supports, my order of preference being Constant Quality (CQ), Variable Bitrate (VBR), and Constant Bitrate (CBR). This is how I do it:
MediaCodecInfo.CodecCapabilities capabilities = videoCodec.getCodecInfo().getCapabilitiesForType(MediaFormat.MIMETYPE_VIDEO_AVC);
MediaCodecInfo.EncoderCapabilities encoderCapabilities = capabilities.getEncoderCapabilities();
if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ)) {
    Timber.i("Setting bitrate mode to constant quality");
    videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
} else if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR)) {
    Timber.w("Setting bitrate mode to variable bitrate");
    videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
} else if (encoderCapabilities.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR)) {
    Timber.w("Setting bitrate mode to constant bitrate");
    videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE, MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR);
}
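For diagnostics, I also tried enumerating every AVC encoder on the device, since createEncoderByType() just picks the first match and a device may expose both hardware and software H.264 encoders whose reported capabilities differ. This is a sketch using MediaCodecList (it only logs what each encoder claims to support):

```java
// List every AVC encoder and log which bitrate modes each one reports.
MediaCodecList codecList = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
for (MediaCodecInfo info : codecList.getCodecInfos()) {
    if (!info.isEncoder()) continue;
    for (String type : info.getSupportedTypes()) {
        if (!type.equalsIgnoreCase(MediaFormat.MIMETYPE_VIDEO_AVC)) continue;
        MediaCodecInfo.EncoderCapabilities caps =
                info.getCapabilitiesForType(type).getEncoderCapabilities();
        Timber.i("%s: CQ=%b VBR=%b CBR=%b",
                info.getName(),
                caps.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ),
                caps.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR),
                caps.isBitrateModeSupported(MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CBR));
    }
}
```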
Running this on my Samsung Galaxy S7 selects VBR mode, i.e. Constant Quality mode is supposedly not supported. However, if I simply set KEY_BITRATE_MODE to BITRATE_MODE_CQ anyway, it not only works but actually produces a better-quality video than VBR mode.
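One workaround I'm considering, on the assumption that the reported capabilities are just conservative, is to optimistically configure the codec with CQ and fall back to VBR if configure() rejects the format. A sketch (videoCodec and videoFormat as above; note that on some devices an unsupported mode may be silently ignored rather than rejected, so this isn't a definitive check):

```java
// Optimistically request constant-quality mode; fall back to VBR if the
// encoder rejects the configuration. configure() may throw
// IllegalArgumentException or MediaCodec.CodecException here.
videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE,
        MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_CQ);
try {
    videoCodec.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
} catch (IllegalArgumentException | MediaCodec.CodecException e) {
    Timber.w(e, "CQ mode rejected, falling back to VBR");
    videoCodec.reset();  // return the codec to the Uninitialized state
    videoFormat.setInteger(MediaFormat.KEY_BITRATE_MODE,
            MediaCodecInfo.EncoderCapabilities.BITRATE_MODE_VBR);
    videoCodec.configure(videoFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
}
```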
So, if Constant Quality mode is apparently supported by this encoder, why do I get a false negative from isBitrateModeSupported()? Am I missing something here?