So, in my application I am able to apply effects (like a blur or Gaussian filter) to the video coming from the camera, using the GPUImage library. Basically, the library takes the input from the camera, gets the raw byte data, converts it from YUV to RGBA format, applies the effect to that image, and renders the result onto the Surface of a GLSurfaceView using OpenGL. Finally, to the user it looks like a live video with the effect applied.
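For context, my preview setup looks roughly like this (a sketch assuming the CyberAgent GPUImage-for-Android API; the layout and view IDs are placeholders, and filter package paths vary between library versions):

```java
import android.app.Activity;
import android.opengl.GLSurfaceView;
import android.os.Bundle;
import jp.co.cyberagent.android.gpuimage.GPUImage;
import jp.co.cyberagent.android.gpuimage.GPUImageGaussianBlurFilter;

public class PreviewActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_preview); // hypothetical layout

        // GPUImage renders its filtered output onto this view's Surface
        GLSurfaceView glView = (GLSurfaceView) findViewById(R.id.surface_view);
        GPUImage gpuImage = new GPUImage(this);
        gpuImage.setGLSurfaceView(glView);
        gpuImage.setFilter(new GPUImageGaussianBlurFilter()); // the effect shown in the preview
        // camera frames are fed to gpuImage; the YUV -> RGBA conversion happens inside the library
    }
}
```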
Now I want to record the frames of that Surface as a video using the MediaCodec API, but this discussion says that we cannot pass a predefined Surface to MediaCodec.
I have seen the samples at bigflake, where the author creates the Surface using MediaCodec.createInputSurface(), but in my case the Surface comes from the GLSurfaceView.
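As I understand the bigflake approach, the encoder setup looks roughly like this (a sketch, not my working code; the resolution, bitrate, and frame-rate values are assumptions):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;

public class EncoderSetup {
    public static Surface createEncoderSurface() throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 640, 480);
        // COLOR_FormatSurface tells the codec its input arrives via a Surface
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);

        MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
        encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        // The Surface is created BY the encoder; anything rendered to it gets encoded
        Surface inputSurface = encoder.createInputSurface();
        encoder.start();
        return inputSurface;
    }
}
```

The problem, as I see it, is that this Surface is produced by the encoder itself, whereas my frames already go to the Surface owned by the GLSurfaceView.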
So, how can I record the frames of an existing Surface as a video?
I will record the audio in parallel, merge the video and audio using FFmpeg, and present the result to the user as a single video with the effects applied.
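The merge step I have in mind is something like the following (the file names are placeholders; `-c copy` muxes the streams without re-encoding):

```shell
# mux the recorded video and audio tracks into one file without re-encoding
ffmpeg -i effects_video.mp4 -i recorded_audio.aac -c:v copy -c:a copy -shortest output.mp4
```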