Display a gstreamer video feed into Google Cardboard SurfaceTexture

I'm using GStreamer to retrieve a video feed (sent from a Raspberry Pi), and I need to display it in Google Cardboard.

I based my work on GStreamer's tutorial-3 example. I managed to display the video in a SurfaceView by passing it my Surface (retrieved from SurfaceView.SurfaceHolder.getSurface()), but now I need to connect it to Google Cardboard.

If I'm not mistaken, Google Cardboard relies on a SurfaceTexture, so I thought it would be easy to simply get a Surface from that SurfaceTexture using the Surface(SurfaceTexture) constructor.
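
Roughly what I do (the full code is further down): create a SurfaceTexture on an external OES texture id, wrap it in a Surface, and hand that Surface to GStreamer:

SurfaceTexture mSurface = new SurfaceTexture(texture);  // texture comes from glGenTextures, bound to GL_TEXTURE_EXTERNAL_OES
mSurface.setOnFrameAvailableListener(this);             // lets me call requestRender() on each new frame
Surface surface = new Surface(mSurface);
nativeSurfaceInit(surface);                             // same JNI call as in GStreamer's tutorial-3
surface.release();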

The problem is that it simply doesn't work. My Google Cardboard application is based on the Cardboard Passthrough example, and I haven't touched the OpenGL code, since I don't know anything about it.

While debugging, I found out that there is (at least) one problem with the code I'm using. It seems that the lines

GLES20.glActiveTexture(GL_TEXTURE_EXTERNAL_OES);
GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture);

are causing me some trouble, since GL_TEXTURE_EXTERNAL_OES isn't in the valid range for glActiveTexture (which expects a texture unit from GL_TEXTURE0 to GL_TEXTURE0 + GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS - 1). Here are my logs:

GLConsumer  W  [unnamed-12520-0] bindTextureImage: clearing GL error: 0x500
Adreno-ES20  W  <core_glActiveTexture:348>: GL_INVALID_ENUM
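
As a side note, the actual unit limit has to be queried at runtime; GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS is just the enum used for the query. A quick check (not part of the code below) would look like this:

int[] maxUnits = new int[1];
GLES20.glGetIntegerv(GLES20.GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS, maxUnits, 0);
// valid arguments to glActiveTexture() are GL_TEXTURE0 .. GL_TEXTURE0 + maxUnits[0] - 1
Log.d(TAG, "combined texture image units: " + maxUnits[0]);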

So what's working right now?

It seems that the video feed is received by GStreamer, which tries to update the Surface (I get onFrameAvailable callbacks from the SurfaceTexture, and the error logs only appear from that point on). However, the screen stays black, as if nothing were being updated.

Here are the most interesting parts of my code:

@Override
public void onCreate(Bundle savedInstanceState)
{
    super.onCreate(savedInstanceState);
    setContentView(R.layout.main);
    CardboardView cardboardView = (CardboardView) findViewById(R.id.cardboard_view);
    cardboardView.setRenderer(this);
    setCardboardView(cardboardView);

    // Initialize GStreamer and warn if it fails
    try {
        GStreamer.init(this);
    } catch (Exception e) {
    //Catch e...
    }
    mCamera = new float[16];
    mView = new float[16];
    mHeadView = new float[16];
    //gstreamer stuff
    nativeInit();
}


@Override
public void onSurfaceCreated(EGLConfig eglConfig) {
    Log.d(TAG, "onSurfaceCreated start");
    GLES20.glClearColor(0.5f, 0.1f, 0.1f, 0.5f);
    ByteBuffer bb = ByteBuffer.allocateDirect(squareVertices.length * 4);
    bb.order(ByteOrder.nativeOrder());
    vertexBuffer = bb.asFloatBuffer();
    vertexBuffer.put(squareVertices);
    vertexBuffer.position(0);


    ByteBuffer dlb = ByteBuffer.allocateDirect(drawOrder.length * 2);
    dlb.order(ByteOrder.nativeOrder());
    drawListBuffer = dlb.asShortBuffer();
    drawListBuffer.put(drawOrder);
    drawListBuffer.position(0);


    ByteBuffer bb2 = ByteBuffer.allocateDirect(textureVertices.length * 4);
    bb2.order(ByteOrder.nativeOrder());
    textureVerticesBuffer = bb2.asFloatBuffer();
    textureVerticesBuffer.put(textureVertices);
    textureVerticesBuffer.position(0);

    int vertexShader = loadGLShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
    int fragmentShader = loadGLShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);

    mProgram = GLES20.glCreateProgram();             // create empty OpenGL ES Program
    GLES20.glAttachShader(mProgram, vertexShader);   // add the vertex shader to program
    GLES20.glAttachShader(mProgram, fragmentShader); // add the fragment shader to program
    GLES20.glLinkProgram(mProgram);
    checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());
    Log.d(TAG, "Surface created");
    texture = createTexture();
    initSurface(texture);
}
static private int createTexture()
{
    Log.d(TAG + "_cardboard", "createTexture");

    int[] texture = new int[1];

    GLES20.glGenTextures(1,texture, 0);
    checkGLError("GenTextures Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());

    GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture[0]);
    checkGLError("BindTextures Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());

    GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES,
            GL10.GL_TEXTURE_MIN_FILTER,GL10.GL_LINEAR);
    GLES20.glTexParameterf(GL_TEXTURE_EXTERNAL_OES,
            GL10.GL_TEXTURE_MAG_FILTER, GL10.GL_LINEAR);
    GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES,
            GL10.GL_TEXTURE_WRAP_S, GL10.GL_CLAMP_TO_EDGE);
    GLES20.glTexParameteri(GL_TEXTURE_EXTERNAL_OES,
            GL10.GL_TEXTURE_WRAP_T, GL10.GL_CLAMP_TO_EDGE);
    checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());
    return texture[0];
}
//Give the surface to gstreamer.
private void initSurface(int texture) {
    mSurface = new SurfaceTexture(texture);
    mSurface.setOnFrameAvailableListener(this);
    Log.d(TAG, "OnFrameAvailableListener set");

    Surface toto = new Surface(mSurface);
    nativeSurfaceInit(toto);
    toto.release();

}

//When we need to render
@Override
public void onFrameAvailable(SurfaceTexture surfaceTexture) {
    Log.d(TAG, "onFrameAvailable");
    this.getCardboardView().requestRender();

}

//Display to cardboard
@Override
public void onNewFrame(HeadTransform headTransform) {

    headTransform.getHeadView(mHeadView, 0);

    // Build the camera matrix and apply it to the ModelView.
    Matrix.setLookAtM(mCamera, 0, 0.0f, 0.0f, 0.01f, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f);

    float[] mtx = new float[16];
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    mSurface.updateTexImage();
    mSurface.getTransformMatrix(mtx);

    float[] test = new float[3];
    headTransform.getEulerAngles(test, 0);

    //if(networkThread != null){
    //    networkThread.setRegValue(test);
    //}
}

@Override
public void onDrawEye(Eye eye) {
    // Log.d(TAG, "onDrawEye");

    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);

    GLES20.glUseProgram(mProgram);
    Log.d(TAG, "trying to access " + GL_TEXTURE_EXTERNAL_OES +" out of " + GLES20.GL_MAX_COMBINED_TEXTURE_IMAGE_UNITS);
    GLES20.glActiveTexture(GL_TEXTURE_EXTERNAL_OES);
   // checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());
    GLES20.glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture);
   // checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());


    mPositionHandle = GLES20.glGetAttribLocation(mProgram, "position");
    GLES20.glEnableVertexAttribArray(mPositionHandle);
   // checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());
    GLES20.glVertexAttribPointer(mPositionHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT,
            false, vertexStride, vertexBuffer);
   // checkGLError("Problem on line "+new Throwable().getStackTrace()[0].getLineNumber());


    mTextureCoordHandle = GLES20.glGetAttribLocation(mProgram, "inputTextureCoordinate");
    GLES20.glEnableVertexAttribArray(mTextureCoordHandle);
    GLES20.glVertexAttribPointer(mTextureCoordHandle, COORDS_PER_VERTEX, GLES20.GL_FLOAT,
            false, vertexStride, textureVerticesBuffer);

    mColorHandle = GLES20.glGetAttribLocation(mProgram, "s_texture");


    GLES20.glDrawElements(GLES20.GL_TRIANGLES, drawOrder.length,
            GLES20.GL_UNSIGNED_SHORT, drawListBuffer);


    // Disable vertex array
    GLES20.glDisableVertexAttribArray(mPositionHandle);

    GLES20.glDisableVertexAttribArray(mTextureCoordHandle);
    Matrix.multiplyMM(mView, 0, eye.getEyeView(), 0, mCamera, 0);

}

For more code, here is a gist with the main two files : https://gist.github.com/MagicMicky/4caa3ac669215652e40f

Edit: when trying to work with the GStreamer camera app, the same errors show up in logcat as the ones I described earlier, so this might be nothing of importance...

Bloodsucker answered 2/6, 2015 at 17:15 · Comments (6)
The argument to glActiveTexture() is a texture unit, e.g. GL_TEXTURE0. The first argument to glBindTexture() is a texture target, e.g. GL_TEXTURE_EXTERNAL_OES (see the sketch after these comments). You can find various examples that mix GLES and video in Grafika (github.com/google/grafika). – Declaration
@Declaration not really helping. I didn't find anything in Grafika that would help me put a video on a SurfaceTexture. I know about the glActiveTexture argument; what seems weird is that the demo app quoted above works with this error. – Bloodsucker
The "texture from camera" activity takes video from the camera, plays it on a SurfaceTexture, and renders the texture with GLES. "Continuous capture" does similar things. Replace "camera output" with "video decoder output". It probably works despite the error because it remains set to a useful default value (GL_TEXTURE0).Declaration
@Declaration The problem is that in your examples they still use a SurfaceView to display the Surface, which I can't do, since I don't have access to that kind of element through the Cardboard API. – Bloodsucker
Were you able to get this working? I am attempting to do something similar but am also getting just black. It works fine if I use MediaPlayer from Grafika, but MediaPlayer is not flexible enough for my needs. – Lashaunda
@JoelF unfortunately I didn't get this working. You can see some "results"/attempts here: github.com/Zosterops – Bloodsucker
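
Based on Declaration's comments above, here is a sketch of the usual split between the texture unit passed to glActiveTexture() and the texture target passed to glBindTexture(), assuming s_texture is the external-OES sampler in the passthrough fragment shader:

GLES20.glUseProgram(mProgram);

// glActiveTexture() selects a texture *unit* (GL_TEXTURE0, GL_TEXTURE1, ...),
// while glBindTexture() takes a texture *target* such as GL_TEXTURE_EXTERNAL_OES.
GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, texture);

// A sampler is a uniform, so it is looked up with glGetUniformLocation
// (not glGetAttribLocation) and pointed at unit 0.
int samplerHandle = GLES20.glGetUniformLocation(mProgram, "s_texture");
GLES20.glUniform1i(samplerHandle, 0);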
