March 23, 2015

Render camera preview using OpenGL ES 2.0 on Android API 21 or higher

I referred to the Android Samples for android.hardware.camera2 (samples\android-21\media\Camera2Basic).

In the code, the camera preview resolution is fixed to the device's display resolution; this was tested on a Nexus 9.

// AndroidManifest.xml
...
<uses-feature android:glEsVersion="0x00020000" android:required="true"/>
<uses-feature android:name="android.hardware.camera"/>
<uses-permission android:name="android.permission.CAMERA"/>
...
<activity android:name=".MainActivity" android:screenOrientation="landscape">
...


// MainActivity.java
package ...
import android.app.Activity;
import android.os.Bundle;
import android.view.View;
import android.view.Window;
import android.view.WindowManager;

public class MainActivity extends Activity {
    private MainView mView;

    @Override
    public void onCreate ( Bundle savedInstanceState ) {
        super.onCreate(savedInstanceState);
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        int ui = getWindow().getDecorView().getSystemUiVisibility();
        ui = ui | View.SYSTEM_UI_FLAG_HIDE_NAVIGATION | View.SYSTEM_UI_FLAG_FULLSCREEN | View.SYSTEM_UI_FLAG_IMMERSIVE_STICKY;
        getWindow().getDecorView().setSystemUiVisibility(ui);
        getWindow().setFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON, WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        mView = new MainView(this);
        setContentView ( mView );
    }

    @Override
    protected void onResume() {
        super.onResume();
        mView.onResume();
    }

    @Override
    protected void onPause() {
        mView.onPause();
        super.onPause();
    }
}


// MainView.java
package ...
import android.content.Context;
import android.opengl.GLSurfaceView;
import android.view.SurfaceHolder;

public class MainView extends GLSurfaceView {
    MainRenderer mRenderer;

    MainView ( Context context ) {
        super ( context );
        mRenderer = new MainRenderer(this);
        setEGLContextClientVersion ( 2 );
        setRenderer ( mRenderer );
        setRenderMode ( GLSurfaceView.RENDERMODE_WHEN_DIRTY );
    }

    @Override
    public void surfaceCreated ( SurfaceHolder holder ) {
        super.surfaceCreated ( holder );
    }

    @Override
    public void surfaceDestroyed ( SurfaceHolder holder ) {
        super.surfaceDestroyed ( holder );
    }

    @Override
    public void surfaceChanged ( SurfaceHolder holder, int format, int w, int h ) {
        super.surfaceChanged ( holder, format, w, h );
    }

    @Override
    public void onResume() {
        super.onResume();
        mRenderer.onResume();
    }

    @Override
    public void onPause() {
        mRenderer.onPause();
        super.onPause();
    }
}


// MainRenderer.java
package ...
import android.content.Context;
import android.graphics.Point;
import android.graphics.SurfaceTexture;
import android.hardware.camera2.CameraAccessException;
import android.hardware.camera2.CameraCaptureSession;
import android.hardware.camera2.CameraCharacteristics;
import android.hardware.camera2.CameraDevice;
import android.hardware.camera2.CameraManager;
import android.hardware.camera2.CaptureRequest;
import android.hardware.camera2.params.StreamConfigurationMap;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;
import android.opengl.GLSurfaceView;
import android.os.Handler;
import android.os.HandlerThread;
import android.util.Log;
import android.util.Size;
import android.view.Surface;

import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.Arrays;
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;

public class MainRenderer implements GLSurfaceView.Renderer, SurfaceTexture.OnFrameAvailableListener {
    private final String vss_default = "" +
            "attribute vec2 vPosition;\n" +
            "attribute vec2 vTexCoord;\n" +
            "varying vec2 texCoord;\n" +
            "void main() {\n" +
            "  texCoord = vTexCoord;\n" +
            "  gl_Position = vec4 ( vPosition.x, vPosition.y, 0.0, 1.0 );\n" +
            "}";

    private final String fss_default = "" +
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "varying vec2 texCoord;\n" +
            "void main() {\n" +
            "  gl_FragColor = texture2D(sTexture,texCoord);\n" +
            "}";

    private int[] hTex;
    private FloatBuffer pVertex;
    private FloatBuffer pTexCoord;
    private int hProgram;

    private SurfaceTexture mSTexture;

    private boolean mGLInit = false;
    private boolean mUpdateST = false;

    private MainView mView;

    private CameraDevice mCameraDevice;
    private CameraCaptureSession mCaptureSession;
    private CaptureRequest.Builder mPreviewRequestBuilder;
    private String mCameraID;
    private Size mPreviewSize = new Size ( 1920, 1080 );

    private HandlerThread mBackgroundThread;
    private Handler mBackgroundHandler;
    private Semaphore mCameraOpenCloseLock = new Semaphore(1);

    MainRenderer ( MainView view ) {
        mView = view;
        float[] vtmp = { 1.0f, -1.0f, -1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f };
        float[] ttmp = { 1.0f, 1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 0.0f };
        pVertex = ByteBuffer.allocateDirect(8 * 4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        pVertex.put ( vtmp );
        pVertex.position(0);
        pTexCoord = ByteBuffer.allocateDirect(8*4).order(ByteOrder.nativeOrder()).asFloatBuffer();
        pTexCoord.put ( ttmp );
        pTexCoord.position(0);
    }

    public void onResume() {
        startBackgroundThread();
    }

    public void onPause() {
        mGLInit = false;
        mUpdateST = false;
        closeCamera();
        stopBackgroundThread();
    }

    public void onSurfaceCreated ( GL10 unused, EGLConfig config ) {
        initTex();
        mSTexture = new SurfaceTexture ( hTex[0] );
        mSTexture.setOnFrameAvailableListener(this);

        GLES20.glClearColor(1.0f, 1.0f, 1.0f, 1.0f);

        hProgram = loadShader ( vss_default, fss_default );

        Point ss = new Point();
        mView.getDisplay().getRealSize(ss);

        cacPreviewSize(ss.x, ss.y);
        openCamera();

        mGLInit = true;
    }

    public void onDrawFrame ( GL10 unused ) {
        if ( !mGLInit ) return;
        GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT);

        synchronized(this) {
            if ( mUpdateST ) {
                mSTexture.updateTexImage();
                mUpdateST = false;
            }
        }

        GLES20.glUseProgram(hProgram);

        int ph = GLES20.glGetAttribLocation(hProgram, "vPosition");
        int tch = GLES20.glGetAttribLocation ( hProgram, "vTexCoord" );

        GLES20.glVertexAttribPointer(ph, 2, GLES20.GL_FLOAT, false, 4*2, pVertex);
        GLES20.glVertexAttribPointer(tch, 2, GLES20.GL_FLOAT, false, 4*2, pTexCoord );
        GLES20.glEnableVertexAttribArray(ph);
        GLES20.glEnableVertexAttribArray(tch);

        GLES20.glActiveTexture(GLES20.GL_TEXTURE0);
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, hTex[0]);
        GLES20.glUniform1i(GLES20.glGetUniformLocation ( hProgram, "sTexture" ), 0);

        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);
        GLES20.glFlush();
    }

    public void onSurfaceChanged ( GL10 unused, int width, int height ) {
        GLES20.glViewport(0, 0, width, height);
    }

    private void initTex() {
        hTex = new int[1];
        GLES20.glGenTextures ( 1, hTex, 0 );
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, hTex[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_S, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_WRAP_T, GLES20.GL_CLAMP_TO_EDGE);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);
    }

    public synchronized void onFrameAvailable ( SurfaceTexture st ) {
        mUpdateST = true;
        mView.requestRender();
    }

    private static int loadShader ( String vss, String fss ) {
        int vshader = GLES20.glCreateShader(GLES20.GL_VERTEX_SHADER);
        GLES20.glShaderSource(vshader, vss);
        GLES20.glCompileShader(vshader);
        int[] compiled = new int[1];
        GLES20.glGetShaderiv(vshader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Log.e("Shader", "Could not compile vshader");
            Log.v("Shader", "Could not compile vshader:"+GLES20.glGetShaderInfoLog(vshader));
            GLES20.glDeleteShader(vshader);
            vshader = 0;
        }

        int fshader = GLES20.glCreateShader(GLES20.GL_FRAGMENT_SHADER);
        GLES20.glShaderSource(fshader, fss);
        GLES20.glCompileShader(fshader);
        GLES20.glGetShaderiv(fshader, GLES20.GL_COMPILE_STATUS, compiled, 0);
        if (compiled[0] == 0) {
            Log.e("Shader", "Could not compile fshader");
            Log.v("Shader", "Could not compile fshader:"+GLES20.glGetShaderInfoLog(fshader));
            GLES20.glDeleteShader(fshader);
            fshader = 0;
        }

        int program = GLES20.glCreateProgram();
        GLES20.glAttachShader(program, vshader);
        GLES20.glAttachShader(program, fshader);
        GLES20.glLinkProgram(program);
        // check the link status as well, not only the compile status
        int[] linked = new int[1];
        GLES20.glGetProgramiv(program, GLES20.GL_LINK_STATUS, linked, 0);
        if (linked[0] == 0) {
            Log.e("Shader", "Could not link program:" + GLES20.glGetProgramInfoLog(program));
            GLES20.glDeleteProgram(program);
            program = 0;
        }

        return program;
    }

    void cacPreviewSize( final int width, final int height ) {
        CameraManager manager = (CameraManager)mView.getContext().getSystemService(Context.CAMERA_SERVICE);
        try {
            for (String cameraID : manager.getCameraIdList()) {
                CameraCharacteristics characteristics = manager.getCameraCharacteristics(cameraID);
                // get() may return null; comparing the Integer with == would unbox and crash
                Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
                if (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT)
                    continue;

                mCameraID = cameraID;
                StreamConfigurationMap map = characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
                for ( Size psize : map.getOutputSizes(SurfaceTexture.class)) {
                    if ( width == psize.getWidth() && height == psize.getHeight() ) {
                        mPreviewSize = psize;
                        break;
                    }
                }
                break;
            }
        } catch ( CameraAccessException e ) {
            Log.e("mr", "cacPreviewSize - Camera Access Exception");
        } catch ( IllegalArgumentException e ) {
            Log.e("mr", "cacPreviewSize - Illegal Argument Exception");
        } catch ( SecurityException e ) {
            Log.e("mr", "cacPreviewSize - Security Exception");
        }
    }

    void openCamera() {
        CameraManager manager = (CameraManager)mView.getContext().getSystemService(Context.CAMERA_SERVICE);
        try {
            if (!mCameraOpenCloseLock.tryAcquire(2500, TimeUnit.MILLISECONDS)) {
                throw new RuntimeException("Time out waiting to lock camera opening.");
            }
            manager.openCamera(mCameraID, mStateCallback, mBackgroundHandler);
        } catch ( CameraAccessException e ) {
            Log.e("mr", "OpenCamera - Camera Access Exception");
        } catch ( IllegalArgumentException e ) {
            Log.e("mr", "OpenCamera - Illegal Argument Exception");
        } catch ( SecurityException e ) {
            Log.e("mr", "OpenCamera - Security Exception");
        } catch ( InterruptedException e ) {
            Log.e("mr", "OpenCamera - Interrupted Exception");
        }
    }

    private void closeCamera() {
        try {
            mCameraOpenCloseLock.acquire();
            if (null != mCaptureSession) {
                mCaptureSession.close();
                mCaptureSession = null;
            }
            if (null != mCameraDevice) {
                mCameraDevice.close();
                mCameraDevice = null;
            }
        } catch (InterruptedException e) {
            throw new RuntimeException("Interrupted while trying to lock camera closing.", e);
        } finally {
            mCameraOpenCloseLock.release();
        }
    }

    private final CameraDevice.StateCallback mStateCallback = new CameraDevice.StateCallback() {

        @Override
        public void onOpened(CameraDevice cameraDevice) {
            mCameraOpenCloseLock.release();
            mCameraDevice = cameraDevice;
            createCameraPreviewSession();
        }

        @Override
        public void onDisconnected(CameraDevice cameraDevice) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }

        @Override
        public void onError(CameraDevice cameraDevice, int error) {
            mCameraOpenCloseLock.release();
            cameraDevice.close();
            mCameraDevice = null;
        }

    };

    private void createCameraPreviewSession() {
        try {
            mSTexture.setDefaultBufferSize(mPreviewSize.getWidth(), mPreviewSize.getHeight());

            Surface surface = new Surface(mSTexture);

            mPreviewRequestBuilder = mCameraDevice.createCaptureRequest(CameraDevice.TEMPLATE_PREVIEW);
            mPreviewRequestBuilder.addTarget(surface);

            mCameraDevice.createCaptureSession(Arrays.asList(surface),
                    new CameraCaptureSession.StateCallback() {
                        @Override
                        public void onConfigured(CameraCaptureSession cameraCaptureSession) {
                            if (null == mCameraDevice)
                                return;

                            mCaptureSession = cameraCaptureSession;
                            try {
                                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AF_MODE, CaptureRequest.CONTROL_AF_MODE_CONTINUOUS_PICTURE);
                                mPreviewRequestBuilder.set(CaptureRequest.CONTROL_AE_MODE, CaptureRequest.CONTROL_AE_MODE_ON_AUTO_FLASH);

                                mCaptureSession.setRepeatingRequest(mPreviewRequestBuilder.build(), null, mBackgroundHandler);
                            } catch (CameraAccessException e) {
                                Log.e("mr", "createCaptureSession");
                            }
                        }
                        @Override
                        public void onConfigureFailed(CameraCaptureSession cameraCaptureSession) {
                        }
                    }, null
            );
        } catch (CameraAccessException e) {
            Log.e("mr", "createCameraPreviewSession");
        }
    }

    private void startBackgroundThread() {
        mBackgroundThread = new HandlerThread("CameraBackground");
        mBackgroundThread.start();
        mBackgroundHandler = new Handler(mBackgroundThread.getLooper());
    }

    private void stopBackgroundThread() {
        mBackgroundThread.quitSafely();
        try {
            mBackgroundThread.join();
            mBackgroundThread = null;
            mBackgroundHandler = null;
        } catch (InterruptedException e) {
            Log.e("mr", "stopBackgroundThread");
        }
    }
}

31 comments:

  1. Hi, I am getting this crash:
    06-17 15:04:44.780: E/CameraDevice-0-LE(21881): Surface with size (w=1920, h=1080) and format 0x1 is not valid, size not in valid set: [1440x1080, 1088x1088, 1280x720, 1056x704, 960x720, 736x736, 720x480, 640x480, 352x288, 320x240]
    06-17 15:04:44.785: W/CameraDevice-JV-0(21881): Stream configuration failed
    06-17 15:04:44.785: E/CameraCaptureSession(21881): Session 0: Failed to create capture session; configuration failed
    06-17 15:04:44.810: I/Timeline(21881): Timeline: Activity_idle id: android.os.BinderProxy@ac10ad6 time:3580176

    1. In this code, I fix the camera resolution to the display resolution.
      In your case, I think your camera does not support 1920x1080, i.e. the display resolution.
      In MainRenderer -> onSurfaceCreated, call cacPreviewSize(1280, 720) instead of cacPreviewSize(ss.x, ss.y).
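A more defensive version of that fallback (an editor's sketch, not from the original post): rather than hard-coding 1280x720, pick the largest supported size that still fits inside the display. `PreviewSizeChooser` and `chooseBestFit` are hypothetical names, and plain `int[][]` pairs stand in for `android.util.Size` so the selection logic is visible on its own:

```java
// Hypothetical fallback for cacPreviewSize: instead of requiring an exact
// match with the display resolution, pick the largest supported size that
// still fits inside it.
public class PreviewSizeChooser {
    // Returns the index of the largest size that fits within maxW x maxH,
    // or -1 if none fits.
    public static int chooseBestFit(int[][] sizes, int maxW, int maxH) {
        int best = -1;
        long bestArea = -1;
        for (int i = 0; i < sizes.length; i++) {
            int w = sizes[i][0], h = sizes[i][1];
            if (w <= maxW && h <= maxH) {
                long area = (long) w * h;
                if (area > bestArea) {
                    bestArea = area;
                    best = i;
                }
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Part of the size list from the crash log above.
        int[][] supported = { {1440, 1080}, {1280, 720}, {960, 720}, {640, 480} };
        // The 1920x1080 display size is not in the list: fall back to 1440x1080.
        int idx = chooseBestFit(supported, 1920, 1080);
        System.out.println(java.util.Arrays.toString(supported[idx])); // prints [1440, 1080]
    }
}
```

In cacPreviewSize, the same loop over map.getOutputSizes(SurfaceTexture.class) could track the best-fitting size instead of breaking only on an exact match.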

    2. Hi, I want to process the raw camera buffers from Camera2 in JNI and then display them with OpenGL. How can I do that? I have done it using the old camera API, but how do I do it with the Android Camera2 API?

    3. You can change the surface type using the createCaptureSession function.
      If you create the GL texture for the SurfaceTexture in JNI, then render the texture using an FBO and read the frame buffer using the glReadPixels function.

  2. Hi, then how can we do that in the above code? I don't have much experience with GL manipulation.

    1. Refer to this:

      http://maninara.blogspot.kr/2015/06/fbo-on-android.html

      If you remove the 'GLES20.' prefix, you can use this code in JNI.

  3. Hello Maninara,

    How can I process the camera preview frame before showing it on the GL surface?
    I want to apply an emboss effect to each frame, so can you tell me how I can achieve this using your code?

    1. If you have some experience with GLSL, just edit the fragment shader code.
      e.g. gl_FragColor = texture2D(sTexture,vec2(texCoord.s+1/width,texCoord.t+1/height)) - texture2D(sTexture,texCoord);
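Expanded into a full shader in the style of fss_default, an emboss effect might look like the sketch below. The uWidth/uHeight uniforms are assumptions (the post's code has no size uniforms); they would have to be set with glUniform1f after glUseProgram, e.g. from mPreviewSize:

```java
// A sketch of a complete emboss fragment shader that could replace
// fss_default in MainRenderer. uWidth/uHeight are hypothetical uniforms
// holding the preview size in pixels.
public class EmbossShader {
    public static final String FSS_EMBOSS = "" +
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES sTexture;\n" +
            "uniform float uWidth;\n" +   // assumed uniform, set via glUniform1f
            "uniform float uHeight;\n" +  // assumed uniform, set via glUniform1f
            "varying vec2 texCoord;\n" +
            "void main() {\n" +
            "  vec2 off = vec2(1.0 / uWidth, 1.0 / uHeight);\n" +
            "  vec4 diff = texture2D(sTexture, texCoord + off) - texture2D(sTexture, texCoord);\n" +
            "  gl_FragColor = vec4(vec3(0.5) + diff.rgb, 1.0);\n" +  // bias around mid-gray
            "}";

    public static void main(String[] args) {
        System.out.println(FSS_EMBOSS.contains("samplerExternalOES")); // prints true
    }
}
```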

  4. Hi Maninara,

    Thank you for the good article. One more question about preview processing. After changing the fss_default shader I am getting a warning in my log:
    W/GLConsumer: [unnamed-7902-0] bindTextureImage: clearing GL error: 0x502
    And the preview becomes white.

    Do you have any idea why this happens?

    1. Can I get your fragment shader code?

    2. Sorry for the late reply.

      I don't remember which shader caused that warning, but now I'm trying to apply a grayscale shader and another warning appears - af_exception_handling_lgaf: Invalid FV/Luma, This is warning!

      My grayscale shader is:
      public static final String GRAYSCALE_FRAGMENT_SHADER = "" +
      "#extension GL_OES_EGL_image_external : require\n"+
      "varying vec2 texCoord;\n" +
      "\n" +
      "uniform sampler2D inputImageTexture;\n" +
      "\n" +
      "void main()\n" +
      "{\n" +
      " lowp vec4 textureColor = texture2D(inputImageTexture, texCoord);\n" +
      " \n" +
      " gl_FragColor = vec4((1.0 - textureColor.rgb), textureColor.w);\n" +
      "}";

      I've taken it from here - https://goo.gl/EfhA4D
      I'm not really experienced with shaders, so I would also be glad to hear any tips, links or other useful information.

      Thanks in advance

    3. You have to use 'GL_TEXTURE_EXTERNAL_OES' and 'samplerExternalOES' for the SurfaceTexture.
      If you want to change the image to grayscale, just replace my code with the line below.
      gl_FragColor = vec4(vec3(dot(vec3(0.3, 0.59, 0.11),texture2D(sTexture,texCoord).rgb)),1.0);

      And these will be helpful for you:
      https://books.google.co.kr/books/about/OpenGL_Shading_Language.html?id=gwuAmAEACAAJ&redir_esc=y
      https://www.khronos.org/files/opengles_shading_language.pdf
      https://www.khronos.org/files/opengles20-reference-card.pdf
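Combining the two corrections above, a drop-in grayscale replacement for fss_default could look like this sketch (GrayscaleShader is just a hypothetical holder class for the string; the external texture requires samplerExternalOES, and the sampler must be named sTexture to match what onDrawFrame binds):

```java
// Grayscale fragment shader in the same style as the post's fss_default,
// using the luma weights 0.3/0.59/0.11 given in the reply above.
public class GrayscaleShader {
    public static final String FSS_GRAYSCALE = "" +
            "#extension GL_OES_EGL_image_external : require\n" +
            "precision mediump float;\n" +
            "uniform samplerExternalOES sTexture;\n" +  // must match onDrawFrame's uniform name
            "varying vec2 texCoord;\n" +
            "void main() {\n" +
            "  float luma = dot(vec3(0.3, 0.59, 0.11), texture2D(sTexture, texCoord).rgb);\n" +
            "  gl_FragColor = vec4(vec3(luma), 1.0);\n" +
            "}";

    public static void main(String[] args) {
        System.out.println(FSS_GRAYSCALE.contains("sTexture")); // prints true
    }
}
```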

  5. Anonymous (7:26 PM):

    Hey, how can I blend another image onto the camera frame? Can you guide me?

    1. Use OpenGL multi-texturing and alpha blending.
      You can use glActiveTexture and glBlendFunc in OpenGL ES 2.0.
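For reference, this is the arithmetic that glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) applies per channel once GL_BLEND is enabled, shown as plain Java so it can be checked without a GL context (AlphaBlendDemo is a hypothetical name):

```java
// What glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA) computes per channel.
// In the renderer the setup would be, before drawing the overlay texture:
//   GLES20.glEnable(GLES20.GL_BLEND);
//   GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
public class AlphaBlendDemo {
    // dst is the already-drawn camera frame, src is the overlay image.
    public static float blend(float src, float dst, float srcAlpha) {
        return src * srcAlpha + dst * (1.0f - srcAlpha);
    }

    public static void main(String[] args) {
        // A 50% opaque white overlay on a black camera pixel gives mid-gray.
        System.out.println(blend(1.0f, 0.0f, 0.5f)); // prints 0.5
    }
}
```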

  8. How can I change the orientation of the camera? You set the activity orientation to landscape. I tried some solutions like mPreviewRequestBuilder.set(CaptureRequest.JPEG_ORIENTATION, getOrientation(0)); and mSTexture.getTransformMatrix(mSTMatrix); but they don't work. Please give me a solution for displaying the camera preview in portrait mode.

    1. It's very simple. Just change 3 lines:
      1. android:screenOrientation="landscape" to android:screenOrientation="portrait" in AndroidManifest.xml
      2. float[] vtmp = { 1.0f, -1.0f, -1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f }; to float[] vtmp = { -1.0f, -1.0f, -1.0f, 1.0f, 1.0f, -1.0f, 1.0f, 1.0f }; in MainRenderer.java(constructor)
      3. if ( width == psize.getWidth() && height == psize.getHeight() ) to if ( height == psize.getWidth() && width == psize.getHeight() ) in MainRenderer.java(cacPreviewSize)

  9. My Android Studio complains that MainRenderer must implement the abstract method onSurfaceCreated, but it already does, and it is accessible (public)! Why is that?

    1. Found it. I imported the wrong package. It should be:

      import javax.microedition.khronos.egl.EGLConfig;
      import javax.microedition.khronos.opengles.GL10;

  10. Hi Maninara,

    It's very good learning material! Could you give a hint on how to use your example to capture? It seems like you have no layout, so how do I add a button or something else? Sorry, I am new to this area.

    Thanks!

    1. There are two options for drawing a button:
      1. draw the button using OpenGL and check touch events, or use GL_SELECT
      2. put an Android Button on the MainView class in code

  11. How can I capture a frame and save it as a bitmap?

    1. This example just displays the camera preview. If you want to capture an image, you can use ACTION_IMAGE_CAPTURE.
      Or, if you just want to save the preview, use the glReadPixels function.
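A sketch of the glReadPixels route (an editor's example, not from the post): the GL read must run on the GL thread, e.g. at the end of onDrawFrame, and since glReadPixels returns rows bottom-up, the pixels have to be flipped vertically before saving. PixelFlip is a hypothetical helper; the row swap itself is plain array code:

```java
// Saving the preview with glReadPixels. On the GL thread you would do:
//   ByteBuffer buf = ByteBuffer.allocateDirect(width * height * 4);
//   GLES20.glReadPixels(0, 0, width, height, GLES20.GL_RGBA,
//                       GLES20.GL_UNSIGNED_BYTE, buf);
//   Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
//   bmp.copyPixelsFromBuffer(buf);
// and then flip the rows, since GL's origin is the bottom-left corner:
public class PixelFlip {
    // Returns a copy of the RGBA byte array with its rows in reverse order.
    public static byte[] flipRows(byte[] rgba, int width, int height) {
        byte[] out = new byte[rgba.length];
        int stride = width * 4;  // 4 bytes per RGBA pixel
        for (int y = 0; y < height; y++) {
            System.arraycopy(rgba, y * stride, out, (height - 1 - y) * stride, stride);
        }
        return out;
    }

    public static void main(String[] args) {
        // 1x2 image: row 0 = {1,1,1,1}, row 1 = {2,2,2,2} -> rows swapped.
        byte[] flipped = flipRows(new byte[]{1,1,1,1, 2,2,2,2}, 1, 2);
        System.out.println(flipped[0]); // prints 2
    }
}
```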

  12. Thank you Maninara... I was waiting for this kind of tutorial. Keep it up!

  13. Hey Maninara, how do I draw a cube on the camera preview?

    1. You can add some GL code after glDrawArrays in MainRenderer::onDrawFrame.
      But you must turn off the GLSL shader and texture, and set the camera matrices.

    2. "But you must turn off the GLSL shader and texture, and set the camera matrices."
      Can you tell me how to do that? Please give me demo code.

  14. Thank you for the great tutorial. But because this was created in 2015, there are a lot of bugs. Could you do an updated version, please? It would be really helpful.

    1. It works fine on API 29 too :)

  15. Can you help me with some references so that I can create great filters with the fragment shader, like Instagram and TikTok? Anyway, thanks for the great tutorial!
