Fading Coder

An Old Coder’s Final Dance

Android OpenGL ES – Camera Preview and Recording with Filters

This article demonstrates a combination of OpenGL ES, camera integration, and real-time filter application on textures. The objective is to preview camera feeds, apply filters, and record the output seamlessly.

Preview and Capture Workflow

High-Level Understanding:

The image stream acquired from the camera is transferred to an OpenGL texture using a SurfaceTexture. GLSurfaceView then renders this texture within the OpenGL environment. The overall flow is:

Camera → SurfaceTexture → OpenGL Texture (GL_TEXTURE_EXTERNAL_OES) → GLSurfaceView Rendering

Key Concepts in Camera API Implementation:

The Camera API is abstracted using an interface (ICamera) for basic operations like opening the camera, setting aspect ratio, previewing, and connecting it with the SurfaceTexture.

Interface Definition Example:

public interface ICamera {
    boolean open(int cameraId);
    void setAspectRatio(AspectRatio aspectRatio);
    boolean preview();
    boolean close();
    void setPreviewTexture(SurfaceTexture surfaceTexture);
    CameraSize getPreviewSize();
    CameraSize getPictureSize();
}
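
The AspectRatio and CameraSize types referenced above are not part of the Android SDK; they are small value classes the abstraction layer defines itself. A minimal sketch of AspectRatio is shown below — the name, the of() factory, and the reduce-by-GCD behavior are assumptions for illustration, not part of the original interface:

```java
// Illustrative value type assumed by the ICamera interface.
final class AspectRatio {
    private final int x; // reduced horizontal component, e.g. 16
    private final int y; // reduced vertical component, e.g. 9

    private AspectRatio(int x, int y) {
        this.x = x;
        this.y = y;
    }

    // Create a ratio reduced by the greatest common divisor, so 1920x1080 -> 16:9.
    static AspectRatio of(int width, int height) {
        int g = gcd(width, height);
        return new AspectRatio(width / g, height / g);
    }

    // Check whether a concrete size matches this ratio exactly.
    boolean matches(int width, int height) {
        int g = gcd(width, height);
        return x == width / g && y == height / g;
    }

    private static int gcd(int a, int b) {
        while (b != 0) {
            int t = b;
            b = a % b;
            a = t;
        }
        return a;
    }

    @Override
    public String toString() {
        return x + ":" + y;
    }
}
```

Reducing by the GCD makes ratio comparison independent of the concrete resolution, so 1920x1080 and 1280x720 compare equal as 16:9.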

Camera Implementation for API Level 14:

A class implementation handles camera operations including setting preview dimensions, picture sizes, rotation, and autofocus.

public class CameraApi14 implements ICamera {
    private int cameraId;
    private Camera camera;
    private Camera.Parameters cameraParameters;
    // Desired sizes; a real implementation must pick from getSupportedPreviewSizes().
    private int desiredWidth = 1080;
    private int desiredHeight = 1920;

    @Override
    public boolean open(int cameraId) {
        this.cameraId = cameraId;
        camera = Camera.open(cameraId);
        if (camera == null) {
            return false; // Camera unavailable or in use
        }
        cameraParameters = camera.getParameters();
        configureCameraParameters();
        return true;
    }

    private void configureCameraParameters() {
        // Note: the sizes passed here must be supported by the device,
        // otherwise setParameters() throws a RuntimeException.
        cameraParameters.setPreviewSize(desiredWidth, desiredHeight);
        cameraParameters.setPictureSize(desiredWidth, desiredHeight);
        camera.setParameters(cameraParameters);
    }
}
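
Hardcoding 1080x1920 only works on devices that actually support that preview size. A common fix is to pick, from the list the driver reports, the supported size closest to the desired one. A hedged sketch of such a chooser, written with plain int pairs instead of Camera.Size so it stands alone (the class and method names are assumptions):

```java
import java.util.List;

class PreviewSizeChooser {
    // Pick the supported size whose pixel area is closest to the desired area.
    // Each size is an int[2] = {width, height}, mirroring Camera.Size.
    static int[] chooseClosest(List<int[]> supported, int desiredWidth, int desiredHeight) {
        long desiredArea = (long) desiredWidth * desiredHeight;
        int[] best = null;
        long bestDiff = Long.MAX_VALUE;
        for (int[] size : supported) {
            long diff = Math.abs((long) size[0] * size[1] - desiredArea);
            if (diff < bestDiff) {
                bestDiff = diff;
                best = size;
            }
        }
        return best;
    }
}
```

In CameraApi14 this would be fed from cameraParameters.getSupportedPreviewSizes() before calling setPreviewSize(). Note that the old Camera API reports sizes in landscape orientation, so a portrait request of 1080x1920 typically matches a supported 1920x1080.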

Using SurfaceTexture:

SurfaceTexture captures frames from the camera feed as OpenGL textures. It is refreshed whenever a new frame becomes available:

SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
// With RENDERMODE_WHEN_DIRTY, request a render pass for each new camera frame.
surfaceTexture.setOnFrameAvailableListener(st -> requestRender());

Textures must use the GL_TEXTURE_EXTERNAL_OES target for rendering in shaders:

#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES uTexture;
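
A complete fragment shader built on that declaration might look like the grayscale filter below. This is a sketch, not the article's own filter; the varying name vTextureCoord is an assumption and must match whatever the vertex shader outputs:

```glsl
#extension GL_OES_EGL_image_external : require
precision mediump float;

varying vec2 vTextureCoord;          // interpolated from the vertex shader
uniform samplerExternalOES uTexture; // camera frame bound to GL_TEXTURE_EXTERNAL_OES

void main() {
    vec4 color = texture2D(uTexture, vTextureCoord);
    // Luma-weighted grayscale (BT.601 coefficients).
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(gray), color.a);
}
```

Swapping the body of main() is all it takes to change the filter; the OES declaration and sampler stay the same.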

Applying Filters and Off-Screen Rendering:

Filter Implementation Workflow:

Filters are applied through off-screen rendering using a FrameBuffer Object (FBO). This enables capturing the texture data, applying transformations, and rendering the filtered output to either GLSurfaceView or a MediaCodec InputSurface during recording.

Generating FrameBuffer:

private int offscreenTexture;
private int framebuffer;
private int depthBuffer;

private void prepareFramebuffer(int width, int height) {
    // Create the color texture that will receive the filtered output.
    int[] values = new int[1];
    GLES20.glGenTextures(1, values, 0);
    offscreenTexture = values[0];
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, offscreenTexture);

    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
            width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);

    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    // Create and bind the framebuffer object.
    GLES20.glGenFramebuffers(1, values, 0);
    framebuffer = values[0];
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer);

    // Attach a depth renderbuffer (optional for 2D filtering, needed for 3D).
    GLES20.glGenRenderbuffers(1, values, 0);
    depthBuffer = values[0];
    GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, depthBuffer);
    GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);

    GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT,
            GLES20.GL_RENDERBUFFER, depthBuffer);
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, offscreenTexture, 0);

    // Verify completeness before using the FBO.
    int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
    if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
        throw new RuntimeException("Framebuffer not complete, status=" + status);
    }
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}

Filter Application:

Using FBO, the raw texture is filtered and stored in an off-screen texture. The filtered texture is then passed to the GLSurfaceView or MediaCodec InputSurface:

@Override
public void onDrawFrame(GL10 gl) {
    // Latch the most recent camera frame into the OES texture.
    surfaceTexture.updateTexImage();

    // Draw the OES texture through the filter into the off-screen texture
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer);
    filter.draw();
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

    // Retrieve filtered texture ID
    int filteredTextureId = filter.getOutputTextureId();

    // Pass texture to encoder for recording
    videoEncoder.setTextureId(filteredTextureId);
    videoEncoder.frameAvailable(surfaceTexture);

    // Render filtered texture on screen
    displayFilter.setTextureId(filteredTextureId);
    displayFilter.onDrawFrame();
}
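
When the filtered texture's aspect ratio differs from the view's, drawing it full-screen stretches the image. A common remedy is to compute a centered letterboxed viewport before the on-screen draw. A self-contained sketch of that calculation — plain Java, no GL calls, and the class name is an assumption:

```java
class ViewportCalculator {
    // Returns {x, y, width, height} for a centered viewport that shows the
    // texture at its native aspect ratio inside the view (letterbox/pillarbox).
    static int[] letterbox(int viewWidth, int viewHeight, int texWidth, int texHeight) {
        float viewAspect = (float) viewWidth / viewHeight;
        float texAspect = (float) texWidth / texHeight;
        int w, h;
        if (texAspect > viewAspect) {
            // Texture is wider than the view: fill width, shrink height.
            w = viewWidth;
            h = Math.round(viewWidth / texAspect);
        } else {
            // Texture is taller than the view: fill height, shrink width.
            h = viewHeight;
            w = Math.round(viewHeight * texAspect);
        }
        return new int[]{(viewWidth - w) / 2, (viewHeight - h) / 2, w, h};
    }
}
```

The four values would be handed to GLES20.glViewport() just before the on-screen draw, leaving the off-screen FBO pass untouched.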

Integrating MediaCodec for Recording:

Recording Workflow:

  1. Link EGLContext between rendering thread and encoder.
  2. Use MediaCodec's InputSurface for feeding texture data.
  3. Synchronize frame rendering and encoding through shared EGL resources.

Encoder Initialization:

private void prepareEncoder(EGLContext sharedContext, int width, int height, int bitrate) {
    videoEncoder = new MediaCodecEncoderCore(width, height, bitrate);
    eglCore = new EglCore(sharedContext, EglCore.FLAG_RECORDABLE);
    inputWindowSurface = new WindowSurface(eglCore, videoEncoder.getInputSurface(), true);
    inputWindowSurface.makeCurrent();
}
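
The bitrate argument above is often derived from resolution and frame rate rather than hardcoded. A common back-of-the-envelope heuristic is bits-per-pixel-per-frame; the 0.25 factor in the test below is an assumption to tune per use case, not a value from the original article:

```java
class BitrateEstimator {
    // Estimate a video bitrate as width * height * fps * bitsPerPixel.
    static int estimate(int width, int height, int fps, float bitsPerPixel) {
        return (int) (width * height * fps * bitsPerPixel);
    }
}
```

For 1280x720 at 30 fps and 0.25 bits per pixel this yields roughly 6.9 Mbps, a reasonable starting point for H.264 camera recording.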

Frame Encoding:

@Override
public void handleFrameAvailable(float[] transformMatrix, long timestamp) {
    videoEncoder.drainEncoder(false);
    fullFrameRenderer.drawFrame(textureId, transformMatrix);
    inputWindowSurface.setPresentationTime(timestamp);
    inputWindowSurface.swapBuffers();
}
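
Note that setPresentationTime() expects nanoseconds; passing millisecond values from System.currentTimeMillis() is a classic source of broken output durations. The timestamp usually comes from SurfaceTexture.getTimestamp(), but when frames are generated at a fixed rate it can also be derived from the frame index. A sketch of that derivation (the helper name is assumed):

```java
class TimestampCalculator {
    private static final long NANOS_PER_SECOND = 1_000_000_000L;

    // Presentation time in nanoseconds for a given frame index at a fixed fps.
    static long presentationTimeNs(long frameIndex, int fps) {
        return frameIndex * NANOS_PER_SECOND / fps;
    }
}
```

Multiplying before dividing keeps the arithmetic exact in long math, so frame 30 at 30 fps lands precisely on the one-second mark.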

Summary:

  1. Camera preview and recording route OpenGL textures through GLSurfaceView for rendering and MediaCodec for encoding.
  2. Filters are applied off-screen, with textures processed through FrameBuffer Objects.
  3. The EGLContext is shared between threads to ensure seamless integration of rendering and encoding.

For a detailed demo, visit: GitHub Repository.
