Android OpenGL ES – Camera Preview and Recording with Filters
This article demonstrates a combination of OpenGL ES, camera integration, and real-time filter application on textures. The objective is to preview camera feeds, apply filters, and record the output seamlessly.
Preview and Capture Workflow
High-Level Understanding:
The image stream acquired from the camera is transferred to an OpenGL texture using a SurfaceTexture. GLSurfaceView then renders this texture within the OpenGL environment. The overall flow is:
Camera → SurfaceTexture → OpenGL Texture (GL_TEXTURE_EXTERNAL_OES) → GLSurfaceView Rendering
Key Concepts in Camera API Implementation:
The Camera API is abstracted using an interface (ICamera) for basic operations like opening the camera, setting aspect ratio, previewing, and connecting it with the SurfaceTexture.
Interface Definition Example:
public interface ICamera {
    boolean open(int cameraId);
    void setAspectRatio(AspectRatio aspectRatio);
    boolean preview();
    boolean close();
    void setPreviewTexture(SurfaceTexture surfaceTexture);
    CameraSize getPreviewSize();
    CameraSize getPictureSize();
}
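The CameraSize return type above is not an Android SDK class; it is a small value type the interface assumes. A minimal sketch of such a class (the name and methods are assumptions, not the article's actual implementation) could be:

```java
// Hypothetical immutable value class backing ICamera.getPreviewSize()
// and getPictureSize(); not an Android SDK type.
public final class CameraSize {
    private final int width;
    private final int height;

    public CameraSize(int width, int height) {
        this.width = width;
        this.height = height;
    }

    public int getWidth()  { return width; }
    public int getHeight() { return height; }

    /** Aspect ratio as width / height, e.g. 16/9 for 1920x1080. */
    public float ratio() {
        return (float) width / height;
    }
}
```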
Camera Implementation for API Level 14:
An implementing class handles camera operations, including setting preview dimensions, picture sizes, rotation, and autofocus.
public class CameraApi14 implements ICamera {

    private int cameraId;
    private Camera camera;
    private Camera.Parameters cameraParameters;
    private int desiredWidth = 1080;
    private int desiredHeight = 1920;

    @Override
    public boolean open(int cameraId) {
        camera = Camera.open(cameraId);
        if (camera == null) {
            return false; // no camera with this id, or it is in use
        }
        this.cameraId = cameraId;
        cameraParameters = camera.getParameters();
        configureCameraParameters();
        return true;
    }

    private void configureCameraParameters() {
        // Note: the requested sizes must match entries from
        // getSupportedPreviewSizes()/getSupportedPictureSizes();
        // otherwise setParameters() throws on many devices.
        cameraParameters.setPreviewSize(desiredWidth, desiredHeight);
        cameraParameters.setPictureSize(desiredWidth, desiredHeight);
        camera.setParameters(cameraParameters);
    }
}
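Hardcoding 1080x1920 can fail on devices that do not expose that exact size, so a common refinement is to pick the supported size closest to the desired one. The size math itself is independent of the Camera API; a sketch (the class and method names are assumptions) might look like:

```java
import java.util.List;

public class SizeChooser {
    /**
     * Returns the index in {@code sizes} (each an int[]{width, height})
     * whose area is closest to desiredWidth x desiredHeight,
     * or -1 if the list is empty.
     */
    public static int closestSize(List<int[]> sizes, int desiredWidth, int desiredHeight) {
        long desiredArea = (long) desiredWidth * desiredHeight;
        int bestIndex = -1;
        long bestDiff = Long.MAX_VALUE;
        for (int i = 0; i < sizes.size(); i++) {
            int[] s = sizes.get(i);
            long diff = Math.abs((long) s[0] * s[1] - desiredArea);
            if (diff < bestDiff) {
                bestDiff = diff;
                bestIndex = i;
            }
        }
        return bestIndex;
    }
}
```

With the API 14 Camera class, the candidate list would come from `Camera.Parameters.getSupportedPreviewSizes()`.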
Using SurfaceTexture:
SurfaceTexture captures frames from the camera feed as OpenGL textures. It is refreshed whenever a new frame becomes available:
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(frame -> requestRender());
Textures must use the GL_TEXTURE_EXTERNAL_OES target for rendering in shaders:
#extension GL_OES_EGL_image_external : require
uniform samplerExternalOES uTexture;
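A complete fragment shader using the external target might look like the following. A simple grayscale filter serves as the illustration; the varying name `vTextureCoord` is an assumption, and the luma weights are the standard BT.601 coefficients:

```glsl
#extension GL_OES_EGL_image_external : require
precision mediump float;

uniform samplerExternalOES uTexture;
varying vec2 vTextureCoord;

void main() {
    vec4 color = texture2D(uTexture, vTextureCoord);
    // BT.601 luma weights turn the camera frame into grayscale.
    float gray = dot(color.rgb, vec3(0.299, 0.587, 0.114));
    gl_FragColor = vec4(vec3(gray), color.a);
}
```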
Applying Filters and Off-Screen Rendering:
Filter Implementation Workflow:
Filters are applied through off-screen rendering using a Framebuffer Object (FBO). This makes it possible to capture the texture data, apply transformations, and render the filtered output either to the GLSurfaceView or to a MediaCodec input surface during recording.
Generating the Framebuffer:
private void prepareFramebuffer(int width, int height) {
    // Color texture that will receive the filtered output.
    int[] textureId = new int[1];
    GLES20.glGenTextures(1, textureId, 0);
    GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId[0]);
    GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
            width, height, 0, GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, null);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
    GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D, GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

    // Framebuffer object for off-screen rendering.
    int[] framebuffer = new int[1];
    GLES20.glGenFramebuffers(1, framebuffer, 0);
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer[0]);

    // Depth renderbuffer (optional for pure 2D filtering, but harmless).
    int[] renderbuffer = new int[1];
    GLES20.glGenRenderbuffers(1, renderbuffer, 0);
    GLES20.glBindRenderbuffer(GLES20.GL_RENDERBUFFER, renderbuffer[0]);
    GLES20.glRenderbufferStorage(GLES20.GL_RENDERBUFFER, GLES20.GL_DEPTH_COMPONENT16, width, height);
    GLES20.glFramebufferRenderbuffer(GLES20.GL_FRAMEBUFFER, GLES20.GL_DEPTH_ATTACHMENT,
            GLES20.GL_RENDERBUFFER, renderbuffer[0]);

    // Attach the color texture and verify completeness before use.
    GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
            GLES20.GL_TEXTURE_2D, textureId[0], 0);
    int status = GLES20.glCheckFramebufferStatus(GLES20.GL_FRAMEBUFFER);
    if (status != GLES20.GL_FRAMEBUFFER_COMPLETE) {
        throw new RuntimeException("Framebuffer not complete: " + status);
    }
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
}
Filter Application:
Using the FBO, the raw texture is filtered into an off-screen texture. The filtered texture is then drawn both to the GLSurfaceView and, during recording, to the MediaCodec input surface:
@Override
public void onDrawFrame(GL10 gl) {
    // Draw the OES texture into the off-screen texture.
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, framebuffer);
    filter.draw();
    GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);

    // Retrieve the filtered texture ID.
    int filteredTextureId = filter.getOutputTextureId();

    // Pass the texture to the encoder for recording.
    videoEncoder.setTextureId(filteredTextureId);
    videoEncoder.frameAvailable(surfaceTexture);

    // Render the filtered texture on screen.
    displayFilter.setTextureId(filteredTextureId);
    displayFilter.onDrawFrame();
}
Integrating MediaCodec for Recording:
Recording Workflow:
- Link the EGLContext between the rendering thread and the encoder.
- Use MediaCodec's InputSurface for feeding texture data.
- Synchronize frame rendering and encoding through shared EGL resources.
Encoder Initialization:
private void prepareEncoder(EGLContext sharedContext, int width, int height, int bitrate) {
    videoEncoder = new MediaCodecEncoderCore(width, height, bitrate);
    eglCore = new EglCore(sharedContext, EglCore.FLAG_RECORDABLE);
    inputWindowSurface = new WindowSurface(eglCore, videoEncoder.getInputSurface(), true);
    inputWindowSurface.makeCurrent();
}
Frame Encoding:
@Override
public void handleFrameAvailable(float[] transformMatrix, long timestamp) {
    videoEncoder.drainEncoder(false);                        // pull any pending encoded output
    fullFrameRenderer.drawFrame(textureId, transformMatrix); // draw the filtered texture to the input surface
    inputWindowSurface.setPresentationTime(timestamp);       // timestamp in nanoseconds
    inputWindowSurface.swapBuffers();                        // submit the frame to the encoder
}
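`setPresentationTime` expects the timestamp in nanoseconds; with SurfaceTexture input it typically comes from `SurfaceTexture.getTimestamp()`. If frames instead need to be stamped at a fixed rate, the computation is simple arithmetic. A sketch (the class and method names are assumptions):

```java
public class FrameTimestamps {
    private static final long NANOS_PER_SECOND = 1_000_000_000L;

    /**
     * Presentation time in nanoseconds for frame {@code frameIndex}
     * at a constant {@code framesPerSecond}.
     */
    public static long presentationTimeNanos(long frameIndex, int framesPerSecond) {
        // Multiply before dividing to avoid rounding away sub-second precision.
        return frameIndex * NANOS_PER_SECOND / framesPerSecond;
    }
}
```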
Summary:
- Camera preview and recording use OpenGL textures routed through GLSurfaceView for rendering and MediaCodec for encoding.
- Filters are applied using off-screen techniques, where textures are processed via Framebuffer Objects.
- The EGLContext is shared between threads to ensure seamless integration of rendering and encoding.
For a detailed demo, visit: GitHub Repository.