There are many introductions to OpenGL ES on the Internet, but because OpenGL ES covers so much ground, those articles tend to be long and scattered, rarely get to the point, and leave beginners as confused as when they started. Many people (the author included) first encounter OpenGL ES through real-time filters, usually via the implementation in the open-source framework GPUImage. Without a grasp of basic OpenGL ES development, it is hard to understand why that code works the way it does.

Today's popular short-video special effects also rely on OpenGL. So the author, having stepped into countless pitfalls, wants to save later learners some detours by starting from the most practical scenario of all: displaying a picture.

This article is for beginners on Android who want to learn OpenGL ES 2.0+ and understand how real-time OpenGL filters work.


Getting ready

Before we start implementing, let's cover some basics: the fundamental theory of 2D/3D drawing in OpenGL ES. We will only discuss the concepts needed to draw a picture.


Coordinate system

OpenGL has its own coordinate system. Before any transformation, the initial coordinate system is a 3D coordinate system in which x, y, and z each range over [-1, 1]:


Since we are drawing a 2D picture, we can simplify this to a two-dimensional coordinate system (only the x and y axes). The origin is at the center of the window, with x pointing right and y pointing up:
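To make this mapping concrete, here is a small sketch (the class and method names are mine, not from the article) that converts a pixel position in a window into this normalized coordinate system:

```java
// Convert a pixel position in a window to OpenGL's normalized device
// coordinates (NDC), where both axes run from -1 to 1, the origin is the
// window center, and y points up (screen y grows downward).
public class NdcMapper {
    public static float[] pixelToNdc(float px, float py, float windowWidth, float windowHeight) {
        float x = (px / windowWidth) * 2f - 1f;  // left edge -> -1, right edge -> +1
        float y = 1f - (py / windowHeight) * 2f; // top edge -> +1, bottom edge -> -1
        return new float[]{x, y};
    }
}
```

For a 1080x1920 window, the pixel at the center (540, 960) maps to (0, 0), and the top-left pixel (0, 0) maps to (-1, 1).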

Now there is a problem: the width-to-height ratio of our screen or display window is usually not 1:1 (it is not a square), so how does it correspond to OpenGL's initial world coordinate system? If we do not specify a projection, the world coordinate system is stretched to fill the entire display window, which distorts what we draw. For example, projecting the triangle above onto the window displays it like this:

Specifying a proper projection requires projection and matrix transformations. In this article we stick with the original world coordinate system and instead correct the drawn vertex coordinates by the stretch ratio, so that, for example, the triangle above displays normally.
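The vertex-correction idea can be sketched as a small standalone computation (the names are mine; the adjustImageScaling method later in the article does the equivalent on the quad's vertices):

```java
// Center-inside fit: compute the factors by which to scale a full-window
// quad's NDC coordinates so a w x h image keeps its aspect ratio inside a
// W x H window instead of stretching to fill it.
public class AspectFit {
    // Returns {xScale, yScale}; multiply each vertex's x and y by these.
    public static float[] fitScale(int imageW, int imageH, int windowW, int windowH) {
        // Largest uniform scale at which the image still fits in the window
        float ratio = Math.min((float) windowW / imageW, (float) windowH / imageH);
        float fittedW = imageW * ratio; // image size after uniform scaling
        float fittedH = imageH * ratio;
        // Fraction of the window the fitted image occupies on each axis
        return new float[]{fittedW / windowW, fittedH / windowH};
    }
}
```

For a 512x512 image in a 1024x2048 window, the quad keeps full width (scale 1.0) but only half the height (scale 0.5), so the square image stays square.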


Vertex coordinates

In OpenGL ES, three kinds of primitives can be rendered: points, lines, and triangles. All other shapes are built from these three. For example, the smooth spheres we see are actually made of triangles; the more triangles, the smoother the result looks:

To draw a shape, we provide the positions of its vertices and then specify how those vertices should be drawn, which gives us the shape we want.

When we display the picture later, we will draw a rectangle made of two triangles, using the GL_TRIANGLE_STRIP drawing mode (every three adjacent vertices form a triangle, producing a strip of connected triangles):
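A quick way to see what GL_TRIANGLE_STRIP does is to enumerate which vertex indices form each triangle. This little helper (mine, for illustration) does exactly that:

```java
import java.util.ArrayList;
import java.util.List;

// GL_TRIANGLE_STRIP turns N vertices into N-2 triangles: every new vertex
// forms a triangle with the previous two. For the 4-vertex quad used later
// in this article, that yields triangles (v1,v2,v3) and (v2,v3,v4).
public class StripTriangles {
    public static List<int[]> triangles(int vertexCount) {
        List<int[]> tris = new ArrayList<>();
        for (int i = 2; i < vertexCount; i++) {
            tris.add(new int[]{i - 2, i - 1, i}); // zero-based indices of one triangle
        }
        return tris;
    }
}
```

So drawing 4 vertices with GL_TRIANGLE_STRIP produces the 2 triangles that cover our rectangle.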


Texture mapping

What we want to show is a picture, yet everything so far has been about drawing shapes. It is like hanging wallpaper: first you build the wall, then you decide where each part of the wallpaper goes on it. In OpenGL this process is called texture mapping.

Texture mapping uses UV coordinates. An image file is a two-dimensional plane, and UV coordinates let us locate any pixel in it. On Android, the origin of the UV coordinate system is the top-left corner:

We define a UV coordinate for each vertex, in the same order as the vertices are rendered. The figure below shows the four vertices we define, drawn as a rectangle:

In vertex rendering order, the UV coordinate of each vertex is:

Once the texture coordinates of these specific vertices are set, everything between the vertices is smoothly interpolated, so the whole texture ends up displayed.
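That interpolation can be illustrated with plain linear interpolation (a sketch with made-up names; the GPU does this per fragment, in a perspective-correct way):

```java
// A fragment that lies 30% of the way between two vertices samples the
// texture 30% of the way between their UV coordinates. This shows the
// idea along one edge.
public class UvLerp {
    public static float lerp(float a, float b, float t) {
        return a + (b - a) * t; // linear interpolation between a and b
    }

    public static float[] lerpUv(float[] uvA, float[] uvB, float t) {
        return new float[]{lerp(uvA[0], uvB[0], t), lerp(uvA[1], uvB[1], t)};
    }
}
```

For example, halfway (t = 0.5) between a vertex with UV (0,0) and one with UV (1,1), the fragment samples the texture at (0.5, 0.5).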



Rasterization

Rasterization is the process of converting vertex data into fragments. Each fragment corresponds to a pixel in the frame buffer.

The 3D geometry of the virtual world is projected onto the 2D screen. Because the screens of today's display devices are discrete (made up of pixels), the projection result must also be discretized into small units. These small units are called fragments (Fragment).



Shaders

OpenGL ES 2.0 uses a programmable rendering pipeline. "Programmable" means we write our own shader code in GLSL. OpenGL has vertex shaders (Vertex Shader) and fragment shaders (Fragment Shader).

The vertex shader computes the final position of each vertex; the vertex data is passed to it by our program. Since displaying a picture requires no vertex transformation, our vertex shader does almost nothing per vertex. The fragment shader computes the final color of each fragment; here we only need to sample the texture data that is passed in.


Let's get started!

Using OpenGL on Android involves two basic classes: GLSurfaceView and GLSurfaceView.Renderer.

* GLSurfaceView extends the SurfaceView class and is dedicated to displaying OpenGL-rendered graphics. You can think of GLSurfaceView as the window we use to show OpenGL graphics.
* GLSurfaceView.Renderer is the renderer for a GLSurfaceView, set via GLSurfaceView.setRenderer().
    // The three callbacks of GLSurfaceView.Renderer:
    public interface Renderer {
        // Called when the Surface is created; do your initialization here
        void onSurfaceCreated(GL10 gl, EGLConfig config);

        // Called when the Surface size changes; set the viewport size here
        void onSurfaceChanged(GL10 gl, int width, int height);

        // Called to draw each frame
        void onDrawFrame(GL10 gl);
    }

One thing deserves special mention: the Renderer callbacks run on a dedicated thread, so any OpenGL operation must be performed on that GL thread. You can post work to it with GLSurfaceView.queueEvent(Runnable), or maintain your own queue and drain it when the Renderer callbacks run.
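The "queue work for the GL thread" pattern can be sketched in plain Java (a simplified illustration, not the actual GLSurfaceView implementation):

```java
import java.util.LinkedList;
import java.util.Queue;

// Other threads enqueue Runnables; the renderer drains the queue at the
// start of each frame, so all GL calls end up on the GL thread.
// (GLSurfaceView.queueEvent does essentially this for you.)
public class GlTaskQueue {
    private final Queue<Runnable> pending = new LinkedList<>();

    public synchronized void post(Runnable task) { // safe to call from any thread
        pending.add(task);
    }

    public void drainOnGlThread() { // call from onDrawFrame
        while (true) {
            Runnable task;
            synchronized (this) {
                task = pending.poll();
            }
            if (task == null) break;
            task.run(); // runs on the caller's (GL) thread
        }
    }
}
```

Tasks run outside the lock so a long task does not block other threads from posting.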

The code is as follows:

    public class GLShowImageActivity extends Activity {

        // How the picture is drawn: define the vertices of a rectangular region,
        // then paste the image into that rectangle as a texture according to the
        // texture coordinates.

        // Vertex coordinates of the rectangle. Since glDrawArrays (the "vertex
        // method") is used later, no index array is needed. Regardless of the
        // window size, in OpenGL's 2D coordinate system the full window is the
        // square below, with the window center as the origin (0,0).
        static final float CUBE[] = {
                -1.0f, -1.0f, // v1
                 1.0f, -1.0f, // v2
                -1.0f,  1.0f, // v3
                 1.0f,  1.0f, // v4
        };

        // Textures have their own coordinate system, called UV (or ST) coordinates.
        // Whatever the image size, its top-left corner is (0,0) and its
        // bottom-right corner is (1,1). Each texture coordinate below corresponds
        // to the vertex at the same position above.
        public static final float TEXTURE_NO_ROTATION[] = {
                0.0f, 1.0f, // v1
                1.0f, 1.0f, // v2
                0.0f, 0.0f, // v3
                1.0f, 0.0f, // v4
        };

        private GLSurfaceView mGLSurfaceView;
        private int mGLTextureId = OpenGlUtils.NO_TEXTURE; // texture id
        private GLImageHandler mGLImageHandler = new GLImageHandler();
        private FloatBuffer mGLCubeBuffer;
        private FloatBuffer mGLTextureBuffer;
        private int mOutputWidth, mOutputHeight; // window size
        private int mImageWidth, mImageHeight;   // actual size of the bitmap

        @Override
        protected void onCreate(@Nullable Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            setContentView(R.layout.activity_01);
            // NOTE: the view id was garbled in the original text;
            // substitute your own layout's GLSurfaceView id here
            mGLSurfaceView = findViewById(R.id.gl_surface_view);
            mGLSurfaceView.setEGLContextClientVersion(2); // create an OpenGL ES 2.0 context
            mGLSurfaceView.setRenderer(new MyRender());
            mGLSurfaceView.setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY); // render on demand
        }

        private class MyRender implements GLSurfaceView.Renderer {
            @Override
            public void onSurfaceCreated(GL10 gl, EGLConfig config) {
                GLES20.glClearColor(0, 0, 0, 1);
                GLES20.glDisable(GLES20.GL_DEPTH_TEST); // disable when drawing transparent images
                mGLImageHandler.init();

                // The picture to display
                Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.thelittleprince);
                mImageWidth = bitmap.getWidth();
                mImageHeight = bitmap.getHeight();
                // Upload the image data to the GPU and get the corresponding texture id
                mGLTextureId = OpenGlUtils.loadTexture(bitmap, mGLTextureId, true);

                // Vertex buffer
                mGLCubeBuffer = ByteBuffer.allocateDirect(CUBE.length * 4)
                        .order(ByteOrder.nativeOrder())
                        .asFloatBuffer();
                mGLCubeBuffer.put(CUBE).position(0);

                // Texture-coordinate buffer
                mGLTextureBuffer = ByteBuffer.allocateDirect(TEXTURE_NO_ROTATION.length * 4)
                        .order(ByteOrder.nativeOrder())
                        .asFloatBuffer();
                mGLTextureBuffer.put(TEXTURE_NO_ROTATION).position(0);
            }

            @Override
            public void onSurfaceChanged(GL10 gl, int width, int height) {
                mOutputWidth = width;
                mOutputHeight = height;
                GLES20.glViewport(0, 0, width, height); // set the viewport size
                // Adjust the displayed size; without this call the picture is
                // stretched to fill the whole window
                adjustImageScaling();
            }

            @Override
            public void onDrawFrame(GL10 gl) {
                // Draw the image from the texture id, using the vertex and
                // texture coordinate data
                mGLImageHandler.onDraw(mGLTextureId, mGLCubeBuffer, mGLTextureBuffer);
            }

            // Scale the quad so the image is centered without distortion
            private void adjustImageScaling() {
                float outputWidth = mOutputWidth;
                float outputHeight = mOutputHeight;
                float ratio1 = outputWidth / mImageWidth;
                float ratio2 = outputHeight / mImageHeight;
                float ratioMax = Math.min(ratio1, ratio2);
                // Image size after center-inside scaling
                int imageWidthNew = Math.round(mImageWidth * ratioMax);
                int imageHeightNew = Math.round(mImageHeight * ratioMax);
                // How much a full-window quad would stretch the image
                float ratioWidth = outputWidth / imageWidthNew;
                float ratioHeight = outputHeight / imageHeightNew;
                // Shrink the vertices by the stretch ratio to undo the distortion
                float[] cube = new float[]{
                        CUBE[0] / ratioWidth, CUBE[1] / ratioHeight,
                        CUBE[2] / ratioWidth, CUBE[3] / ratioHeight,
                        CUBE[4] / ratioWidth, CUBE[5] / ratioHeight,
                        CUBE[6] / ratioWidth, CUBE[7] / ratioHeight,
                };
                mGLCubeBuffer.clear();
                mGLCubeBuffer.put(cube).position(0);
            }
        }
    }
I won't go over GLSL syntax and usage here. My advice: first understand the main jobs of the vertex shader and the fragment shader, and then, once you have digested this tutorial, read up on shaders if you are interested. To display a picture the shader code we need is simple and fully annotated, so it won't get in the way of understanding.
    /**
     * Responsible for displaying a picture.
     */
    public class GLImageHandler {
        // The pipeline runs the vertex shader once per vertex in the input data
        public static final String NO_FILTER_VERTEX_SHADER = "" +
                "attribute vec4 position;\n" +               // vertex coordinate, passed in by our program
                "attribute vec4 inputTextureCoordinate;\n" + // texture coordinate, passed in by our program
                " \n" +
                "varying vec2 textureCoordinate;\n" +
                " \n" +
                "void main()\n" +
                "{\n" +
                "    gl_Position = position;\n" +            // final vertex position
                "    textureCoordinate = inputTextureCoordinate.xy;\n" +
                "}";

        // Rasterization produces some number of fragments; the varying variables
        // are interpolated for each of them, and the pipeline runs the fragment
        // shader once per fragment
        public static final String NO_FILTER_FRAGMENT_SHADER = "" +
                "varying highp vec2 textureCoordinate;\n" + // interpolated from the vertex shader's varying above
                " \n" +
                "uniform sampler2D inputImageTexture;\n" +  // the image texture passed in from outside
                " \n" +
                "void main()\n" +
                "{\n" +
                "     gl_FragColor = texture2D(inputImageTexture, textureCoordinate);\n" + // texture sampling
                "}";

        private final LinkedList<Runnable> mRunOnDraw;
        private final String mVertexShader;
        private final String mFragmentShader;
        protected int mGLProgId;
        protected int mGLAttribPosition;
        protected int mGLUniformTexture;
        protected int mGLAttribTextureCoordinate;

        public GLImageHandler() {
            this(NO_FILTER_VERTEX_SHADER, NO_FILTER_FRAGMENT_SHADER);
        }

        public GLImageHandler(final String vertexShader, final String fragmentShader) {
            mRunOnDraw = new LinkedList<Runnable>();
            mVertexShader = vertexShader;
            mFragmentShader = fragmentShader;
        }

        public final void init() {
            // Compile and link the shaders into a shader program
            mGLProgId = OpenGlUtils.loadProgram(mVertexShader, mFragmentShader);
            mGLAttribPosition = GLES20.glGetAttribLocation(mGLProgId, "position"); // vertex coordinate attribute
            mGLUniformTexture = GLES20.glGetUniformLocation(mGLProgId, "inputImageTexture"); // image texture uniform
            mGLAttribTextureCoordinate = GLES20.glGetAttribLocation(mGLProgId, "inputTextureCoordinate"); // texture coordinate attribute
        }

        public void onDraw(final int textureId, final FloatBuffer cubeBuffer, final FloatBuffer textureBuffer) {
            GLES20.glUseProgram(mGLProgId);

            // Vertex coordinates
            cubeBuffer.position(0);
            GLES20.glVertexAttribPointer(mGLAttribPosition, 2, GLES20.GL_FLOAT, false, 0, cubeBuffer);
            GLES20.glEnableVertexAttribArray(mGLAttribPosition);

            // Texture coordinates
            GLES20.glVertexAttribPointer(mGLAttribTextureCoordinate, 2, GLES20.GL_FLOAT, false, 0, textureBuffer);
            GLES20.glEnableVertexAttribArray(mGLAttribTextureCoordinate);

            // Bind the image texture
            if (textureId != OpenGlUtils.NO_TEXTURE) {
                GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, textureId);
                GLES20.glUniform1i(mGLUniformTexture, 0);
            }

            // Draw the vertices ("vertex method", as opposed to the index method):
            // primitives are assembled from the vertices in input order, using the
            // chosen mode. GL_TRIANGLE_STRIP makes a triangle from every three
            // adjacent vertices, forming a connected strip.
            GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, 0, 4);

            GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, 0);
        }
    }

The OpenGlUtils class used above is a small utility class responsible for loading the texture (returning its id) and compiling the shader code. The details are boilerplate and not shown here; interested readers can check the project code for this article.

And with that, our picture is displayed.

Things to watch out for

* You must set the OpenGL version on the GLSurfaceView: setEGLContextClientVersion(2); // 2.0. Otherwise you will get an error like: glDrawArrays is called with VERTEX_ARRAY client state disabled!

* GPU-related calls must be made on the GL thread, otherwise you will get: call to OpenGL ES API with no current context. For example, the texture id cannot be created on the UI thread during initialization; it has to happen in or after onSurfaceCreated.

Full code address



That's the end of this introduction to OpenGL ES. Although I tried to explain everything as simply as possible, the topic still touches on a lot, so I hope readers will point out anything lacking! What we covered above is in fact the core of how the open-source GPUImage library works, and many popular short-video effects are also processed with OpenGL. I hope this article gives your OpenGL journey a little help.

Finally, thank you all for your support! Depending on the response to this article, I will consider writing a follow-up on implementing filters (which, in fact, mainly comes down to writing shaders).