
Rendering pipeline setup


If you've never dealt with a 3D graphics API before, you might expect drawing a triangle to be simple, especially in JavaScript: something like canvas.drawTriangle(p1, p2, p3). However, it's not.

A 3D graphics API is designed not to draw simple primitives but to render high-polygon models with complex shading effects, and every step of the rendering process must be configurable by the user in detail. As a result, there's really no difference in code between drawing a single triangle and drawing a high-polygon 3D model. That's why things may seem overcomplicated at first, when we start with a single triangle. I'll try to keep things as simple as possible, though.

Anyway, in production code you shouldn't call WebGL functions directly, even if your project is really tiny. The WebGL API is too low-level to be used in an application as-is. You should wrap it in at least one additional level of indirection, just to keep your code simple and avoid copy-pasting. We will build something like that over the next lessons.

You'll also sometimes need linear algebra to perform operations on 3D models. This mostly means vector and matrix math. Unfortunately, there are no built-in linear algebra tools in JavaScript, so we'll need to find a good third-party library for that. We'll get back to linear algebra on the JavaScript side a bit later.
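Just to give a taste of the kind of vector math involved, here is a minimal plain-JavaScript sketch (in later lessons a dedicated library is a better choice; the function names here are our own illustrative picks):

```javascript
// Dot product of two 3D vectors
function dot(a, b) {
  return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Component-wise sum of two 3D vectors
function add(a, b) {
  return [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
}

// Length (magnitude) of a 3D vector
function length(a) {
  return Math.sqrt(dot(a, a));
}

length(add([1, 0, 0], [0, 2, 0])); // → √5 ≈ 2.236
```

Real libraries add matrices, transformations, projections, and much more on top of primitives like these.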

Basically, you need to perform the following steps to render a single triangle (or a 3D model):

Implement a vertex shader program. A vertex shader is a program that converts given vertex coordinates into screen-space (clip-space) coordinates. The vertex shader is called for each vertex of a given geometry model.
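As a preview, here is roughly what a minimal vertex shader looks like. It's written in GLSL but kept as a JavaScript string, since that's how we'll hand it to WebGL for compilation later; the attribute name aPosition is our own illustrative choice, not a built-in:

```javascript
// A minimal GLSL vertex shader stored as a JavaScript string.
// It forwards each vertex position to the GPU unchanged.
const vertexShaderSource = `
  attribute vec3 aPosition;

  void main() {
    // gl_Position is the built-in output: the vertex position
    // (in clip space) that the GPU will rasterize.
    gl_Position = vec4(aPosition, 1.0);
  }
`;
```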

Implement a fragment shader program. A fragment shader is a program that tells the GPU which color should be used for a pixel. The fragment shader is called for every rasterized pixel of a model.
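And here is a minimal fragment shader to match, again as a GLSL source string inside JavaScript. This one simply paints every rasterized pixel solid red:

```javascript
// A minimal GLSL fragment shader stored as a JavaScript string.
const fragmentShaderSource = `
  precision mediump float;

  void main() {
    // gl_FragColor is the built-in output: the RGBA color of the pixel.
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);
  }
`;
```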

Create a memory buffer on the GPU side and transfer the geometry data into that buffer. Geometry data is just an array of floating-point numbers.
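This step can be sketched with a small helper. It assumes gl is a WebGLRenderingContext obtained from a canvas; the helper name createGeometryBuffer is our own:

```javascript
// A sketch of uploading geometry to the GPU.
// Assumes `gl` is a WebGLRenderingContext.
function createGeometryBuffer(gl, floats) {
  const buffer = gl.createBuffer();        // allocate a GPU-side buffer
  gl.bindBuffer(gl.ARRAY_BUFFER, buffer);  // make it the active array buffer
  // Copy the raw floating-point data into GPU memory.
  // STATIC_DRAW hints that we'll upload once and draw many times.
  gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(floats), gl.STATIC_DRAW);
  return buffer;
}
```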

Describe how the data in that buffer should be interpreted by the GPU. Something like: “Hey, GPU! Here is an array buffer where my geometry is stored! The first three numbers in the array are the 3D coordinates of a vertex, the next three numbers are the RGB color of that vertex, and there are 150 vertices in the buffer, so all the data is packed into (3 + 3) * 150 floating-point numbers! Cheers!”.
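In code, that imaginary conversation becomes a layout description in bytes. The sketch below assumes gl is a WebGLRenderingContext and that the attribute locations were queried from a linked program earlier; the helper name describeLayout is our own:

```javascript
// Describing the interleaved layout from the example above:
// each vertex is 3 position floats followed by 3 color floats.
const FLOAT_BYTES = 4;          // a 32-bit float takes 4 bytes
const POSITION_COMPONENTS = 3;  // x, y, z
const COLOR_COMPONENTS = 3;     // r, g, b
const STRIDE = (POSITION_COMPONENTS + COLOR_COMPONENTS) * FLOAT_BYTES; // 24 bytes per vertex
const COLOR_OFFSET = POSITION_COMPONENTS * FLOAT_BYTES;                // colors start at byte 12

// A sketch of telling the GPU about this layout.
function describeLayout(gl, positionLocation, colorLocation) {
  gl.enableVertexAttribArray(positionLocation);
  gl.vertexAttribPointer(positionLocation, POSITION_COMPONENTS, gl.FLOAT, false, STRIDE, 0);
  gl.enableVertexAttribArray(colorLocation);
  gl.vertexAttribPointer(colorLocation, COLOR_COMPONENTS, gl.FLOAT, false, STRIDE, COLOR_OFFSET);
}
```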

Set up the render states: whether the depth test should be enabled, whether pixels should be blended or overwritten, etc.
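A sketch of what that setup typically looks like, again assuming gl is a WebGLRenderingContext (the helper name and the particular choice of states are ours):

```javascript
// Common render-state setup for simple opaque geometry.
function setupRenderStates(gl) {
  gl.enable(gl.DEPTH_TEST);          // discard pixels hidden behind closer geometry
  gl.disable(gl.BLEND);              // overwrite pixels instead of blending with them
  gl.clearColor(0.0, 0.0, 0.0, 1.0); // color used when clearing the screen (opaque black)
}
```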

Finally, draw the geometry from the given GPU buffer. “Hey, GPU! Could you please draw the geometry from this buffer using these shader programs? Oh, by the way, you should treat every three vertices in the buffer as a triangle! Have a nice day!”.
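The draw call itself is short. This sketch assumes gl is a WebGLRenderingContext, program is an already linked shader program, and the buffer holds the interleaved layout from above (6 floats per vertex); the helper name is our own:

```javascript
// A sketch of the final draw call.
function drawGeometry(gl, program, floatCount) {
  const vertexCount = floatCount / 6; // 6 floats describe one vertex (3 position + 3 color)
  gl.useProgram(program);             // activate our shader pair
  // TRIANGLES: every three consecutive vertices form one triangle.
  gl.drawArrays(gl.TRIANGLES, 0, vertexCount);
}
```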

This may look like a long way to a simple goal, but you only have to walk it once. We'll split it into several lessons and learn how to do each of the steps. Divide et impera!


Learning plan

How to set up a canvas and get its WebGL context
How to set up the WebGL render loop properly
The very basics of how things are being presented and processed in 3D graphics
5. Rendering pipeline setup
The very basics of steps you should perform to draw a triangle or a 3D model from scratch
Let's create a simple vertex shader, compile it, and check for compilation issues
Now let's create a fragment shader, compile it and check for errors
It's time to link our shaders into a completed GLSL program