WebGL is a browser graphics library based on OpenGL ES, a version of OpenGL for embedded devices. It enables hardware-accelerated real-time 3D rendering in modern browsers, including the use of shaders (which play an important role in building 3D configurators). There are a variety of scenarios in which such a library is useful, such as browser games, 3D maps or product views. The WebGL interface is accessible via plain JavaScript, and entire frameworks such as three.js are built on top of it.

In this tutorial I will cover the basics needed to start experimenting with WebGL and fragment shaders, as they make the underlying concepts relatively easy to grasp.

As already mentioned, WebGL is accessed via JavaScript. For my experiments I use the latest version of Chrome and a simple text editor. The frame for our toy will be a minimalistic HTML5 page containing a canvas element, because WebGL draws into canvas elements:

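The original listing is lost; a minimal page along these lines would do, where the element id glscreen is my invented name:

```html
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>WebGL toy</title>
    <script>
      // The JavaScript from the following sections goes here.
    </script>
  </head>
  <body>
    <canvas id="glscreen"></canvas>
  </body>
</html>
```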

The result is an empty page. WebGL is initialized by retrieving a WebGL context object from the canvas, which is then used for all further WebGL operations. Here is the JavaScript that initializes WebGL and clears the canvas with a red color:

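The original code is missing; here is a plausible reconstruction of what the text describes, assuming a canvas with id glscreen and a size of 640×480:

```javascript
var gl;
var canvas;

window.onload = init;

function init() {
  canvas = document.getElementById("glscreen");
  gl = canvas.getContext("webgl") || canvas.getContext("experimental-webgl");
  canvas.width = 640;
  canvas.height = 480;
  gl.viewport(0, 0, canvas.width, canvas.height);
  render();
}

function render() {
  window.requestAnimationFrame(render); // keep drawing continuously
  gl.clearColor(1.0, 0.0, 0.0, 1.0);    // red with full opacity
  gl.clear(gl.COLOR_BUFFER_BIT);
}
```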

The code is quite simple. We declare two global variables, the GL context and the canvas object, which we need for the basic initialization and to perform the drawing continuously in the render function. Next, we set the window's onload event to init.

The initialization function retrieves the HTML canvas object and initializes a WebGL object based on it. We define the height and width of the canvas and then initialize the so-called viewport of the WebGL object. Then a first call for rendering is made to actually draw something.

The render function first requests the next animation frame to ensure continuous rendering. Then the clear color is set to red with full opacity and gl.clear is called. The call to gl.clear takes a constant specifying that the color buffer of the screen should be affected.

So far this is not too exciting. To get something more useful and something to play around with, we need the concept of a shader. Shaders are small programs that usually run in parallel on the graphics processor. The two shader types relevant here are the so-called fragment shader and the vertex shader. In the case of WebGL, they are written in a derivative of GLSL.

The 3D graphics pipeline works as follows: a 3D model is loaded, which is basically a set of points in three-dimensional space and a description of how these points form polygons. This model, and especially its set of vertices, is then transformed from a local, natural description into one that can be used directly to draw the polygons. The various transformations do not play a major role here, but it is worth noting that scaling, rotation and perspective correction are all performed during this transformation of a mesh from object space to screen space. Most of these calculations can be performed vertex by vertex with a vertex shader. However, we will only use a trivial vertex shader that performs no transformations, so you can ignore most of the details in this section.

With the normalized information, the render engine can then use relatively simple algorithms to draw polygons pixel by pixel. This is where the fragment shader comes in. Imagine fragments as pixels covered by a polygon. For each fragment, a fragment shader is called which has at least the coordinates of the fragment as input and returns the color to be rendered as output.

Fragment shaders are usually used for lighting and post-processing. The idea for a simple experiment, however, is as follows: we draw two polygons (triangles) that together cover the entire visible screen. Using a fragment shader we can then determine the color of each pixel on the screen. So we can think of the fragment shader as a function from screen coordinates to colors and experiment with such functions. It turns out to be a very fun toy. In fact, what you can draw this way is limited only by your curiosity; entire ray-tracing engines have been implemented as fragment shaders. But we'll start with something simpler. Let's begin with some code that defines the framework for our experiments.

First, we need to draw a primitive on the empty screen: a quad consisting of two triangles that fills the whole screen and serves as a canvas for our fragment shader, as described above. For this we introduce a new global variable that contains its description:

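The missing listing presumably contained the six corner points of the two triangles in clip-space coordinates; the variable name bufferData is my assumption:

```javascript
// Two triangles covering the whole screen; in clip space,
// x and y each run from -1 (left/bottom) to 1 (right/top).
var bufferData = new Float32Array([
  -1.0, -1.0,
   1.0, -1.0,
  -1.0,  1.0,
  -1.0,  1.0,
   1.0, -1.0,
   1.0,  1.0
]);
```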

We initialize the buffer in our init function:

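A sketch of the buffer setup, assuming globals named buffer and bufferData (the quad's vertex array):

```javascript
buffer = gl.createBuffer();                                 // create a GPU buffer
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);                     // make it the active array buffer
gl.bufferData(gl.ARRAY_BUFFER, bufferData, gl.STATIC_DRAW); // upload the vertex data
```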

The values are exactly what we want, because by default -1 and 1 map to the borders of our screen. The primitive is then drawn in the render function, like this:

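The draw call itself is presumably a one-liner at the end of render():

```javascript
gl.drawArrays(gl.TRIANGLES, 0, 6); // draw 6 vertices as 2 triangles
```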

If you run this code, you won't see anything. That's because WebGL still doesn't know how to draw the quad. This is where our two shaders come in. As mentioned before, we need both a vertex and a fragment shader. We add them as new script elements above our JavaScript. Our very simple standard vertex shader looks like this:

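A reconstruction of the trivial vertex shader as a script element; the id and the attribute name a_position are my assumptions:

```html
<script id="2d-vertex-shader" type="x-shader/x-vertex">
  attribute vec2 a_position;

  void main() {
    // Pass the clip-space position through unchanged.
    gl_Position = vec4(a_position, 0.0, 1.0);
  }
</script>
```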

It tells WebGL to use exactly the same position for the final rendering as specified in the vertex description. In other words, we directly use the data input of the vertex as output. The pixel shader is slightly more interesting. First the script:

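A reconstruction of the gradient shader described below; the divisions by 640 and 480 match the canvas size used in this tutorial:

```html
<script id="2d-fragment-shader" type="x-shader/x-fragment">
  precision mediump float;

  void main() {
    // Red grows with x, green with y; both scaled into [0, 1].
    gl_FragColor = vec4(gl_FragCoord.x / 640.0,
                        gl_FragCoord.y / 480.0,
                        0.0, 1.0);
  }
</script>
```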

What we have here is a function that returns the final color of a fragment, or pixel, on our drawn canvas. This is done by setting the variable gl_FragColor. gl_FragCoord.x and gl_FragCoord.y are the x and y positions on our canvas in screen space, i.e. they range from 0 to 640 and from 0 to 480, as this is the size of the canvas on the screen. The fragment shader is called for each combination of these values and a color is calculated. In this case, we set the red component depending on the x position and the green component depending on the y position. The color components of gl_FragColor range from 0.0 to 1.0, so we have to divide each position by the corresponding canvas dimension to get values between 0 and 1.

Then we have to tell WebGL to use these shaders to render our primitive. This is done as follows. First we need to load the shaders in our init function, declaring four local variables:

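The exact variable names are lost; the declarations were presumably something like:

```javascript
var source;          // shader source code read from a script element
var vertexShader;    // compiled vertex shader
var fragmentShader;  // compiled fragment shader
var program;         // linked shader program
```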

And then insert the following code just before the render() call:

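A plausible reconstruction of the three blocks, assuming the shader script elements carry the ids 2d-vertex-shader and 2d-fragment-shader:

```javascript
// Compile the vertex shader from its script element.
source = document.getElementById("2d-vertex-shader").text;
vertexShader = gl.createShader(gl.VERTEX_SHADER);
gl.shaderSource(vertexShader, source);
gl.compileShader(vertexShader);

// Compile the fragment shader the same way.
source = document.getElementById("2d-fragment-shader").text;
fragmentShader = gl.createShader(gl.FRAGMENT_SHADER);
gl.shaderSource(fragmentShader, source);
gl.compileShader(fragmentShader);

// Link both shaders into a program and activate it.
program = gl.createProgram();
gl.attachShader(program, vertexShader);
gl.attachShader(program, fragmentShader);
gl.linkProgram(program);
gl.useProgram(program);
```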

This code is actually quite simple. We load the contents of the script elements, create a shader of the right type with a call to gl.createShader() and compile the shader code. The procedure is exactly the same for both the vertex and the fragment shader. The third block creates a so-called program, which is basically a way of rendering primitives. A program can consist of several shaders, but we need at least one vertex shader and one fragment shader. The program is created with a call to gl.createProgram() and the two shaders are attached. The program is then linked, and we instruct WebGL to use it for the following polygon rendering with a call to gl.useProgram().

If we run this code, we still don't see anything. This is because the shaders require their input variables to be set up. In this case we have to instruct WebGL to feed the vertex shader the position data from the buffer we created above. This is done as follows in our render function, just before the call to gl.drawArrays:

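A sketch of the attribute setup, assuming the vertex shader's position attribute is called a_position and that program and buffer are accessible here (e.g. stored globally):

```javascript
gl.bindBuffer(gl.ARRAY_BUFFER, buffer);
positionLocation = gl.getAttribLocation(program, "a_position");
gl.enableVertexAttribArray(positionLocation);
// Each vertex consists of 2 floats, tightly packed, starting at offset 0.
gl.vertexAttribPointer(positionLocation, 2, gl.FLOAT, false, 0, 0);
```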

The resulting code is functional and meets our expectations. Our canvas is no longer red, but looks like this:

[Image: the canvas showing a red-green gradient]

This is proof that the value range of gl_FragCoord is as described above. You could pause a little and think about it. One thing to keep in mind is that the highest y value (480) is actually at the top of the canvas: unlike typical screen coordinates, gl_FragCoord has its origin in the bottom-left corner, so y grows upwards.

This is where you can start playing around. A second example is the following fragment shader:

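A reconstruction of the circle shader; the center at (320, 240), the middle of the 640×480 canvas, is my assumption:

```glsl
precision mediump float;

void main() {
  // Offset of this fragment from the canvas center at (320, 240).
  float dx = gl_FragCoord.x - 320.0;
  float dy = gl_FragCoord.y - 240.0;

  // Inside a circle of radius 80? Compare squared distances.
  if (dx * dx + dy * dy < 80.0 * 80.0) {
    gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); // inside: red
  } else {
    gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); // outside: black
  }
}
```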

The bit at the beginning that sets the precision of float values seems to be necessary in order to use float variables in shaders. As you can see, you can use simple control structures in shaders. You have probably noticed immediately what the code does: the simple formula tests whether a point lies within a two-dimensional circle with a radius of 80. The result is as expected:

[Image: a red circle on a black background]

Go ahead, play around with it a little. What kinds of shapes can you create? As a last example we want to render the Mandelbrot set. Let's take a look at the following fragment shader:

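A hedged reconstruction of a Mandelbrot fragment shader; the mapping from pixels to the complex plane (the offsets and the scale of 200) is my choice:

```glsl
precision mediump float;

void main() {
  // Map the pixel to a point c in the complex plane.
  float cx = (gl_FragCoord.x - 440.0) / 200.0;
  float cy = (gl_FragCoord.y - 240.0) / 200.0;

  float x = 0.0;
  float y = 0.0;
  float count = 0.0; // second variable: remembers the iterations used

  // GLSL ES wants the loop counter declared in the for header with
  // constant bounds, so we guard the body with an if instead of
  // breaking out early.
  for (int i = 0; i < 100; i++) {
    if (x * x + y * y <= 4.0) {
      float tmp = x * x - y * y + cx; // z = z^2 + c, real part
      y = 2.0 * x * y + cy;           // imaginary part
      x = tmp;
      count += 1.0;
    }
  }

  // Shade by how quickly the point escapes.
  gl_FragColor = vec4(count / 100.0, 0.0, 0.0, 1.0);
}
```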

With a trick we can use a second variable to remember the number of iterations required. One catch is that GLSL ES requires the for loop to use a local counter variable declared in the head of the for loop, with constant bounds.
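To check the escape-time logic outside the shader, the same loop can be sketched in plain JavaScript (function name and parameters are mine):

```javascript
// Returns how many iterations of z -> z^2 + c are needed before |z|
// exceeds 2, capped at maxIter; points in the Mandelbrot set never escape.
function mandelbrotIterations(cx, cy, maxIter) {
  var x = 0.0;
  var y = 0.0;
  var i;
  for (i = 0; i < maxIter; i++) {
    if (x * x + y * y > 4.0) {
      break; // |z|^2 > 4 means |z| > 2: the point has escaped
    }
    var tmp = x * x - y * y + cx;
    y = 2.0 * x * y + cy;
    x = tmp;
  }
  return i;
}
```

For example, c = 0 lies in the set and never escapes, while c = 2 + 2i escapes after a single step.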

Playing around with these things can quickly become addictive. In fact, there are websites that provide simple frameworks for playing with fragment shaders, and entire communities revolve around them. Shadertoy is recommended: there you can share your shaders and immediately see and rate the results.

Thanks for reading.