Building Shaders With Babylon.js and WebGL: Theory and Examples


In the keynote for Day 2 of //Build 2014 (see 2:24-2:28), Microsoft evangelists Steven Guggenheimer and John Shewchuk demoed how Oculus Rift support was added to Babylon.js. One of the key elements of this demo was the work we did on a specific shader to simulate lenses, as you can see in this picture:

[Image: Lens simulation]

I also presented a session with Frank Olivier and Ben Constable about graphics in IE and Babylon.js.

This leads me to one of the questions people often ask me about Babylon.js: "What do you mean by shaders?" So in this post, I am going to explain to you how shaders work, and give some examples of common types of shaders.

The Theory

Before we start experimenting, we must first see how things work internally.

When dealing with hardware-accelerated 3D, there are two processors to consider: the main CPU and the GPU. The GPU is a kind of extremely specialized CPU.

The GPU is a state machine that you set up using the CPU. For instance, the CPU can configure the GPU to render lines instead of triangles, or define that transparency is on, and so on.

Once all the states are set, the CPU defines what to render: the geometry, which is composed of a list of points (called the vertices, stored in an array called the vertex buffer) and a list of indexes (the faces, or triangles, stored in an array called the index buffer).

The final step for the CPU is to define how to render the geometry, and for this specific task, the CPU defines shaders for the GPU. A shader is a piece of code that the GPU executes for each of the vertices and pixels it has to render.

First, some vocabulary: think of a vertex (vertices when there are several of them) as a “point” in a 3D environment (as opposed to a point in a 2D environment).

There are two kinds of shaders: vertex shaders, and pixel (or fragment) shaders.

Graphics Pipeline

Before digging into shaders, let’s take a step back. To render pixels, the GPU will take the geometry defined by the CPU and will do the following:

Using the index buffer, three vertices are gathered to define a triangle: the index buffer contains a list of vertex indexes. This means that each entry in the index buffer is the number of a vertex in the vertex buffer. This is really useful to avoid duplicating vertices. 

For instance, the following index buffer is a list of two faces: [1 2 3 1 3 4]. The first face contains vertex 1, vertex 2 and vertex 3. The second face contains vertex 1, vertex 3 and vertex 4. So there are four vertices in this geometry: 

[Image: Chart showing four vertices]
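
To make this concrete, here is a minimal JavaScript sketch of such buffers. The coordinates are invented for illustration, and note that real index buffers are 0-based, so the face [1 2 3] from the example above becomes [0, 1, 2]:

var vertices = new Float32Array([
    0, 0, 0,   // vertex 1: x, y, z
    1, 0, 0,   // vertex 2
    1, 1, 0,   // vertex 3
    0, 1, 0    // vertex 4
]);

var indices = new Uint16Array([
    0, 1, 2,   // first face: vertices 1, 2, 3
    0, 2, 3    // second face: vertices 1, 3, 4
]);

// Six indices, but only four vertices: the shared corners are not duplicated.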

The vertex shader is applied on each vertex of the triangle. The primary goal of the vertex shader is to produce a pixel for each vertex (the projection on the 2D screen of the 3D vertex): 

[Image: The vertex shader is applied on each vertex of the triangle]

Using these three pixels (which define a 2D triangle on the screen), the GPU will interpolate all values attached to the pixel (at least its position), and the pixel shader will be applied on every pixel included in the 2D triangle in order to generate a color for every pixel: 

[Image: The pixel shader is applied on every pixel included in the 2D triangle]

This process is done for every face defined by the index buffer. 

Obviously, due to its parallel nature, the GPU is able to process this step for a lot of faces simultaneously, and thereby achieve really good performance.

GLSL

We have just seen that to render triangles, the GPU needs two shaders: the vertex shader and the pixel shader. These shaders are written using a language called GLSL (OpenGL Shading Language). It looks like C.

For Internet Explorer 11, we developed a compiler to transform GLSL to HLSL (High Level Shader Language), which is the shader language of DirectX 11. This allows IE11 to ensure that the shader code is safe (you don't want a web page to use WebGL to reset your computer!):

[Image: Flow chart of transforming GLSL to HLSL]

Here is a sample of a common vertex shader:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec2 vUV;

void main(void) {
    gl_Position = worldViewProjection * vec4(position, 1.0);

    vUV = uv;
}

Vertex Shader Structure

A vertex shader contains the following:

  • Attributes: An attribute defines a portion of a vertex. By default, a vertex should at least contain a position (a vector3: x, y, z). But as a developer, you can decide to add more information. For instance, in the former shader, there is a vector2 named uv (texture coordinates that allow us to apply a 2D texture to a 3D object).
  • Uniforms: A uniform is a variable used by the shader and defined by the CPU. The only uniform we have here is a matrix used to project the position of the vertex (x, y, z) to the screen (x, y).
  • Varying: Varying variables are values created by the vertex shader and transmitted to the pixel shader. Here, the vertex shader will transmit a vUV (a simple copy of uv) value to the pixel shader. This means that a pixel is defined here with a position and texture coordinates. These values will be interpolated by the GPU and used by the pixel shader. 
  • main: The function named main() is the code executed by the GPU for each vertex and must at least produce a value for gl_Position (the position on the screen of the current vertex). 

We can see in our sample that the vertex shader is pretty simple. It generates a system variable (starting with gl_) named gl_Position to define the position of the associated pixel, and it sets a varying variable called vUV.

The Voodoo Behind Matrices

In our shader we have a matrix named worldViewProjection. We use this matrix to project the vertex position to the gl_Position variable. That is cool, but how do we get the value of this matrix? It is a uniform, so we have to define it on the CPU side (using JavaScript).

This is one of the complex parts of doing 3D. You must understand complex math (or you will have to use a 3D engine, like Babylon.js, which we are going to see later).

The worldViewProjection matrix is the combination of three different matrices:

[Image: The worldViewProjection matrix is the combination of three different matrices: world, view, and projection]

Using the resulting matrix allows us to transform 3D vertices into 2D pixels while taking into account the point of view and everything related to the position/scale/rotation of the current object.

This is your responsibility as a 3D developer: to create and keep this matrix up to date.
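
To give you an idea of what that looks like, here is a rough, hand-written sketch using Babylon.js's matrix helpers. The rotation, camera, and projection values are arbitrary; this illustrates the principle, not the exact code the engine runs internally:

// world: the position/rotation/scale of the object
var world = BABYLON.Matrix.RotationY(0.5).multiply(BABYLON.Matrix.Translation(0, 0, 0));

// view: where the camera is and what it looks at
var view = BABYLON.Matrix.LookAtLH(
    new BABYLON.Vector3(0, 5, -10),  // camera position
    BABYLON.Vector3.Zero(),          // target
    BABYLON.Vector3.Up());           // up vector

// projection: how the 3D scene is projected onto the 2D screen
var projection = BABYLON.Matrix.PerspectiveFovLH(0.8, 16 / 9, 0.1, 100.0);

// The combined matrix that the vertex shader receives as a uniform
var worldViewProjection = world.multiply(view).multiply(projection);

With a BABYLON.ShaderMaterial, as we will see below, you never have to write this by hand.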

Back to the Shaders

Once the vertex shader is executed on every vertex (three times, then), we have three pixels with a correct gl_Position and a vUV value. The GPU will then interpolate these values on every pixel contained in the triangle produced by these pixels.

Then, for each pixel, it will execute the pixel shader:

precision highp float;

varying vec2 vUV;

uniform sampler2D textureSampler;

void main(void) {
    gl_FragColor = texture2D(textureSampler, vUV);
}

Pixel (or Fragment) Shader Structure

The structure of a pixel shader is similar to a vertex shader:

  • Varying: Varying variables are values created by the vertex shader and transmitted to the pixel shader. Here the pixel shader will receive a vUV value from the vertex shader. 
  • Uniforms: A uniform is a variable used by the shader and defined by the CPU. The only uniform we have here is a sampler, which is a tool used to read texture colors.
  • main: The function named main is the code executed by the GPU for each pixel and must at least produce a value for gl_FragColor (the color of the current pixel). 

This pixel shader is fairly simple: it reads the color from the texture using texture coordinates from the vertex shader (which in turn got them from the vertex).

Do you want to see the result of such a shader? Here it is:

This is being rendered in real time; you can drag the sphere with your mouse.

To achieve this result, you would have to deal with a lot of WebGL code. Indeed, WebGL is a really powerful but really low-level API, and you have to do everything by yourself, from creating the buffers to defining vertex structures. You also have to do all the math, set all the states, handle texture loading, and so on…
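
To give you a taste of that plumbing, here is a heavily trimmed sketch of just the shader compilation and linking step in raw WebGL. Here, vertexShaderCode and fragmentShaderCode are assumed to be strings containing the GLSL shown above, and canvas is the target canvas element:

var gl = canvas.getContext("webgl");

function compileShader(source, type) {
    var shader = gl.createShader(type);
    gl.shaderSource(shader, source);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        throw new Error(gl.getShaderInfoLog(shader));
    }
    return shader;
}

var program = gl.createProgram();
gl.attachShader(program, compileShader(vertexShaderCode, gl.VERTEX_SHADER));
gl.attachShader(program, compileShader(fragmentShaderCode, gl.FRAGMENT_SHADER));
gl.linkProgram(program);
gl.useProgram(program);

// And this is only the beginning: you still have to create and fill the buffers,
// describe the vertex layout, upload the matrices, bind the texture, and issue
// the draw calls yourself (gl.bufferData, gl.vertexAttribPointer,
// gl.uniformMatrix4fv, gl.drawElements, and so on).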

Too Hard? BABYLON.ShaderMaterial to the Rescue

I know what you are thinking: shaders are really cool, but I do not want to bother with WebGL internal plumbing or even with math.

And that's fine! This is a perfectly legitimate ask, and that is exactly why I created Babylon.js.

Let me present to you the code used by the previous rolling sphere demo. First of all, you will need a simple webpage:

<!DOCTYPE html>
<html>
<head>
    <title>Babylon.js</title>
    <script src="Babylon.js"></script>

    <script type="application/vertexShader" id="vertexShaderCode">
        precision highp float;

        // Attributes
        attribute vec3 position;
        attribute vec2 uv;

        // Uniforms
        uniform mat4 worldViewProjection;

        // Varying
        varying vec2 vUV;

        void main(void) {
            gl_Position = worldViewProjection * vec4(position, 1.0);

            vUV = uv;
        }
    </script>

    <script type="application/fragmentShader" id="fragmentShaderCode">
        precision highp float;

        varying vec2 vUV;

        uniform sampler2D textureSampler;

        void main(void) {
            gl_FragColor = texture2D(textureSampler, vUV);
        }
    </script>

    <script src="index.js"></script>

    <style>
        html, body {
            width: 100%;
            height: 100%;
            padding: 0;
            margin: 0;
            overflow: hidden;
        }

        #renderCanvas {
            width: 100%;
            height: 100%;
            touch-action: none;
            -ms-touch-action: none;
        }
    </style>
</head>
<body>
    <canvas id="renderCanvas"></canvas>
</body>
</html>

You will notice that the shaders are defined by <script> tags. With Babylon.js you can also define them in separate files (.fx files).

You can get Babylon.js here or on our GitHub repo. You must use version 1.11 or higher to get access to BABYLON.ShaderMaterial.

And finally the main JavaScript code is the following:

"use strict";

document.addEventListener("DOMContentLoaded", startGame, false);

function startGame() {
    if (BABYLON.Engine.isSupported()) {
        var canvas = document.getElementById("renderCanvas");
        var engine = new BABYLON.Engine(canvas, false);
        var scene = new BABYLON.Scene(engine);
        var camera = new BABYLON.ArcRotateCamera("Camera", 0, Math.PI / 2, 10, BABYLON.Vector3.Zero(), scene);

        camera.attachControl(canvas);

        // Creating sphere
        var sphere = BABYLON.Mesh.CreateSphere("Sphere", 16, 5, scene);

        var amigaMaterial = new BABYLON.ShaderMaterial("amiga", scene, {
            vertexElement: "vertexShaderCode",
            fragmentElement: "fragmentShaderCode",
        },
        {
            attributes: ["position", "uv"],
            uniforms: ["worldViewProjection"]
        });

        amigaMaterial.setTexture("textureSampler", new BABYLON.Texture("amiga.jpg", scene));

        sphere.material = amigaMaterial;

        engine.runRenderLoop(function () {
            sphere.rotation.y += 0.05;
            scene.render();
        });
    }
}

You can see that I use a BABYLON.ShaderMaterial to get rid of all the burden of compiling, linking and handling shaders.

When you create a BABYLON.ShaderMaterial, you have to specify the DOM element used to store the shaders or the base name of the files where the shaders are. If you choose to use files, you must create a file for each shader and use the following filename pattern: basename.vertex.fx and basename.fragment.fx. Then you will have to create the material like this:

var cloudMaterial = new BABYLON.ShaderMaterial("cloud", scene, "./myShader",
    {
        attributes: ["position", "uv"],
        uniforms: ["worldViewProjection"]
    });

You must also specify the names of any attributes and uniforms that you use. Then, you can directly set the values of your uniforms and samplers using the setTexture, setFloat, setFloats, setColor3, setColor4, setVector2, setVector3, setVector4, and setMatrix functions.
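
For instance, assuming your shader declares a time float, a cameraPosition vector, and the textureSampler we used earlier (the CYOS code later in this article does exactly this), feeding them from JavaScript looks like this:

cloudMaterial.setTexture("textureSampler", new BABYLON.Texture("amiga.jpg", scene));
cloudMaterial.setFloat("time", 0);
cloudMaterial.setVector3("cameraPosition", camera.position);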

Pretty simple, right?

Do you remember the previous worldViewProjection matrix? Using Babylon.js and BABYLON.ShaderMaterial, you have nothing to worry about! The BABYLON.ShaderMaterial will automatically compute it for you because you declare it in the list of uniforms.

BABYLON.ShaderMaterial can also handle the following matrices for you:

  • world 
  • view 
  • projection 
  • worldView 
  • worldViewProjection 

No need for math any longer. For instance, each time you execute sphere.rotation.y += 0.05, the world matrix of the sphere is generated for you and transmitted to the GPU.
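
Declaring them is enough. As a minimal sketch, a material asking Babylon.js to maintain all five of these matrices would look like this (the CYOS material below declares three of them in exactly this way):

var material = new BABYLON.ShaderMaterial("autoMatrices", scene, {
    vertexElement: "vertexShaderCode",
    fragmentElement: "fragmentShaderCode",
},
{
    attributes: ["position", "uv"],
    uniforms: ["world", "view", "projection", "worldView", "worldViewProjection"]
});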

CYOS: Create Your Own Shader

So let’s go bigger and create a page where you can dynamically create your own shaders and see the result immediately. This page is going to use the same code that we previously discussed, and is going to use a BABYLON.ShaderMaterial object to compile and execute shaders that you will create.

I used the ACE code editor for CYOS. This is an incredible code editor with syntax highlighting. Feel free to have a look at it here. You can find CYOS here.

Using the first combo box, you can select pre-defined shaders. We will look at each of them shortly.

You can also change the mesh (the 3D object) used to preview your shaders using the second combo box.

The Compile button is used to create a new BABYLON.ShaderMaterial from your shaders. The code used by this button is the following: 

// Compile
shaderMaterial = new BABYLON.ShaderMaterial("shader", scene, {
    vertexElement: "vertexShaderCode",
    fragmentElement: "fragmentShaderCode",
},
{
    attributes: ["position", "normal", "uv"],
    uniforms: ["world", "worldView", "worldViewProjection"]
});

var refTexture = new BABYLON.Texture("ref.jpg", scene);
refTexture.wrapU = BABYLON.Texture.CLAMP_ADDRESSMODE;
refTexture.wrapV = BABYLON.Texture.CLAMP_ADDRESSMODE;

var amigaTexture = new BABYLON.Texture("amiga.jpg", scene);

shaderMaterial.setTexture("textureSampler", amigaTexture);
shaderMaterial.setTexture("refSampler", refTexture);
shaderMaterial.setFloat("time", 0);
shaderMaterial.setVector3("cameraPosition", BABYLON.Vector3.Zero());
shaderMaterial.backFaceCulling = false;

mesh.material = shaderMaterial;

Brutally simple, right? The material is ready to send you three pre-computed matrices (world, worldView and worldViewProjection). Vertices will come with position, normal and texture coordinates. Two textures are also already loaded for you:

[Image: the amiga.jpg texture]
[Image: the ref.jpg texture]

And finally, here is the renderLoop where I update two convenient uniforms:

  • one called time in order to get some funny animations 
  • one called cameraPosition to get the position of the camera into your shaders (which will be useful for lighting equations) 
engine.runRenderLoop(function () {
    mesh.rotation.y += 0.001;

    if (shaderMaterial) {
        shaderMaterial.setFloat("time", time);
        time += 0.02;

        shaderMaterial.setVector3("cameraPosition", camera.position);
    }

    scene.render();
});

Thanks to the work we did on Windows Phone 8.1, you can also use CYOS on your Windows Phone (it is always a good time to create a shader):

[Image: CYOS on Windows Phone]

Basic Shader

So let’s start with the very first shader defined on CYOS: the Basic shader.

We already know this shader. It computes gl_Position and uses texture coordinates to fetch a color for every pixel.

To compute the pixel position, we just need the worldViewProjection matrix and the vertex’s position:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec2 vUV;

void main(void) {
    gl_Position = worldViewProjection * vec4(position, 1.0);

    vUV = uv;
}

Texture coordinates (uv) are transmitted unmodified to the pixel shader.

Please note that we need to declare a default float precision (the precision highp float; line) at the top of both the vertex and pixel shaders, because Chrome requires it. The qualifier defines how precise floating-point values need to be; a lower precision such as mediump can be used to trade accuracy for performance.

The pixel shader is even simpler, because we just need to use texture coordinates and fetch a texture color:

precision highp float;

varying vec2 vUV;

uniform sampler2D textureSampler;

void main(void) {
    gl_FragColor = texture2D(textureSampler, vUV);
}

We saw previously that the textureSampler uniform is filled with the “amiga” texture, so the result is the following:

[Image: Basic shader result]

Black and White Shader

Now let’s continue with a new shader: the black and white shader.

The goal of this shader is to use the previous one but with a "black and white only" rendering mode. To do so, we can keep the same vertex shader, but the pixel shader must be slightly modified.

The first option we have is to take only one component, such as the green one:

precision highp float;

varying vec2 vUV;

uniform sampler2D textureSampler;

void main(void) {
    gl_FragColor = vec4(texture2D(textureSampler, vUV).ggg, 1.0);
}

As you can see, instead of using .rgb (this operation is called a swizzle), we used .ggg, which copies the green component into all three channels.

But if we want a really accurate black and white effect, it would be a better idea to compute the luminance (which takes into account all color components):

precision highp float;

varying vec2 vUV;

uniform sampler2D textureSampler;

void main(void) {
    float luminance = dot(texture2D(textureSampler, vUV).rgb, vec3(0.3, 0.59, 0.11));
    gl_FragColor = vec4(luminance, luminance, luminance, 1.0);
}

The dot operation (or dot product) is computed like this:

result = v0.x * v1.x + v0.y * v1.y + v0.z * v1.z

So in our case:

luminance = r * 0.3 + g * 0.59 + b * 0.11 (these weights reflect the fact that the human eye is more sensitive to green)
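
For example, a pure green pixel (0, 1, 0) ends up with a luminance of 0.59, while a pure blue pixel (0, 0, 1) only gets 0.11, so greens appear much brighter than blues in the resulting grayscale image.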

Sounds cool, doesn’t it?

[Image: Black and white shader result]

Cell Shading Shader

Now let’s move to a more complex shader: the cell shading shader.

This one will require us to get the vertex’s normal and the vertex’s position in the pixel shader. So the vertex shader will look like this:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 world;
uniform mat4 worldViewProjection;

// Varying
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

void main(void) {
    vec4 outPosition = worldViewProjection * vec4(position, 1.0);
    gl_Position = outPosition;

    vPositionW = vec3(world * vec4(position, 1.0));
    vNormalW = normalize(vec3(world * vec4(normal, 0.0)));

    vUV = uv;
}

Please note that we also use the world matrix, because position and normal are stored without any transformation and we must apply the world matrix to take into account the object’s rotation.

The pixel shader is the following:

precision highp float;

// Lights
varying vec3 vPositionW;
varying vec3 vNormalW;
varying vec2 vUV;

// Refs
uniform sampler2D textureSampler;

void main(void) {
    float ToonThresholds[4];
    ToonThresholds[0] = 0.95;
    ToonThresholds[1] = 0.5;
    ToonThresholds[2] = 0.2;
    ToonThresholds[3] = 0.03;

    float ToonBrightnessLevels[5];
    ToonBrightnessLevels[0] = 1.0;
    ToonBrightnessLevels[1] = 0.8;
    ToonBrightnessLevels[2] = 0.6;
    ToonBrightnessLevels[3] = 0.35;
    ToonBrightnessLevels[4] = 0.2;

    vec3 vLightPosition = vec3(0, 20, 10);

    // Light
    vec3 lightVectorW = normalize(vLightPosition - vPositionW);

    // diffuse
    float ndl = max(0., dot(vNormalW, lightVectorW));

    vec3 color = texture2D(textureSampler, vUV).rgb;

    if (ndl > ToonThresholds[0])
    {
        color *= ToonBrightnessLevels[0];
    }
    else if (ndl > ToonThresholds[1])
    {
        color *= ToonBrightnessLevels[1];
    }
    else if (ndl > ToonThresholds[2])
    {
        color *= ToonBrightnessLevels[2];
    }
    else if (ndl > ToonThresholds[3])
    {
        color *= ToonBrightnessLevels[3];
    }
    else
    {
        color *= ToonBrightnessLevels[4];
    }

    gl_FragColor = vec4(color, 1.);
}

The goal of this shader is to simulate a light, and instead of computing a smooth shading we will consider that light will apply according to specific brightness thresholds. For instance, if light intensity is between 1 (maximum) and 0.95, the color of the object (fetched from the texture) will be applied directly. If intensity is between 0.95 and 0.5, the color will be attenuated by a factor of 0.8, and so on.

So, there are mainly four steps in this shader:

  • First, we declare the threshold and brightness-level constants.
  • Then, we need to compute the lighting using the Phong equation (we assume that the light is not moving): 
vec3 vLightPosition = vec3(0, 20, 10);

// Light
vec3 lightVectorW = normalize(vLightPosition - vPositionW);

// diffuse
float ndl = max(0., dot(vNormalW, lightVectorW));

The intensity of light per pixel depends on the angle between the normal and the light's direction. Because both vectors are normalized, ndl is simply the cosine of that angle: a face pointing straight at the light gets a value close to 1 and keeps the brightest level, while a face turned away from it gets 0 and falls into the darkest one.

  • Then we get the texture color for the pixel.
  • And finally we check the threshold and apply the level to the color.

The result looks like a cartoon object: 

[Image: Cell shading shader result]

Phong Shader

We used a portion of the Phong equation in the previous shader. So let’s try to use the whole thing now.

The vertex shader is quite simple here, because everything will be done in the pixel shader:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUV;

void main(void) {
    vec4 outPosition = worldViewProjection * vec4(position, 1.0);
    gl_Position = outPosition;

    vUV = uv;
    vPosition = position;
    vNormal = normal;
}

According to the equation, you must compute the diffuse and specular parts by using the light direction and the vertex's normal:

precision highp float;

// Varying
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUV;

// Uniforms
uniform mat4 world;

// Refs
uniform vec3 cameraPosition;
uniform sampler2D textureSampler;

void main(void) {
    vec3 vLightPosition = vec3(0, 20, 10);

    // World values
    vec3 vPositionW = vec3(world * vec4(vPosition, 1.0));
    vec3 vNormalW = normalize(vec3(world * vec4(vNormal, 0.0)));
    vec3 viewDirectionW = normalize(cameraPosition - vPositionW);

    // Light
    vec3 lightVectorW = normalize(vLightPosition - vPositionW);
    vec3 color = texture2D(textureSampler, vUV).rgb;

    // diffuse
    float ndl = max(0., dot(vNormalW, lightVectorW));

    // Specular
    vec3 angleW = normalize(viewDirectionW + lightVectorW);
    float specComp = max(0., dot(vNormalW, angleW));
    specComp = pow(specComp, max(1., 64.)) * 2.;

    gl_FragColor = vec4(color * ndl + vec3(specComp), 1.);
}

We already used the diffuse part in the previous shader, so here we just need to add the specular part. This picture from a Wikipedia article explains how the shader works:

[Image: Diffuse plus Specular equals Phong Reflection. By Brad Smith, aka Rainwarrior.]

The result on our sphere:

[Image: Phong shader result]

Discard Shader

For the discard shader, I would like to introduce a new concept: the discard keyword. This shader will discard every non-red pixel and will create the illusion of a "dug" object.

The vertex shader is the same as that used by the basic shader:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;

// Varying
varying vec2 vUV;

void main(void) {
    gl_Position = worldViewProjection * vec4(position, 1.0);

    vUV = uv;
}

The pixel shader will have to test the color and use discard when, for instance, the green component is too high:

precision highp float;

varying vec2 vUV;

// Refs
uniform sampler2D textureSampler;

void main(void) {
    vec3 color = texture2D(textureSampler, vUV).rgb;

    if (color.g > 0.5) {
        discard;
    }

    gl_FragColor = vec4(color, 1.);
}

The result is funny:

[Image: Discard shader result]

Wave Shader

We’ve played a lot with pixel shaders, but I also wanted to show you that we can do a lot of things with vertex shaders.

For the wave shader, we will reuse the Phong pixel shader.

The vertex shader will use the uniform called time to get some animated values. Using this uniform, the shader will generate a wave with the vertices’ positions:

precision highp float;

// Attributes
attribute vec3 position;
attribute vec3 normal;
attribute vec2 uv;

// Uniforms
uniform mat4 worldViewProjection;
uniform float time;

// Varying
varying vec3 vPosition;
varying vec3 vNormal;
varying vec2 vUV;

void main(void) {
    vec3 v = position;
    v.x += sin(2.0 * position.y + (time)) * 0.5;

    gl_Position = worldViewProjection * vec4(v, 1.0);

    vPosition = position;
    vNormal = normal;
    vUV = uv;
}

A sine based on position.y (and the time uniform) is added to position.x, and the result is the following:

[Image: Wave shader result]

Spherical Environment Mapping

This one was largely inspired by this tutorial. I’ll let you read that excellent article and play with the associated shader. 

[Image: Spherical environment mapping shader]

Fresnel Shader

I would like to finish this article with my favorite: the Fresnel shader.

This shader is used to apply a different intensity according to the angle between the view direction and the vertex’s normal.

The vertex shader is the same one used by the cell shading shader, and we can easily compute the Fresnel term in our pixel shader (because we have the normal and the camera’s position, which can be used to evaluate the view direction):

precision highp float;

// Lights
varying vec3 vPositionW;
varying vec3 vNormalW;

// Refs
uniform vec3 cameraPosition;
uniform sampler2D textureSampler;

void main(void) {
    vec3 color = vec3(1., 1., 1.);
    vec3 viewDirectionW = normalize(cameraPosition - vPositionW);

    // Fresnel
    float fresnelTerm = dot(viewDirectionW, vNormalW);
    fresnelTerm = clamp(1.0 - fresnelTerm, 0., 1.);

    gl_FragColor = vec4(color * fresnelTerm, 1.);
}

[Image: Fresnel shader result]

Your Shader?

You are now more prepared to create your own shader. Feel free to use the comments here or at the Babylon.js forum to share your experiments!

If you want to go further, here are some useful links:

And some more learning that I’ve created on the subject:

Or, stepping back, our team’s learning series on JavaScript: 

And of course, you are always welcome to use some of our free tools in building your next web experience: Visual Studio Community, Azure Trial, and cross-browser testing tools for Mac, Linux, or Windows.

This article is part of the web dev tech series from Microsoft. We’re excited to share Microsoft Edge and the new EdgeHTML rendering engine with you. Get free virtual machines or test remotely on your Mac, iOS, Android, or Windows device @ http://dev.modern.ie/.
