
Red Screen Filter

This is one way to apply a filter that makes the whole screen red. After rendering the scene to a texture as usual, render a full-screen quad with the following shader attached to get the whole screen in red. It is a nice and quick effect.

 The original output to the left, the result of the filter to the right

By averaging the colour components with the weights 0.3, 0.59 and 0.11 you get a better result than taking an equal amount from each channel. Read more about it on this page:

http://en.wikipedia.org/wiki/Grayscale
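
Written out, the weighted average (the luminance L of a pixel with colour channels R, G and B) is

L = 0.3 R + 0.59 G + 0.11 B

so a pure green pixel (0, 1, 0), for example, gets L = 0.59 instead of the 1/3 ≈ 0.33 a plain average would give.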

The fragment shader:

uniform sampler2D screenRT;
varying vec2 uv;
 
void main( void )
{
    // write the weighted average (luminance) of the scene colour to the red channel
    // (works just as well for the other channels, of course)
    float luminance = dot( texture2D( screenRT, uv ).rgb, vec3(0.3, 0.59, 0.11) );
    gl_FragColor = vec4( luminance, 0.0, 0.0, 1.0 ); // set g, b and a explicitly so they are not left undefined
}

Gaussian Blur Filter Shader

There are different ways to perform a blur, and this is one of the most common ways to do it in a shader. It is a two-step method: first a horizontal blur, then a vertical blur. By splitting the work into two directions (two passes) you save a lot of computation.

The method can be divided into the following parts (a rough host-side sketch follows the list):

  1. Render the scene you want to blur to a texture (it can be downsampled)
  2. Render a screen-aligned quad with the horizontal blur shader to a texture
  3. Render a screen-aligned quad with the vertical blur shader to the screen, or to a texture, depending on what you want to use the result for
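
In host code the three passes could look roughly like the sketch below. It is only a sketch: it assumes framebuffer objects are available, and the names sceneFBO/sceneTex (the scene render target), blurFBO/blurTexH (the target of the horizontal pass), the shader programs sceneProg, blurHProg and blurVProg, and the helpers drawScene() and drawFullScreenQuad() are all placeholders for whatever your own setup provides.

// 1. render the scene into a texture
glBindFramebuffer( GL_FRAMEBUFFER, sceneFBO );
glClear( GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT );
glUseProgram( sceneProg );
drawScene();

// 2. horizontal blur: read the scene texture, write to a second texture
glBindFramebuffer( GL_FRAMEBUFFER, blurFBO );
glClear( GL_COLOR_BUFFER_BIT );
glUseProgram( blurHProg );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, sceneTex );
glUniform1i( glGetUniformLocation( blurHProg, "RTScene" ), 0 );
drawFullScreenQuad();

// 3. vertical blur: read the horizontally blurred texture, write to the screen
glBindFramebuffer( GL_FRAMEBUFFER, 0 );
glUseProgram( blurVProg );
glActiveTexture( GL_TEXTURE0 );
glBindTexture( GL_TEXTURE_2D, blurTexH );
glUniform1i( glGetUniformLocation( blurVProg, "RTBlurH" ), 0 );
drawFullScreenQuad();

If you want the blurred result in a texture instead, bind another FBO in the last pass rather than the default framebuffer.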

The following image shows how the blur works when split up into two directions.

Separable blur filter

Here’s the horizontal blur shader.

Vertex Shader (GLSL). This shader screen-aligns a quad centred around the origin. Any method of rendering a screen-aligned quad will work, so you are free to use other shaders.

varying vec2 vTexCoord;
 
// remember that you should draw a screen aligned quad
void main(void)
{
   // gl_Vertex is assumed to be a quad around the origin; sign() snaps its
   // corners to exactly (+-1, +-1), so the quad covers the whole screen in
   // normalized device coordinates
   vec2 Pos = sign(gl_Vertex.xy);
   gl_Position = vec4(Pos, 0.0, 1.0);
 
   // map from [-1,1] to [0,1] to get the texture coordinates
   vTexCoord = Pos * 0.5 + 0.5;
}

Fragment Shader (GLSL) 

uniform sampler2D RTScene; // the texture with the scene you want to blur
varying vec2 vTexCoord;
 
const float blurSize = 1.0/512.0; // chosen so that each sample step is exactly one texel wide when the RTScene texture is 512x512
 
void main(void)
{
   vec4 sum = vec4(0.0);
 
   // blur in x (horizontal)
   // take nine samples, with the distance blurSize between them
   sum += texture2D(RTScene, vec2(vTexCoord.x - 4.0*blurSize, vTexCoord.y)) * 0.05;
   sum += texture2D(RTScene, vec2(vTexCoord.x - 3.0*blurSize, vTexCoord.y)) * 0.09;
   sum += texture2D(RTScene, vec2(vTexCoord.x - 2.0*blurSize, vTexCoord.y)) * 0.12;
   sum += texture2D(RTScene, vec2(vTexCoord.x - blurSize, vTexCoord.y)) * 0.15;
   sum += texture2D(RTScene, vec2(vTexCoord.x, vTexCoord.y)) * 0.16;
   sum += texture2D(RTScene, vec2(vTexCoord.x + blurSize, vTexCoord.y)) * 0.15;
   sum += texture2D(RTScene, vec2(vTexCoord.x + 2.0*blurSize, vTexCoord.y)) * 0.12;
   sum += texture2D(RTScene, vec2(vTexCoord.x + 3.0*blurSize, vTexCoord.y)) * 0.09;
   sum += texture2D(RTScene, vec2(vTexCoord.x + 4.0*blurSize, vTexCoord.y)) * 0.05;
 
   gl_FragColor = sum;
}

And here’s the vertical blur shader.

Vertex Shader (GLSL) (the same as for the horizontal blur)

varying vec2 vTexCoord;
 
// remember that you should draw a screen aligned quad
void main(void)
{
   // gl_Vertex is assumed to be a quad around the origin; sign() snaps its
   // corners to exactly (+-1, +-1), so the quad covers the whole screen in
   // normalized device coordinates
   vec2 Pos = sign(gl_Vertex.xy);
   gl_Position = vec4(Pos, 0.0, 1.0);
 
   // map from [-1,1] to [0,1] to get the texture coordinates
   vTexCoord = Pos * 0.5 + 0.5;
}

Fragment Shader (GLSL) 

uniform sampler2D RTBlurH; // this should hold the texture rendered by the horizontal blur pass
varying vec2 vTexCoord;
 
const float blurSize = 1.0/512.0;
 
void main(void)
{
   vec4 sum = vec4(0.0);
 
   // blur in y (vertical)
   // take nine samples, with the distance blurSize between them
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y - 4.0*blurSize)) * 0.05;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y - 3.0*blurSize)) * 0.09;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y - 2.0*blurSize)) * 0.12;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y - blurSize)) * 0.15;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y)) * 0.16;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y + blurSize)) * 0.15;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y + 2.0*blurSize)) * 0.12;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y + 3.0*blurSize)) * 0.09;
   sum += texture2D(RTBlurH, vec2(vTexCoord.x, vTexCoord.y + 4.0*blurSize)) * 0.05;
 
   gl_FragColor = sum;
}

And this is a scene without blur.

Scene before blurring

And this is the same scene but with Gaussian blur.

Blurred scene

You can tweak blurSize to change the radius of the blur, or change the number of samples in each direction (if you do, recompute the weights so that they still sum to one).

Cost of the separable blur shader: 9 + 9 = 18 texture samples per pixel.
Cost of a single-pass blur with the same 9x9 footprint: 9 * 9 = 81 texture samples per pixel.
So splitting the work into two directions saves a lot.

The Gaussian weights are calculated according to the Gaussian function with a standard deviation of 2.7 and then normalized so that they sum to one. These calculations were done in the Excel document found in [2].
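
If you prefer code to a spreadsheet, the weights can be reproduced with a few lines of C. This is just a sketch of the calculation (a Gaussian with standard deviation 2.7 sampled at the nine texel offsets, then normalized); the printed values round to the weights used above.

#include <math.h>
#include <stdio.h>

int main(void)
{
    const double sigma = 2.7;   /* standard deviation used above */
    double weight[9];
    double sum = 0.0;

    /* unnormalized Gaussian, sampled at -4..4 texels from the centre */
    for (int i = -4; i <= 4; ++i) {
        weight[i + 4] = exp(-(double)(i * i) / (2.0 * sigma * sigma));
        sum += weight[i + 4];
    }

    /* normalize so the weights sum to 1, then print them */
    for (int i = 0; i < 9; ++i)
        printf("%.2f\n", weight[i] / sum);   /* 0.05 0.09 0.12 0.15 0.16 0.15 0.12 0.09 0.05 */

    return 0;
}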

Here’s a description of blur shaders and other image processing shaders in DirectX:
[1] http://ati.amd.com/developer/shaderx/ShaderX2_AdvancedImageProcessing.pdf

More info about calculating weights for separable gaussian blur:
[2] http://theinstructionlimit.com/?p=40

Bilinear Interpolation

When sampling a texel from a texture that has been resized (which is nearly always the case in 3D rendering) you need some kind of filter to decide what result you should get. Bilinear interpolation uses the four nearest neighbours and computes a weighted average texel value from them.

Bilinear Interpolation

Bilinear filtering is built into OpenGL, and to activate it you set the following parameters when setting up a texture:

// set the minification filter (bilinear filtering within the nearest mipmap level)
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST );
// set the magnification filter
glTexParameteri( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
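
Note that GL_LINEAR_MIPMAP_NEAREST only gives valid results if the texture actually has mipmap levels. If your loading code does not already build them, one way to generate them (assuming OpenGL 3.0 or the framebuffer object extension is available) is:

glGenerateMipmap( GL_TEXTURE_2D ); // older code often uses gluBuild2DMipmaps instead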

If you for some reason want to do bilinear interpolation manually in a shader, the function to do so looks like the following in GLSL. Note that in vertex shaders you typically have to do the bilinear interpolation between texture samples yourself, since vertex texture fetches are often not filtered by the hardware.

const float textureSize = 512.0; //size of the texture
const float texelSize = 1.0 / textureSize; //size of one texel 
 
vec4 texture2DBilinear( sampler2D textureSampler, vec2 uv )
{
    // in vertex shaders you should use texture2DLod instead of texture2D
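    // note: the texture itself should use GL_NEAREST filtering here, otherwise each of
    // the four samples below is already filtered by the hardware and the result gets blurrier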
    vec4 tl = texture2D(textureSampler, uv);
    vec4 tr = texture2D(textureSampler, uv + vec2(texelSize, 0));
    vec4 bl = texture2D(textureSampler, uv + vec2(0, texelSize));
    vec4 br = texture2D(textureSampler, uv + vec2(texelSize , texelSize));
    vec2 f = fract( uv.xy * textureSize ); // get the decimal part
    vec4 tA = mix( tl, tr, f.x ); // will interpolate the red dot in the image
    vec4 tB = mix( bl, br, f.x ); // will interpolate the blue dot in the image
    return mix( tA, tB, f.y ); // will interpolate the green dot in the image
}
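
Expanded, the three mix() calls compute the standard bilinear weighting of the four neighbours; with f = fract(uv * textureSize) the returned value is

(1 - f.x) * (1 - f.y) * tl  +  f.x * (1 - f.y) * tr  +  (1 - f.x) * f.y * bl  +  f.x * f.y * br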

Here’s a magnification of a texture using three different types of filters. The texture is mapped onto a sphere and the viewport has been zoomed in on a small part of it.

Nearest Neighbour (OpenGL fixed function implementation)

Nearest Neighbour filter

Bilinear Interpolation (OpenGL fixed function implementation)

Bilinear Interpolation OpenGL

Bilinear Interpolation (GLSL implementation, the code above)

Bilinear Interpolation GLSL

Some information about OpenGL texture mapping and how to set the filtering properties:
http://www.nullterminator.net/gltexture.html

Here’s some discussion of why you should not always use bilinear interpolation:
http://gregs-blog.com/2008/01/14/how-to-turn-off-bilinear-filtering-in-opengl/

Link to a bilinear interpolation function for DirectX:
http://www.catalinzima.com/?page_id=85

Article on GameDev.net about bilinear filtering:
http://www.gamedev.net/reference/articles/article669.asp

Screen Space

The last step in the chain of transformations (world to screen) when rendering is to get the rendered objects into screen space. After the vertex shader, but before the fragment shader, the geometry is in normalized device coordinates, a space with the origin in the middle of the screen, and it is automatically transformed to screen space. The x and y coordinates are mapped to pixel positions on the screen (the exact mapping is set by the viewport bounds), and the z coordinate is mapped to the depth buffer (z-buffer): -1 ends up at the near end of the depth range and +1 at the far end (0 and 1 by default, set with glDepthRange).

The viewport mapping works like this: [-1, 1] in x is mapped from the left edge to the right edge of the viewport, and [-1, 1] in y is mapped from the bottom edge to the top edge of the viewport.

Fragments are automatically transformed to screen space before the fragment shader runs; you just need to set the viewport mapping with the following OpenGL call (this example maps the output to the top-left quarter of the screen):

glViewport(0, screenHeight/2, screenWidth/2, screenHeight/2);
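
Written out, with glViewport(x0, y0, w, h) and the default glDepthRange(0, 1), a fragment's window coordinates are computed from its normalized device coordinates as

x_w = x0 + w * (x_ndc + 1) / 2
y_w = y0 + h * (y_ndc + 1) / 2
z_w = (z_ndc + 1) / 2

which is the standard OpenGL viewport transform.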

You can access the current fragment depth in a fragment shader through gl_FragCoord.z (gl_FragDepth defaults to this value unless you write to it), and the screen coordinates of the fragment through gl_FragCoord.xy.

Information about pixel shaders in DirectX (most is the same in OpenGL):
http://www.gamedev.net/columns/hardcore/dxshader3/

A quick reference guide to GLSL:
http://www.opengl.org/sdk/libs/OpenSceneGraph/glsl_quickref.pdf

Description of the mapping in DirectX:
http://msdn.microsoft.com/en-us/library/bb219690.aspx