Category Archives: Particle Systems

Instanced Billboards

In DirectX9, one can use instanced billboards to render lots of particles with good performance. Since all particles are presumed to be billboards built from two triangles forming a quad, instancing lets us reuse this geometry data (only uv coordinates are needed) for every particle, thereby saving bandwidth. When rendering, two streams with different frequencies are used: the quad geometry data makes up the first stream, and the second stream consists of the per-instance data that is unique to each particle, such as position, rotation and color. The big drawback of this rendering approach is that it requires hardware instancing support, which means Shader Model 3.0 (or Shader Model 2.0 on ATI cards, using a trick described in the first source below).
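
As a rough illustration, a vertex shader for this setup could look something like the sketch below. The input layout, semantic assignments and matrix names are assumptions made for this example and are not taken from the sources: stream 0 provides the shared quad corner and uv, stream 1 provides the per-instance position, size and color, and the corner is expanded in view space so the quad always faces the camera. On the CPU side, IDirect3DDevice9::SetStreamSourceFreq is what marks stream 0 as indexed geometry and stream 1 as instance data.

struct VS_INPUT
{
    float2 corner        : POSITION0;  // quad corner offset, from stream 0
    float2 uv            : TEXCOORD0;  // quad uv, from stream 0
    float3 instancePos   : POSITION1;  // particle center in world space, from stream 1
    float  instanceSize  : TEXCOORD1;  // particle size, from stream 1
    float4 instanceColor : COLOR0;     // particle color, from stream 1
};

struct VS_OUTPUT
{
    float4 pos   : POSITION;
    float2 uv    : TEXCOORD0;
    float4 color : COLOR0;
};

float4x4 view;  // world-to-view matrix (assumed name)
float4x4 proj;  // view-to-projection matrix (assumed name)

VS_OUTPUT ParticleVS(VS_INPUT input)
{
    VS_OUTPUT output;
    // Move the particle center to view space, then offset the shared quad corner
    // in the view-space xy plane so the billboard always faces the camera.
    float4 viewPos = mul(float4(input.instancePos, 1.0f), view);
    viewPos.xy += input.corner * input.instanceSize;
    output.pos   = mul(viewPos, proj);
    output.uv    = input.uv;
    output.color = input.instanceColor;
    return output;
}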

This particle rendering approach is described in detail here:
http://zeuxcg.blogspot.com/2007/09/particle-rendering-revisited.html

DirectX9 info about rendering multiple streams
http://msdn.microsoft.com/en-us/library/bb147299(VS.85).aspx

DirectX9 instancing info
http://msdn.microsoft.com/en-us/library/bb173349(VS.85).aspx

Soft Particles

Normal particles on the left, soft particles on the right

The aim of soft particles is to remove the ugly artifact that appears where the particle quad intersects the scene. There are a lot of different approaches to this, some more complicated than others. The simplest one is to fade the particle out as it gets too close to the scene. To do this, the scene is first rendered without particles and its depth is saved in a texture. When drawing the particles, the depth of each particle pixel is compared to the scene depth, and the alpha is faded smoothly based on this depth difference. The formula below, in HLSL, is the simplest possible for soft particles and works very well. Here scene_depth is the sampled depth (in view space) of the scene in the direction of the current pixel, particle_depth is the depth (in view space) of the current particle pixel, and scale controls the "softness" of the intersection between particles and scene:

fade = saturate((scene_depth - particle_depth) * scale);
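
In context, this fade could be applied in a pixel shader roughly as in the sketch below; the texture names, the way the screen-space coordinate and view-space depths reach the shader, and the assumption that the stored scene depth is already linear view-space depth are all illustrative choices, not taken from the sources:

sampler2D sceneDepthTex;  // view-space scene depth written in a previous pass (assumed)
sampler2D particleTex;    // particle color texture (assumed)
float     scale;          // controls the softness of the fade

float4 SoftParticlePS(float2 uv             : TEXCOORD0,
                      float2 screenUV       : TEXCOORD1,  // screen-space position of this pixel (assumed precomputed)
                      float  particle_depth : TEXCOORD2,  // view-space depth of this particle pixel
                      float4 color          : COLOR0) : COLOR
{
    // Sample the view-space depth of the opaque scene behind this pixel.
    float scene_depth = tex2D(sceneDepthTex, screenUV).r;
    // Fade the particle out as it approaches the scene geometry.
    float fade = saturate((scene_depth - particle_depth) * scale);
    float4 result = tex2D(particleTex, uv) * color;
    result.a *= fade;
    return result;
}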

To make the fade even smoother, NVIDIA [1] proposes using the following contrast function instead of the linear fade described above:

float Output = 0.5 * pow(saturate(2 * ((Input > 0.5) ? 1 - Input : Input)), ContrastPower);
Output = (Input > 0.5) ? 1 - Output : Output;

Umenhoffer [2] proposes a method called spherical billboards to deal with these problems. In this method, the particle volume is approximated by a sphere. The method also handles the near clip plane problem, where particles instantly disappear if they get too close to the camera.
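
A minimal sketch of the idea, assuming the view-space depths, the sphere radius and this pixel's distance from the sphere's center axis are available in the pixel shader (the function name and parameters are illustrative and not taken from the paper):

// Approximate the particle by a sphere and fade by how much of the view ray
// actually travels through it, clipped against the near plane and the scene.
float SphericalBillboardFade(float centerDepth,     // view-space depth of the particle center
                             float radius,          // radius of the approximating sphere
                             float distFromCenter,  // distance of this pixel from the sphere's center axis
                             float sceneDepth,      // view-space depth of the opaque scene
                             float nearPlane,       // distance to the near clip plane
                             float density)         // assumed constant density of the medium
{
    // Half thickness of the sphere along the view ray at this pixel.
    float halfChord = sqrt(max(radius * radius - distFromCenter * distFromCenter, 0.0f));
    // Clamp the segment inside the sphere against the near plane and the scene depth.
    float entry  = max(centerDepth - halfChord, nearPlane);
    float exit   = min(centerDepth + halfChord, sceneDepth);
    float travel = max(exit - entry, 0.0f);
    // A longer path through the sphere gives a more opaque pixel.
    return 1.0f - exp(-density * travel);
}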

There is also an idea [3] that the alpha channel can be used to represent the density of the particles. This method has the drawback, however, that the textures might need to be redone by the artists.

The method by Microsoft [4] uses a combination of spherical billboards and a texture representation of the volume. Instead of using the alpha channel, they ray march the sphere and sample the density of the volume from a 3D noise texture. The result can be seen in the image below.

Volumetric Particles

The video below shows how soft particles can increase realism in games that use large particles. It is originally an ad for the Torque 3D engine.

[1] Soft Particles by NVIDIA
http://developer.download.nvidia.com/whitepapers/2007/SDK10/SoftParticles_hi.pdf

[2] Spherical Billboards and their Application to Rendering Explosions
http://www.iit.bme.hu/~szirmay/firesmoke.pdf

[3] A Gamasutra article about soft particles
http://www.gamasutra.com/view/feature/3680/a_more_accurate_volumetric_.php

[4] A DirectX 10 implementation of soft particles by Microsoft, called Volumetric Particles
http://msdn.microsoft.com/en-us/library/bb172449(VS.85).aspx


Normal Mapped Billboards

This technique doesn't actually invent anything new; it is simply a combination of normal mapping and billboards, used to light particle systems realistically. It has been used successfully in many games to fake volumetric smoke.

The movie below shows an example of a lot of particles, rendered as billboards that are normal mapped to look like spheres.
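
As a minimal sketch, the lighting could be done in a pixel shader along these lines; the texture and light names are assumptions, and the normal map is assumed to store normals that make the flat quad shade like a sphere:

sampler2D diffuseTex;   // particle color texture (assumed)
sampler2D normalTex;    // normal map faking a spherical surface (assumed)
float3    lightDir;     // light direction, in the same space as the stored normals
float3    lightColor;
float3    ambientColor;

float4 NormalMappedParticlePS(float2 uv    : TEXCOORD0,
                              float4 color : COLOR0) : COLOR
{
    float4 albedo = tex2D(diffuseTex, uv) * color;
    // Unpack the normal from [0,1] to [-1,1] and light the quad as if it were a sphere.
    float3 normal  = normalize(tex2D(normalTex, uv).xyz * 2.0f - 1.0f);
    float  diffuse = saturate(dot(normal, -lightDir));
    albedo.rgb *= ambientColor + lightColor * diffuse;
    return albedo;
}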

Lit smoke and Post-process system design (also in the book ShaderX 5)
http://www.gamedev.net/community/forums/topic.asp?topic_id=432218&whichpage=1?

A thesis that tried to implement this method (some interesting info, but results aren’t good enough)
http://epubl.ltu.se/1404-5494/2008/011/LTU-HIP-EX-08011-SE.pdf

Some discussion about normal mapped billboards
http://www.drone.org/tutorials/lighting_flat_objects.html

Fluid Simulation and Rendering

For effects like smoke or water, a fluid simulation and rendering approach is needed. There are currently two popular methods for this:

  1. Simulate the fluid on the CPU and send the result as particles to the GPU for rendering as billboards. This is often called a particle system. The technique has been around since the dawn of computer graphics.

  2. Simulate the fluid on the GPU and store the result in textures, which are then rendered with volume ray casting (or ray marching) on the GPU. This technique is new and rather unexplored, and there are few real-life implementations. The result can be very realistic but slow (a sketch of the ray-casting step follows below).

Technique one burdens the CPU, the bandwidth and the GPU, although in modern solutions it is the bandwidth that is the bottleneck. The GPU-based technique only burdens the GPU (but heavily).
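
A minimal sketch of the ray-casting step, assuming the simulated density has already been written to a 3D texture and the ray entry point and direction in volume space are supplied by the vertex shader (the names, fixed step count and simple front-to-back compositing are assumptions for this example):

sampler3D densityTex;  // simulated density field stored in a 3D texture (assumed)
float     stepSize;    // marching step length in volume space
float3    smokeColor;  // constant color of the medium (assumed, no lighting)

float4 RaycastVolumePS(float3 rayStart : TEXCOORD0,          // entry point in volume space [0,1]^3
                       float3 rayDir   : TEXCOORD1) : COLOR  // normalized ray direction in volume space
{
    float3 pos = rayStart;
    float3 accumColor = 0.0f;
    float  accumAlpha = 0.0f;
    // March through the volume, compositing samples front to back until opaque.
    for (int i = 0; i < 64; ++i)
    {
        float density = tex3D(densityTex, pos).r;
        float alpha   = saturate(density * stepSize);
        accumColor += (1.0f - accumAlpha) * alpha * smokeColor;
        accumAlpha += (1.0f - accumAlpha) * alpha;
        if (accumAlpha > 0.99f)
            break;
        pos += rayDir * stepSize;
    }
    return float4(accumColor, accumAlpha);
}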

The movie shows the GPU method of fluid simulation and rendering. More info about this particular implementation can be found in the last two links.

Building an Advanced Particle System
http://www.mysticgd.com/misc/AdvancedParticleSystems.pdf
Building a Million Particle System
http://www.2ld.de/gdc2004/MegaParticlesPaper.pdf
Real-Time Simulation and Rendering of 3D Fluids
http://http.developer.nvidia.com/GPUGems3/gpugems3_ch30.html
The previous page's author's homepage:
http://www.cs.caltech.edu/~keenan/project_fluid.html