Ambient Occlusion
Ambient occlusion is typically defined as a measure of the amount of light reaching a particular point from all directions, or from a hemisphere tangent to the surface in which the point is embedded. Variations on ambient occlusion generally attempt to bring higher-dimensional information to the point. Aperture occlusion adds a vector pointing toward the widest gap in the surrounding environment, along with an angle describing the width of that aperture. Spherical harmonic maps encode accessibility information modulated continuously around the point.
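The hemisphere definition is often written as an integral; a standard formulation (symbols chosen here for illustration) is:

```latex
A(p, n) = \frac{1}{\pi} \int_{\Omega} V(p, \omega)\, \max(n \cdot \omega, 0)\, \mathrm{d}\omega
```

where Ω is the hemisphere about the surface normal n at point p, and V(p, ω) is 1 when a ray leaving p in direction ω escapes to the environment and 0 when it is blocked; A = 1 means the point is fully accessible.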
Accessibility shading preceded screen space methods. It was introduced into film production by Hayden Landis of ILM, and described in a Siggraph course in 2002: http://www.renderman.org/RMR/Books/sig02.course16.pdf.gz. A good overview and history of these methods is given in Hayden's presentation, and can also be found in Chapter 17 of GPU Gems. Accessibility information is stored on a mesh and used to modify the lighting calculation during rasterization. An up-to-date GPU implementation is described in this paper by Samuli Laine and Tero Karras of nVidia.
Screen Space Ambient Occlusion
Screen Space Ambient Occlusion is based on the notion that the depth buffer contains much of the information needed to compute lighting similar to accessibility shading.
The technique was popularized by Crytek's Martin Mittring in a presentation at Siggraph 2007; see his course notes for details. A full screen render pass is performed wherein z-buffer data is sampled around each pixel and a darkening value is computed based on depth differences. Samples are taken at random 3D positions within a sphere around each pixel, and darkening is proportional to the number of sampled occluders. Here's an example of their results:
Some early debate on the technique occurred on gamedev.net; the initial discussion here resulted in an algorithm, along with techniques for resolving the sampling artifacts that occur on flat gradients in the scene. Ultimately, a nice 4k demo using the technique on rendered Julia fractals is presented. This method, based strictly on depth sampling, works best with this sort of highly detailed geometry.
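To make the sphere-sampling approach described above concrete, here is a toy CPU sketch in Python. The buffer layout, radius, depth-offset scale, and sample count are all illustrative choices of mine, not Crytek's actual shader:

```python
import random

def crytek_style_ao(depth, width, height, radius=2.0, n_samples=16, seed=0):
    """Toy CPU sketch of sphere-sampling SSAO (all constants illustrative).

    depth: row-major list of linear depth values, one per pixel.
    Returns an occlusion factor per pixel in [0, 1] (1 = fully open).
    """
    rng = random.Random(seed)
    # Precompute random offsets inside a unit sphere (rejection sampling).
    offsets = []
    while len(offsets) < n_samples:
        v = (rng.uniform(-1, 1), rng.uniform(-1, 1), rng.uniform(-1, 1))
        if v[0] ** 2 + v[1] ** 2 + v[2] ** 2 <= 1.0:
            offsets.append(v)

    ao = [1.0] * (width * height)
    for y in range(height):
        for x in range(width):
            d = depth[y * width + x]
            occluded = 0
            for ox, oy, oz in offsets:
                # Clamp the sample position to the buffer.
                sx = min(max(int(x + ox * radius), 0), width - 1)
                sy = min(max(int(y + oy * radius), 0), height - 1)
                # Depth of the sphere sample; the 0.1 is an arbitrary
                # scale converting pixel radius into depth units.
                sample_d = d + oz * radius * 0.1
                scene_d = depth[sy * width + sx]  # what the z-buffer holds there
                if scene_d < sample_d:            # scene surface is in front: occluder
                    occluded += 1
            ao[y * width + x] = 1.0 - occluded / n_samples
    return ao
```

Note that on a perfectly flat surface roughly half the sphere samples fall behind the surface, so the result hovers around 0.5 everywhere; that is exactly the flat-gradient artifact discussed above.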
RGBA's Iñigo Quilez, of Kindernoiser and Kindercrasher fame, has a treasure trove of ambient occlusion experimentation and research references. This explanation of the lighting in Plastic is an example of the quality, in-depth information you'll find there.
Plastic by RGBA.
Robust methods for achieving hardware accelerated AO are presented in the influential paper Hardware Accelerated Ambient Occlusion Techniques on GPUs by Perumaal Shanmugam and Okan Arikan. Unlike most presentations, this method yields results similar to ground-truth AO calculations. In this paper, the AO search is split into two phases: one for high frequency near detail, and another for low frequency detail with a wider search. The second phase in particular is of interest: it allows large objects to occlude each other as they pass. The low frequency pass is described by this image:
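The near/far split can be pictured with a toy CPU sketch; the tap pattern, distance falloff, and the multiplicative combine below are my own illustrative choices, not the paper's actual formulation:

```python
def two_phase_ao(depth, width, height, near_radius=1, far_radius=4):
    """Toy sketch of a two-phase AO combine: a tight search for
    high-frequency creases plus a wide search for low-frequency
    occlusion, with the two accessibilities multiplied together."""
    def phase(x, y, r):
        d = depth[y * width + x]
        occ, taps = 0.0, 0
        for dy in (-r, 0, r):
            for dx in (-r, 0, r):
                if dx == 0 and dy == 0:
                    continue
                sx = min(max(x + dx, 0), width - 1)
                sy = min(max(y + dy, 0), height - 1)
                diff = d - depth[sy * width + sx]  # positive: neighbour is closer
                if diff > 0:
                    occ += diff / (diff + 1.0)     # simple distance falloff
                taps += 1
        return 1.0 - occ / taps                    # accessibility in (0, 1]

    out = []
    for y in range(height):
        for x in range(width):
            # Multiplying the two accessibilities lets large, distant
            # occluders darken a pixel even when its neighbourhood is flat.
            out.append(phase(x, y, near_radius) * phase(x, y, far_radius))
    return out
```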
Peter-Pike Sloan et al., have recently published Image-Based Proxy Accumulation for Real-Time Soft Global Illumination which works by accumulating shadowing and indirect illumination in screen space splats. As with other AO methods, a further pass is necessary to introduce direct illumination.
A very cheap method is developed within Blender by Mike Pan, with convincing results. The effect is subjectively a bit too strong at deep depth discontinuities, but given the simplicity of the approach, it could be considered acceptable. The image on the left is the simple flat-shaded image for comparison. The trick involved here is very clever: the depth buffer is blurred, then subtracted from the original depth buffer. The result is clamped and scaled to reduce artifacts, then subtracted from the original image. This method is about a zillion times cheaper than any of the others presented on this page.
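A sketch of the blur-and-subtract trick; the box-blur kernel and strength constant here are illustrative, not taken from the Blender implementation:

```python
def depth_unsharp_ao(depth, width, height, radius=1, strength=1.0):
    """Toy sketch of the blur-and-subtract depth trick.
    Returns a darkening term per pixel, to be subtracted from the image."""
    # Simple box blur of the depth buffer.
    blurred = [0.0] * (width * height)
    for y in range(height):
        for x in range(width):
            total, taps = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    sx = min(max(x + dx, 0), width - 1)
                    sy = min(max(y + dy, 0), height - 1)
                    total += depth[sy * width + sx]
                    taps += 1
            blurred[y * width + x] = total / taps
    # Where the blurred depth is nearer than the true depth, the pixel
    # sits in a recess next to closer geometry: darken it. Clamping at
    # zero suppresses halos on the near side of discontinuities.
    return [max(0.0, d - b) * strength for d, b in zip(depth, blurred)]
```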
This technique and several variations are discussed in detail in the Siggraph 2006 paper, Image Enhancement by Unsharp Masking the Depth Buffer. Several NPR effects are achieved by mapping the depth discontinuities through lookup tables to get Gooch-like effects. For more, see the project website.
Another technique mentioned by Ignacio in the comments: Ambient Occlusive Crease Shading, by Fox & Compton. For each pixel, nearby pixels are sampled, and dot products between normals are scaled by distance between samples to compute occlusion. The resulting pass is bloomed, blurred, and contrast enhanced before being applied to the framebuffer. The page includes the pixel shader doing the work.
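The normal-comparison idea can be sketched on the CPU as follows; the 3×3 neighbourhood and the distance weighting are illustrative stand-ins for whatever the actual shader does:

```python
import math

def crease_shading(normals, positions, width, height):
    """Toy sketch of crease shading: compare each pixel's normal with
    nearby normals, weighting divergence by sample distance. Both
    buffers are row-major lists of (x, y, z) tuples."""
    def sub(a, b): return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def dot(a, b): return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    out = []
    for y in range(height):
        for x in range(width):
            i = y * width + x
            occ, taps = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    if dx == 0 and dy == 0:
                        continue
                    sx = min(max(x + dx, 0), width - 1)
                    sy = min(max(y + dy, 0), height - 1)
                    j = sy * width + sx
                    # Normals folding toward each other (dot < 1) mark a
                    # crease; nearer samples count for more.
                    facing = 1.0 - max(dot(normals[i], normals[j]), 0.0)
                    d = sub(positions[i], positions[j])
                    dist = math.sqrt(dot(d, d))
                    occ += facing / (1.0 + dist)
                    taps += 1
            out.append(1.0 - min(occ / taps, 1.0))
    return out
```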
Via Ziggyware: Alex Urbano Alvarez has generously released an SSAO implementation for XNA GS 2.
The movie shows that, in general, screen space ambient occlusion makes for a lively and attractive image. It also highlights artifacts that can result from not taking linear depth into account. The effect here does not scale well with depth; shadows are uniformly thick no matter how far the object is from the camera: note that the shadow halo around the top of the pillars is as wide as the halo at their base. The effect also fails to take depth discontinuities into account. This variation of the effect could be further developed for NPR applications; for example, with some thresholding an attractive toon shader could result.
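The depth-scaling artifact described above has a standard fix: reconstruct eye-space depth from the z-buffer and scale the sampling radius by it. For a conventional perspective projection writing depth into [0, 1], the reconstruction looks like this (`near` and `far` are the projection's clip planes):

```python
def linearize_depth(z, near, far):
    """Convert a [0, 1] hardware z-buffer value from a standard
    perspective projection back to eye-space distance. Dividing the
    world-space AO radius by this value keeps SSAO halos from having
    the same screen-space width at every distance."""
    return (near * far) / (far - z * (far - near))
```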
This thread at BlitzBasic shows what I believe to be the same algorithm as Alvarez's. The artifacts in the game example late in the thread are largely hidden by the lighting in the scene. It seems that if you are not after a photorealistic result, this algorithm can sweeten your look.
Aras Pranckevičius has a
A Siggraph course presenting techniques used in StarCraft II. StarCraft II's unique twist is that it uses SSAO to fake low frequency diffuse inter-reflection as well as crease emphasis. They accomplish this by taking samples at two distances: near samples for creasing, and far samples for fake diffuse inter-reflection.
This Russian tutorial (translated) contains several code samples and improvements based on data available from G-buffers, as well as an OpenGL test program: http://steps3d.narod.ru/tutorials/ssao-tutorial.html
An article with some phonecam capture from GDC08 (slides). This lecture reviews multiple depth-buffer-based ambient occlusion techniques. The three described algorithms are ray marching in the depth buffer, an algorithm based on accumulating solid angles, and a new hybrid method called tangent tracing. Real-time demonstrations are included.
- A Wikipedia article: http://en.wikipedia.org/wiki/Screen_Space_Ambient_Occlusion
- An ActionScript 3 implementation: http://kode80.com/ssao2
- nVidia can impose SSAO on a game through the driver. It looks good, although I'm not convinced it's kosher to go back in and modify a game's look that way without involving the developers.
- nVidia's SSAO demo: http://developer.download.nvidia.com/SDK/10.5/direct3d/samples.html
- An nVidia SSAO whitepaper: http://developer.download.nvidia.com/SDK/10.5/direct3d/Source/ScreenSpaceAO/doc/ScreenSpaceAO.pdf
- A preliminary note on Horizon Split Ambient Occlusion: http://levelofdetail.wordpress.com/2008/02/27/recovering-from-i3d-and-gdc/
- NullSquared posted a demo on the Ogre forums http://www.ogre3d.org/phpBB2/viewtopic.php?t=41320
- Implemented in 3dsMax http://www.malmer.nu/index.php/2008-04-09_ssao-c-style-pseudo-source-code/
- An interesting discussion in a thread at Level of Detail.
- Shalinor discusses the related technique of Crease Shading
Thanks to everyone's contributions in the comments!