Under the Hood: Explaining SSAO

Today we'd like to explain a new graphical feature that we are introducing to our games with update 1.38. The article is quite technical - we asked our programmers to help out, and the explanation gets fairly complex. Still, we felt that at least some of the people in our audience may find it interesting to be exposed to this material - to see that what happens under the hood of a game engine involves a lot of research and hard work by our programming team. In addition to the technical details, we thought that providing the context and explaining the performance trade-offs would be useful and important for most players.

Jaroslav a.k.a. Cim (one of our brave and skilled programmers working on graphics improvements)

The TL;DR summary of the text below is that Screen Space Ambient Occlusion is a cool new but performance-heavy technique to enrich the rendering of our game world. You do not have to use it if you feel it lowers performance too much for your liking, or you may like it and can afford to trade a few frames per second for improved shadow and depth perception. The effect may be subtle - it mostly works on a subconscious level - but once you get used to it, it may be hard to go back. It is another milestone in the lighting/shadowing improvement plan that we are now executing, to be followed by new HDR light processing and the introduction of more normal-mapped surfaces in upcoming updates.

The technique has its limitations and quirks. It has been used by several AAA games in recent years, and even if it's not perfect, it helps the human perception system understand the scene better, and we hope that adding it to the technology mix of our truck sims is beneficial. We will no doubt want to introduce additional shadowing techniques in the future that improve on it or even supersede it.

We are under constant pressure from a vocal subset of our fan base to improve the looks of our game. At the same time, there is always a desire to make the games run faster. On top of these sometimes competing requests coming from the player base, our very own art department is ever eager to get hold of new graphical toys to make our games richer and better. Whenever we introduce a new graphical feature into the game, we try to do it in a way that doesn't hurt performance for players with older computers - we don't want the game to become unplayable for our existing customers. That's why there is an option to switch SSAO off completely, as well as several settings for its quality/performance.

The work of our programmers on the new SSAO/HBAO techniques also required changes to our art creation pipeline. All the 3D models in our games had to be revisited by the art department, and any instances where fake shadows and darkening had already been applied to a model by an artist were changed. For some of the more complex game models, this simplification actually reduced their triangle count enough to improve rendering performance. To some extent, we have traded part of the tentative future manual effort that would be needed to build individual good-looking 3D models for an algorithmic rendering pass that unifies the shadowing look of the whole scene, helping to "root" objects like buildings, lamp posts, and vegetation to the terrain.

What SSAO stands for and how it works

Before we start - note that SSAO is a general acronym for "screen space ambient occlusion". The name encompasses all the various ambient occlusion (AO) techniques and their variants that work in screen space (meaning that they obtain all their information at runtime from data rendered to the screen and to related memory buffers). There is SSAO (the Crytek 2007 technique that basically gave the general name to all of them), MSSAO, HBAO, HDAO, GTAO, and many other techniques, each using a differently tuned approach, each with its benefits and downsides. We have based our approach on a horizon-based technique called GTAO that was introduced in a 2016 paper by Activision.

The ambient occlusion (AO) part of the name means that we evaluate how much of the incoming light (predominantly sky light, but sometimes the computed occlusion also gets applied to other lights) gets occluded at a particular place in the game world. Imagine that you are standing on flat ground - you would see the whole sky above, so there is 0% occlusion and the ground gets fully lit by the sky. Now imagine that you are at the bottom of a well - you would see only a small patch of sky, which means the sky is occluded almost 100% and contributes only a little to the ambient lighting in that well; naturally, it is quite dark at the bottom. The level of ambient occlusion at a particular place affects the lighting computations and creates shadowed areas in creases, holes, and other 'complex' places. It can land anywhere between 0% and 100% depending on the surroundings.
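
For readers who enjoy formulas, one common way to write this down (our own notation, not something taken from the original article) is as the fraction of the hemisphere above a point p from which the sky is visible:

    A(p) = \frac{1}{\pi} \int_{\Omega} V(p, \omega) \, (n \cdot \omega) \, \mathrm{d}\omega

Here V(p, ω) is 1 if a ray leaving p in direction ω reaches the sky and 0 otherwise, and n is the surface normal. The occlusion described above is then 1 - A(p): roughly 0% on open flat ground, close to 100% at the bottom of the well.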

Computing the occlusion in high detail and precision is resource-intensive; basically, you would need to shoot rays from every evaluated position in all directions, test whether they hit the sky or not, and then average the result. The more rays you shoot, the better the information you get, but at a greater computation cost. This could be done off-line, for example when the game map gets saved by its designer, and some games and engines use this approach. But that way you can only bake ambient occlusion information for static, non-moving objects, because no vehicles or animated objects are present at that time.
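
As a rough illustration of that brute-force idea, here is a minimal sketch in Python (not code from our engine): the scene and its ray test are assumed to live elsewhere and are represented by a hypothetical scene_blocks(origin, direction, max_distance) callback.

    import math
    import random

    def ambient_visibility(point, normal, scene_blocks, num_rays=64, max_distance=2.0):
        # Shoot random rays over the hemisphere above 'point' and count how
        # many escape to the sky; 1.0 = open sky, 0.0 = fully occluded.
        unoccluded = 0
        for _ in range(num_rays):
            # Pick a random direction by rejection-sampling the unit sphere...
            while True:
                d = [random.uniform(-1.0, 1.0) for _ in range(3)]
                if 0.0 < sum(c * c for c in d) <= 1.0:
                    break
            length = math.sqrt(sum(c * c for c in d))
            d = [c / length for c in d]
            # ...and flip it into the hemisphere around the surface normal.
            if sum(a * b for a, b in zip(d, normal)) < 0.0:
                d = [-c for c in d]
            if not scene_blocks(point, d, max_distance):
                unoccluded += 1
        return unoccluded / num_rays

The more rays (taps) you pay for, the closer this average gets to the true occlusion - exactly the cost/quality trade-off mentioned above.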

So instead of baking static information (which would also take a lot of time and storage space given the scale of our world map), we want to compute it on the fly, at run-time. That way we can also compute it for interactions with vehicles, opening bridges, animated objects, and so on. There is a catch, though. With such an approach, we only have the data that is visible on the screen (recall "screen space"), so once some part of the game world leaves the visible frame, it can't be used for occlusion evaluation. This limitation creates various artifacts, such as occlusion disappearing from a wall when the object casting it slips past the edge of the screen - it becomes invisible not just to you but also to the algorithm, so it stops contributing to the occlusion computation.

Ok, now we know what to evaluate (ambient occlusion) and we know what data we have (what we see on screen). What do we do? Well, for each pixel on the screen (that is 2 million pixels in HD resolution, times four(!) at 400% scaling), our shader code needs to query the z-buffer values of surrounding pixels to get a notion of the geometric shape of the area around it. We can only do a limited number of these "taps", as the performance cost rises steeply with the tap count - this operation really taxes the 3D accelerator. The limit on the number of taps, in turn, affects the precision of the ambient occlusion (and in certain situations may create inaccuracies and banding). Imagine that you want to evaluate your surroundings along a 2-meter straight line and are willing to spend 8 taps to approximate it. You query the line every 25 centimeters, and any detail smaller than that may go completely unnoticed unless you are lucky and hit it spot on (or unlucky, because you may miss it the next frame, so the surroundings would suddenly appear to change between frames and cause flickering). The further your algorithm probes, the less precise it is. So you need to limit the size of the area you analyze around each pixel, which in turn limits how far the AO 'sees' - that is why it is not suitable for computing occlusion in large spaces like under bridge arches.
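
The numbers from that example, spelled out in a couple of lines (the variable names are ours):

    radius_m = 2.0              # how far around the pixel we are willing to look
    taps = 8                    # how many z-buffer taps we can afford per direction
    step_m = radius_m / taps    # 0.25 m - details thinner than this can fall between taps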

We have mentioned that the technique of our choice is horizon-based. This means that we are not probing the environment by shooting rays in the 3D world; instead, we analyze the hemisphere above/around each pixel to see how far it opens up before it is blocked - how much light the surrounding geometry lets in - using the z-buffer as our proxy. The hemisphere is actually approximated by several runs along a line rotated around the given pixel. If we can follow this hemisphere in full, there is no occlusion. If the algorithm taps a value in the z-buffer that would block incoming light, that defines the level of occlusion. The algorithm is optimized for performance, but its limitation is that once it hits anything, even a possibly small object, it stops probing any further. This may cause an "over occlusion" problem, visible as an artifact when some relatively thin object, such as a traffic sign post, casts strong occlusion onto a nearby wall. You can try to detect such small objects and skip them, which in turn may produce "under occlusion" on thin ledges. We have opted for the former.
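
To make the horizon idea a bit more concrete, here is a heavily simplified "flatland" sketch in Python (our own illustration, not the actual GTAO shader): we walk outward along one screen-space direction, use a small list of depths as a stand-in for the z-buffer, and keep track of the highest horizon angle seen so far.

    import math

    def horizon_occlusion(center_depth, sample_depths, step_m):
        # Walk away from the pixel one tap at a time and remember the highest
        # "horizon" seen; everything below that horizon no longer blocks the sky.
        max_sin = 0.0
        for i, depth in enumerate(sample_depths):
            distance = step_m * (i + 1)          # how far this tap is from the pixel
            height = center_depth - depth        # how much the sample rises above us
            if height <= 0.0:
                continue                         # sample is lower, it occludes nothing
            sin_horizon = height / math.hypot(height, distance)
            # A single tall, thin sample raises the horizon for good - this is
            # the "over occlusion" case described above.
            max_sin = max(max_sin, sin_horizon)
        return max_sin                           # 0.0 = open horizon, 1.0 = fully blocked

The real implementation does this along several rotated directions per pixel and converts the horizon angles into the final occlusion term, but the "remember the highest blocker" behaviour is the same.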

There is also another interesting and useful property of horizon-based techniques. Depending on how much of the hemisphere above a given pixel is occluded, you can compute the direction that is least occluded. The amount of occlusion can then be thought of as an ice cream cone with a varying apex angle, oriented in that direction. This direction is called a "bent normal", and we use it for various lighting computations, for example to occlude reflections on shiny surfaces. The idea is that if you look at the surface and the mirror-reflected direction falls outside this cone, we consider it (at least partially) occluded and tune down the reflection intensity. The best way to see that effect is to look at bigger, rounded chrome parts, like the diesel tanks, with SSAO on and off.
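
A minimal sketch of that cone test (our own notation and an arbitrary falloff choice, not the exact shader code): if the mirror-reflected view direction leaves the cone around the bent normal, the reflection gets faded out.

    import math

    def reflection_visibility(reflect_dir, bent_normal, cone_half_angle, falloff=math.radians(30.0)):
        # Both directions are assumed to be normalized 3-component tuples.
        cos_angle = sum(r * b for r, b in zip(reflect_dir, bent_normal))
        angle = math.acos(max(-1.0, min(1.0, cos_angle)))
        if angle <= cone_half_angle:
            return 1.0                           # inside the cone: full reflection
        # Outside the cone: fade the reflection out over the falloff range.
        return max(0.0, 1.0 - (angle - cone_half_angle) / falloff)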

So you see, the idea is not that hard - for an expert graphics programmer anyway ;) - but there is a lot of computation involved, putting quite some strain on the 3D accelerator. That is why we have created several performance profiles, each using a mix of optimization techniques (a rough sketch of how such presets could be combined follows the list):

  • Using fewer taps per direction - faster, but lets the AO miss bigger objects than it would with finer sampling.
  • Reprojecting AO results from the previous frame - lets us hide the artifacts from undersampling, but may create ghosting when the reprojection fails (when what you see changes a lot between frames).
  • Rendering at half resolution - reduces the number of computations to 1/4, but produces less fine AO - the result may be slightly blurry shadowing.
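
Purely as an illustration of how these knobs combine into presets (the names and numbers below are made up for the example, they are not the actual in-game settings):

    SSAO_PRESETS = {
        "low":    {"taps_per_direction": 4, "reproject_previous_frame": True, "half_resolution": True},
        "medium": {"taps_per_direction": 6, "reproject_previous_frame": True, "half_resolution": True},
        "high":   {"taps_per_direction": 8, "reproject_previous_frame": True, "half_resolution": False},
    }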

We hope that all this info was interesting and useful for you. We're sending you a virtual high-five if you have read the article all the way to this point - you deserve a cookie and a big cup of hot chocolate! If you still want more details about this topic, feel free to check, for example, this link[www.cse.chalmers.se].

Thank you for your time and support, and we will see you again in one of the next articles from the "Under the Hood" series we bring to our #BestCommunityEver from time to time.

Also, don't forget that the Steam summer sale is ending soon! Check out our developer page.
