Most (if not all) optical phenomena that materials exhibit can be replicated by simulating how the individual rays of light propagate and interact. This approach, referred to in the scientific literature as ray tracing, is often too computationally expensive for any real-time application. Most modern engines rely on massive simplifications that, despite being unable to reproduce photorealism, can produce a believable approximation. This tutorial introduces a fast, cheap and convincing solution that can be used to simulate translucent materials which exhibit subsurface scattering.
This is a two-part series:
- Part 1. Fast Subsurface Scattering in Unity
- Part 2. Fast Subsurface Scattering in Unity
At the end of this post, you will find a link to download the Unity project.
Introduction
The Standard material in Unity comes with a Transparent rendering mode, which allows rendering transparent materials. Transparency, in this context, is implemented with alpha blending: a transparent object is rendered on top of existing geometry, partially showing what is behind it. While this works for many materials, transparency is a special case of a more general property called translucency (sometimes also called translucidity). While transparent materials only affect the amount of light they let through (below, left), translucent ones can also alter its path (below, right).
The result of this behaviour should be clear: translucent materials diffuse the light rays they let through, blurring what is behind them. Such a behaviour is rarely seen in games, since it is significantly more complex to implement. Transparent materials can be implemented naively with alpha blending, without any ray tracing. Translucent materials, on the other hand, require simulating the deviation of the light rays. Such a computation is very expensive and is rarely worth it in real-time rendering.
This also prevents the reproduction of other optical phenomena, such as subsurface scattering. When light hits the surface of a translucent material, part of it propagates inside, bouncing between the molecules until it finds its way out. This often causes light absorbed at one point to be re-emitted somewhere else. Subsurface scattering results in a diffuse glow that can be seen in materials such as skin, marble and milk.
Real Time Translucency
There are two main obstacles that make translucency so expensive. The first one is that it requires simulating the scattering of light rays inside a material. Each ray can split into multiple ones, reflecting hundreds or even thousands of times inside a material. The second obstacle is that light received at one point is re-emitted somewhere else. While this may seem a minor issue, it is in fact a big deal.
To understand why, we first need to look at how most shaders work. In the realm of real-time rendering, GPUs expect a shader to be able to calculate the final colour of a material using only local properties. For each vertex, shaders are designed to efficiently access only the properties that are local to that vertex. Reading the normal direction and albedo of a vertex is easy; retrieving the ones of its neighbours is not. Most real-time solutions must work around these constraints and find a way to fake the propagation of light within a material without relying on non-local information.
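To make this constraint concrete, here is a minimal sketch of a custom lighting function for a Unity surface shader (the name LightingTranslucent is a placeholder): every input it receives is local to the point being shaded.

```c
// Only local data is available here: the normal and albedo of the
// point being shaded, plus the light and view directions at that point.
// There is no way to query the properties of neighbouring points.
half4 LightingTranslucent (SurfaceOutput s, half3 lightDir, half3 viewDir, half atten)
{
    half NdotL = saturate(dot(s.Normal, lightDir));
    return half4(s.Albedo * _LightColor0.rgb * NdotL * atten, s.Alpha);
}
```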
The approach described in this tutorial is based on the solution presented at GDC 2011 by Colin Barré-Brisebois and Marc Bouchard in a talk called Approximating Translucency for a Fast, Cheap and Convincing Subsurface Scattering Look. Their solution is integrated into the Frostbite 2 engine, which was used for DICE's Battlefield 3. While not physically accurate, the approach presented by Colin and Marc produces very believable results at a very small cost.
The idea behind their solution is very simple. In opaque materials, the light contribution comes directly from the light source. Vertices whose normals are inclined more than 90 degrees with respect to the light direction $L$ receive no light (bottom, left). According to the model proposed in the presentation, translucent materials have an additional light contribution which is related to $-L$. Geometrically, $-L$ can be seen as if some of the light actually passed through the material and made it to the other side (bottom, right).
Each light now accounts for two distinct reflectance contributions: the front and the back illumination. Since we want our materials to be as realistic as possible, we will use Unity's Standard PBR lighting model for the front illumination. What we need is a way to describe the contribution from $-L$, and to render it in a way that somehow simulates the diffusion process which might have occurred inside the material.
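Conceptually, each light's contribution is the sum of those two terms, as in the sketch below; standardPBR and backTranslucency are hypothetical placeholder functions for the two contributions discussed in the rest of this tutorial:

```c
// Conceptual structure only: standardPBR and backTranslucency are
// placeholder names, not real Unity functions.
half3 frontLit = standardPBR(s, lightDir, viewDir);       // Unity's Standard PBR model
half3 backLit  = backTranslucency(s, lightDir, viewDir);  // light passing through
half3 color    = frontLit + backLit;                      // each light contributes twice
```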
Back Translucency
As discussed before, the final colour of each pixel is the sum of two components. The first one is the "traditional" lighting. The second one is the light contribution from a virtual light source illuminating the back of our model. This gives the impression that light from the original source has actually passed through the material.
To understand how to model this mathematically, let's picture the following two scenarios (diagrams below). We are currently drawing the red point; since it is on the "dark" side of the material, it should be illuminated by $-L$. Let's analyse the two extreme cases from the perspective of an external viewer. When the view direction $V$ is perfectly aligned with $-L$, the viewer should see the back translucency at its fullest. Conversely, when $V$ is perpendicular to $-L$, the viewer should see the least amount of backlight.
If you are not new to shader coding, this kind of reasoning should sound familiar. We encountered something similar in the tutorial on Physically Based Rendering and Lighting Models in Unity 5, where we showed how such a behaviour can be obtained using a mathematical operator called the dot product.
As a first approximation, we can say that the amount of back lighting due to translucency, $I_{back}$, is proportional to $V \cdot -L$. In a traditional diffuse shader, the analogous quantity would be $N \cdot L$. Notice that we have not included the surface normal in the calculation, as the light is simply coming out of the material, not reflecting off it.
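In Cg, this first approximation could be written as the following sketch (variable names are illustrative):

```c
// First approximation of the back lighting: maximal when the viewer
// is looking straight into the light coming through the material.
half backLight = saturate(dot(viewDir, -lightDir));
```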
Subsurface Distortion
However, the surface normal should have some influence, even if minor, on the angle at which the light leaves the material. The authors of this technique introduced a parameter called subsurface distortion, $\delta$, which forces the vector $L$ to point towards $N$. Physically speaking, the subsurface distortion controls how strongly the surface normal deflects the outgoing back light. Following the proposed solution, the intensity of the back translucency component becomes:

$$I_{back} = V \cdot -\langle L + N\delta \rangle$$
Where $\langle L + N\delta \rangle$ is a unit vector that points in the same direction as $L + N\delta$. If you are familiar with Cg/HLSL, that is the normalize function.
When $\delta = 0$, we return to the $V \cdot -L$ derived in the previous paragraph. When $\delta = 1$, however, we are calculating the dot product between the view direction and $-\langle L + N \rangle$. If you are familiar with the Blinn-Phong reflectance, you should know that the normalised sum of two unit vectors is the vector "in between" them; here, $\langle L + N \rangle$ lies in between $L$ and $N$. For this reason, we will call it the halfway direction $H$.
The diagram above shows all the directions used so far. $H$ is indicated in purple, and you can see that it rests in between $L$ and $N$. Geometrically speaking, varying $\delta$ from $0$ to $1$ shifts the perceived direction of the backlight from $-L$ to $-H$. The light-shaded area shows the range of directions the backlight can come from. In the image below you can see that with $\delta = 0$, the object seems to be illuminated from $-L$; as $\delta$ moves towards $1$, the perceived direction of the light source shifts towards $-H$.
The purpose of $\delta$ is to simulate the tendency of certain translucent materials to diffuse the backlight with different intensities. Higher values of $\delta$ will cause the backlight to scatter more.
❓ Is this H the same H used in the Blinn-Phong Reflectance?
No. The Blinn-Phong reflectance defines $H$ as $\langle L + V \rangle$. Here, we are using the same letter to indicate $\langle L + N \rangle$.
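A quick illustrative sketch in Cg, purely to contrast the two definitions (lightDir, viewDir and normal are assumed to be unit vectors):

```c
// Blinn-Phong: H lies in between the light and view directions.
half3 H_blinn = normalize(lightDir + viewDir);
// This tutorial (delta = 1): H lies in between the light direction and the normal.
half3 H_back  = normalize(lightDir + normal);
```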
❓ Is $\delta$ really interpolating between $L$ and $L+N$?
Yes. Varying $\delta$ from $0$ to $1$ linearly interpolates between $L$ and $L+N$. This can be seen by unfolding the traditional definition of linear interpolation from $L$ to $L+N$ based on $\delta$:

$$\text{lerp}(L,\, L+N,\, \delta) = L\,(1-\delta) + (L+N)\,\delta = L - L\delta + L\delta + N\delta = L + N\delta$$
❓ How come the authors did not normalise $L + N\delta$?
Geometrically speaking, the quantity $L + N\delta$ does not have unit length and hence needs to be normalised. In their final solution, however, the authors do not perform this normalisation step.
Ultimately, this entire effect is intended to be neither photo-realistic nor physically based. During their presentation, the authors made it very clear that the technique was intended to be used as a fast approximation of translucency and subsurface scattering behaviours. Normalising $L + N\delta$ does not change the result much, but it introduces an additional computational cost.
Back Light Diffusion
At this point in the tutorial, we already have an equation that we can use to simulate translucent materials. The quantity $I_{back}$, however, cannot be used as it is to calculate the final light contribution: we still need to control how it maps to the intensity of the backlight.
There are two main approaches to do this. The first one relies on a texture. If you want full artistic control over the way light diffuses in the material, you should clamp $I_{back}$ between $0$ and $1$ and use it to sample a ramp texture that determines the final intensity of the back light. Different ramp textures will simulate the light transport within different materials. We will see in the next part of this tutorial how this can be used to change the result of this shader dramatically.
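A minimal sketch of the texture-based approach, assuming hypothetical shader properties _RampTex and _Distortion:

```c
// Use the clamped back-lighting term as the U coordinate of a ramp
// texture that encodes how this material diffuses the backlight.
// _RampTex and _Distortion are placeholder shader properties.
half  backLit   = saturate(dot(viewDir, -(lightDir + normal * _Distortion)));
half3 backColor = tex2D(_RampTex, float2(backLit, 0.5)).rgb;
```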
The approach used by the authors of this technique, however, does not rely on a texture. It shapes the curve using Cg code only, following this expression:

$$I_{back} = \text{saturate}\big(V \cdot -\left(L + N\delta\right)\big)^{p} \cdot s$$
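In Cg, the curve above could be sketched as follows; _Distortion, _Power and _Scale are placeholder names for $\delta$, $p$ and $s$:

```c
// Distort the light direction with the normal, then shape the
// response with a power curve. As noted above, L + N * delta is
// deliberately left unnormalised in the authors' final solution.
half3 H = lightDir + normal * _Distortion;
half  I = pow(saturate(dot(viewDir, -H)), _Power) * _Scale;
```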
The two new parameters, $p$ (power) and $s$ (scale), are used to change the properties of the curve.
Conclusion
This post has explained the technical challenges of rendering translucent materials. An approximate solution was introduced, following the approach presented in Approximating Translucency for a Fast, Cheap and Convincing Subsurface Scattering Look. The next part of this tutorial will focus on how to actually implement this effect in a Unity shader.
- Part 1. Fast Subsurface Scattering in Unity
- Part 2. Fast Subsurface Scattering in Unity
If you are interested in more sophisticated approaches to simulate subsurface scattering for real-time applications, GPU Gems provides one of the best tutorials you can find.
You can download all the necessary files to run this project (shader, textures, models, scenes) on Patreon.