This is the third part of the online series dedicated to Journey Sand Shader.

- Part 1. A Journey Into Journey’s Sand Shader
- Part 2. Journey Sand Shader: Diffuse Colour
- **Part 3. Journey Sand Shader: Sand Normal**
- Part 4. Journey Sand Shader: Specular Reflection
- Part 5. Journey Sand Shader: Glitter Reflection
- Part 6. Journey Sand Shader: Sand Ripples

In this third post, we will focus on the normal mapping that will turn smooth 3D models into sandy dunes.

In the previous part of this online course, we implemented the diffuse lighting of Journey’s sand. With that effect alone, the desert dunes would appear rather flat and dull.

One of the most intriguing effects that can be seen in Journey is the granularity of the sand. By looking at any screenshot, we have the impression that the dunes are not smooth and homogeneous; they are made out of millions of microscopic grains of sand.

This effect can be achieved using a technique called **bump mapping**, which allows light to reflect on a flat surface as it would on a more complex one. You can see how this effect changes the rendering below:

You can appreciate the subtle differences in the zoomed boxes below:

## Understanding Normal Mapping

Sand is made out of countless grains, all different in shape and composition (below). Each individual grain reflects light in a potentially random direction. One way to achieve such an effect would be to create a 3D model that contains all of those microscopic grains. That is infeasible, due to the immense number of polygons it would require.

There is another solution, which is often used to simulate a more complex geometry than the one that a 3D model actually has. Each vertex or face of a 3D model is associated with a parameter called its **normal direction**: a vector of length one that is used to calculate how light reflects on the surface of the 3D model. Modelling sand means modelling the seemingly random distribution of those grains and, consequently, the way they affect the surface normals.

There are countless ways in which this could be done. The simplest is to author a texture that alters the original normal directions of the dune’s model.

The **surface normal** is generally computed from the geometry of the 3D model. However, it is possible to perturb it using a **normal map**. Normal maps are textures that allow simulating a more complex geometry than the one actually present, by changing the local orientation of the surface normals. This technique is often called **bump mapping**.

Altering the normals is a relatively easy task that can be done in the `surf` function of a **surface shader**. This function receives two parameters, one of which is a `struct` called `SurfaceOutput`. It contains all the properties necessary to draw a part of the 3D model, from its colour (`o.Albedo`) to its transparency (`o.Alpha`). Another property it contains is the normal direction (`o.Normal`), which can be overwritten to alter how light will reflect on the model.

Following Unity’s documentation on surface shaders (Writing Surface Shaders), all normals written to the `o.Normal` field of `SurfaceOutput` must be expressed in **tangent space**:

```hlsl
struct SurfaceOutput
{
    fixed3 Albedo;   // diffuse color
    fixed3 Normal;   // tangent space normal, if written
    fixed3 Emission;
    half   Specular; // specular power in 0..1 range
    fixed  Gloss;    // specular intensity
    fixed  Alpha;    // alpha for transparencies
};
```

This is a way of saying that the unit vectors must be expressed in a coordinate system that is relative to the actual normal of the mesh. For instance, writing `float3(0, 0, 1)` to `o.Normal` leaves the normal unchanged.

```hlsl
void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = _SandColor;
    o.Alpha  = 1;
    o.Normal = float3(0, 0, 1);
}
```

That is because the vector `float3(0, 0, 1)` is indeed the normal vector, expressed relative to the 3D model geometry.

So, all we need to do to alter the surface normal in a **surface shader** is to write the new vector to `o.Normal` in the **surface function**:

```hlsl
void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = _SandColor;
    o.Alpha  = 1;
    o.Normal = ... // change the normal here
}
```

The rest of this post will provide an initial approximation, which will be further expanded in the sixth instalment of this series: Journey Sand Shader #6: Sand Ripples.


## Sand Normal

The most problematic part is understanding *how* the grains of sand alter the surface normal. While it is true that, individually, each grain can scatter light in any direction, this is not what happens overall. Any physically-based approach should study the distribution of normal vectors on a patch of sand, and model it mathematically. While there are indeed models that do this, the solution presented in this course is much simpler, yet very effective.

For each point on the model, a **random unit vector** is sampled from a texture. Then, the surface normal is tilted towards that vector by a certain amount. By carefully authoring the random texture and choosing an appropriate blending amount, we can perturb the surface normal just enough to add a grainy feel to it, without losing the overall curvature of the dunes.

Random values can be sampled using a texture filled with random colours. The R, G and B components of each pixel are used as the X, Y and Z components of a normal vector. Colour components are in the range [0, 1], so they must be remapped to [-1, +1]. The resulting vector is then normalised to ensure its length is equal to 1.
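The same remapping can be sketched outside the shader. The Python snippet below is only an illustration of the maths, with hypothetical RGB values in place of an actual texture sample:

```python
import math

def rgb_to_direction(r, g, b):
    # Remap each colour component from [0, 1] to [-1, +1]...
    v = [r * 2 - 1, g * 2 - 1, b * 2 - 1]
    # ...then normalise so the resulting vector has length 1.
    length = math.sqrt(sum(c * c for c in v))
    return [c / length for c in v]

# A mid-grey pixel with a full blue channel maps to the +Z direction,
# which in tangent space is the unperturbed normal.
direction = rgb_to_direction(0.5, 0.5, 1.0)
```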

❗ Creating random textures

There are many ways to generate a random texture. What is critical for this effect is the overall distribution of the random vectors that can be sampled from it.

In the image shown above, each pixel is completely random. There is no overall direction (colour) which is more prevalent, as each one has the same probability as any other. This translates to a type of sand that scatters light in all directions.

During his GDC talk, John Edwards clearly stated that the random texture that was used for Journey’s sand was generated from a Gaussian distribution. That ensured that the predominant direction was the one aligned with the surface normal.
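Such textures can be generated offline. The sketch below (assuming NumPy, with a hypothetical 256×256 resolution) contrasts the two distributions discussed above: a fully uniform texture, and a Gaussian one whose vectors cluster around the unperturbed normal:

```python
import numpy as np

size = 256  # hypothetical texture resolution

# Uniform variant: every channel is independent,
# so no direction (colour) dominates.
uniform_tex = np.random.rand(size, size, 3)

# Gaussian variant: directions cluster around the surface normal,
# which in tangent space is (0, 0, 1) and encodes as RGB (0.5, 0.5, 1.0).
gaussian_tex = np.random.normal(loc=(0.5, 0.5, 1.0),
                                scale=0.15,
                                size=(size, size, 3))
gaussian_tex = np.clip(gaussian_tex, 0.0, 1.0)
```

Saving either array as an image yields a texture the shader can sample; the Gaussian variant produces calmer sand, as most perturbed normals stay close to the original one.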

❓ Do I need to normalise the random vectors?

The image that I have used to sample random vectors was generated with a completely random process. Not only is each pixel generated independently, but the R, G and B components within the same pixel are also independent. This means that, generally speaking, vectors sampled from that texture are not guaranteed to have length equal to 1.

You can indeed generate a texture in which each pixel, once remapped from [0, 1] to [-1, +1], is supposed to have length 1. However, there are two problems.

First, inaccuracies, filtering and floating-point errors might introduce a significant error in your calculations. Second, you lose the guarantee of *unit sampling* when mipmapping is used, as multiple colours are blended together to create lower resolution versions of the original texture.
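The loss of unit length is easy to verify: averaging two distinct unit vectors, which is essentially what mipmapping does when downsampling texels, produces a shorter vector. A quick check in Python, using two hypothetical texel directions:

```python
import math

def length(v):
    return math.sqrt(sum(c * c for c in v))

a = [1.0, 0.0, 0.0]  # unit vector along X
b = [0.0, 1.0, 0.0]  # unit vector along Y

# Blending the two texels averages their components...
blended = [(x + y) / 2 for x, y in zip(a, b)]
# ...and the result is no longer a unit vector: its length is sqrt(0.5),
# roughly 0.707.
```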

To avoid any issue, you should always normalise your vectors.

## Implementation

The previous part of this course introduced the concept of normal mapping when it presented the very first draft for the **surface function** `surf`. Recalling the diagram presented at the beginning of this article, you can see that there are two effects that are necessary to reproduce Journey’s sand rendering. The first one (the *sand normal*) is discussed in this article, while the other one (the *sand ripples*) will be explored in Journey Sand Shader #6: Sand Ripples.

```hlsl
void surf (Input IN, inout SurfaceOutput o)
{
    o.Albedo = _SandColor;
    o.Alpha  = 1;

    float3 N = float3(0, 0, 1);
    N = RipplesNormal(N); // Covered in Journey Sand Shader #6
    N = SandNormal   (N); // Covered in this article

    o.Normal = N;
}
```

In the section above we introduced the idea of bump mapping, indicating that part of the effect requires sampling a texture (referred to, in the code, as `uv_SandTex`).

One problem with the code above is that the calculations require knowing the actual position of the point we are currently drawing. In fact, sampling a texture requires a **UV coordinate**, which indicates which pixel to read. If the 3D model that we are using is relatively flat and UV mapped, it is possible to use its UV coordinates to sample the random texture:

```hlsl
N = RipplesNormal(IN.uv_SandTex.xy, N);
N = SandNormal   (IN.uv_SandTex.xy, N);
```

Alternatively, one could also use the world position (`IN.worldPos`) of the point being rendered.

We can now finally focus on `SandNormal` and its implementation. As said in the previous sections, the idea is to sample a pixel from a random texture and use it (once appropriately remapped into a unit vector) as the new normal.

```hlsl
sampler2D_float _SandTex;

float3 SandNormal (float2 uv, float3 N)
{
    // Random vector
    float3 random = tex2D(_SandTex, uv).rgb;

    // Random direction
    // [0,1]->[-1,+1]
    float3 S = normalize(random * 2 - 1);

    return S;
}
```

❓ How do I rescale the random texture?

Based on the UV mapping of your 3D model, you might either have very large or very small grains. The best way to move forward is to introduce parameters to scale the texture, so that you can tweak it as much as you like from the inspector.

This is a standard feature that Unity offers on all textures. To use it, it is necessary to define another variable called `_SandTex_ST`. Unity will automatically associate it with the already existing variable (and its property) `_SandTex`.

The variable `_SandTex_ST` will contain four values: the preferred scale and offset of the texture. These values can be tweaked directly from the inspector, and will automatically appear under the texture slot as *Tiling* and *Offset*:
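What the `_ST` variable encodes is a simple affine transform of the UV coordinates: the first two components scale (tile) them, and the last two offset them. A small sketch of that maths, with hypothetical tiling values:

```python
def transform_tex(uv, st):
    # st = (tiling_x, tiling_y, offset_x, offset_y),
    # mirroring how Unity packs Tiling and Offset into the _ST variable.
    return (uv[0] * st[0] + st[2], uv[1] * st[1] + st[3])

# Tiling the texture 4 times in both directions, with no offset:
uv = transform_tex((0.25, 0.5), (4.0, 4.0, 0.0, 0.0))
# uv is now (1.0, 2.0); the sampler wraps it back into [0, 1],
# repeating the texture.
```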

In order for these changes to be reflected in the texture sampling, we need to use the `TRANSFORM_TEX` macro, as seen below:

```hlsl
sampler2D_float _SandTex;
float4 _SandTex_ST;

float3 SandNormal (float2 uv, float3 N)
{
    // Random vector
    float3 random = tex2D(_SandTex, TRANSFORM_TEX(uv, _SandTex)).rgb;

    // Random direction
    // [0,1]->[-1,+1]
    float3 S = normalize(random * 2 - 1);

    return S;
}
```

## Tilting the Normal

The snippet presented in the section above works, but does not yield very good results. The reason is simple: if we simply return a completely random normal, we effectively lose the perception of curvature. In fact, the normal direction is used to calculate how light reflects on a surface, and its primary use is to shade the model according to its curvature.

You can see the difference in the images below. On the left, the normals of the dunes are completely random, and it is impossible to see where one dune ends and the next one starts. On the right, only the normal of the model is used, resulting in an aesthetic that is too smooth.

Both solutions are inadequate. What we need is a blend of the two. The random direction sampled from the texture should be used to *tilt* the normal direction by some amount, as seen below:

The operation described in the diagram above is known as **slerp**, which stands for **spherical linear interpolation**. *Slerp* works exactly like lerp, with the difference that it can be used to safely interpolate between unit vectors, producing other unit vectors.

Unfortunately, the proper implementation of slerp is rather expensive. And for an effect that is mostly based on randomness, it makes little sense to use it.

It is important to notice that if we use the traditional **linear interpolation**, the resulting vector would look quite different:

Lerping between two distinct unit vectors is not guaranteed to produce another unit vector. In fact, it never does, except when the coefficient is either 0 or 1.

That being said, normalising the result of lerp indeed produces a unit vector that is surprisingly close to the actual result that slerp would produce:

```hlsl
float3 nlerp(float3 n1, float3 n2, float t)
{
    return normalize(lerp(n1, n2, t));
}
```

This technique, called **nlerp**, has been proposed as a close approximation of slerp. Its usage was popularised by Casey Muratori, one of the developers behind The Witness. If you are interested in reading more about these topics, I suggest Understanding Slerp, Then Not Using It by Jonathan Blow, and Math Magician – Lerp, Slerp, and Nlerp.
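How close the approximation is can be checked numerically. The sketch below implements both interpolations in plain Python and compares them on two hypothetical unit vectors; for the small tilting angles involved in this effect, the results are nearly identical:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def lerp(a, b, t):
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

def slerp(a, b, t):
    # Proper spherical interpolation: constant angular speed,
    # but it needs acos and sin, which are comparatively expensive.
    dot = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    omega = math.acos(dot)
    sin_omega = math.sin(omega)
    wa = math.sin((1 - t) * omega) / sin_omega
    wb = math.sin(t * omega) / sin_omega
    return [wa * x + wb * y for x, y in zip(a, b)]

def nlerp(a, b, t):
    # Cheap approximation: lerp, then renormalise.
    return normalize(lerp(a, b, t))

a = [0.0, 0.0, 1.0]              # unperturbed normal
b = normalize([0.3, -0.2, 1.0])  # a slightly tilted random direction

s = slerp(a, b, 0.25)
n = nlerp(a, b, 0.25)
# Both results are unit vectors, and their components differ
# by a tiny amount at these angles.
```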

Using nlerp, we can now efficiently tilt the normal vectors towards the randomised direction that was sampled from `_SandTex`:

```hlsl
sampler2D_float _SandTex;
float _SandStrength;

float3 SandNormal (float2 uv, float3 N)
{
    // Random vector
    float3 random = tex2D(_SandTex, uv).rgb;

    // Random direction
    // [0,1]->[-1,+1]
    float3 S = normalize(random * 2 - 1);

    // Rotates N towards S based on _SandStrength
    float3 Ns = nlerp(N, S, _SandStrength);

    return Ns;
}
```

The result can be seen below:


## What’s Next…

In this third part of the online series about the sand rendering in Journey, we focused on how its inhomogeneous look was achieved using random textures and normal maps.

In the next part, Journey Sand Shader: Specular Reflection, we focus on the shimmering reflections that make Journey’s dunes appear almost like an ocean.

- Part 1. A Journey Into Journey’s Sand Shader
- Part 2. Journey Sand Shader: Diffuse Colour
- **Part 3. Journey Sand Shader: Sand Normal**
- Part 4. Journey Sand Shader: Specular Reflection
- Part 5. Journey Sand Shader: Glitter Reflection
- Part 6. Journey Sand Shader: Sand Ripples

### Credits

The videogame Journey is developed by **Thatgamecompany** and published by **Sony Computer Entertainment**. It is available for PC (Epic Store) and PS4 (PS Store).

The 3D models of the dunes, backgrounds and lighting settings were made by Jiadi Deng.

The 3D model of the Journey’s player was found on the (now closed) FacePunch forum.

## Download Unity Package


If you want to recreate this effect, the full Unity package is available for download on Patreon. It includes everything needed, from the shaders to the 3D models.
