This online course is dedicated to interactive maps, and how to create them using Shaders in Unity.

This is a tutorial in three parts:
- Part 1: Interactive Map Shader: Vertex Displacement
- Part 2: Interactive Map Shader: Scrolling Effect
- Part 3: Interactive Map Shader: Terrain Shading
This effect will serve as the base for more advanced techniques, such as holographic projections and even Black Panther’s sand table.
A link to download the Unity package for this tutorial can be found at the end of this article.
The inspiration for this tutorial comes from a tweet that Baran Kahyaoglu posted to showcase some of the work he has been doing for Mapbox.
The scene (minus the map) comes from the Unity Visual Effect Graph Spaceship demo (below), which you can download here.
Anatomy of the Effect
The first thing that is easy to notice is that geographical maps are flat: when used as textures, they lack the three-dimensionality that a true 3D model of the same region would have.
The first solution you could implement is creating a 3D model of the region you want in your game, and then using the geographical map as its texture. That works perfectly, but it is time-consuming and prevents you from implementing the “scrolling” effect seen in Baran Kahyaoglu’s video.
A more technical approach is clearly the best way to move forward. Luckily, shaders can be used to alter the geometry of a 3D model, and this can be exploited to shape any flat plane into the valleys and mountains of the region we want.
For this tutorial, I will use a map of the Quillota region in Chile, which is known for its characteristic hills. The image below shows a texture of the region applied to a circular mesh.

While hills and mountains can be seen, they appear completely flat. This destroys any illusion of realism.
Normal Extrusion
The first step is to use shaders to alter the geometry using a technique called normal extrusion. What is needed is a vertex modifier: a function capable of manipulating the individual vertices of a 3D model.
How you use a vertex modifier changes based on the type of shader you have. In this tutorial, we will show how to edit a Standard Surface Shader, which is one of the types of shaders that you can create with Unity.
There are many ways we can manipulate the vertices of a 3D model. One of the very first techniques that most vertex shader tutorials teach is normal extrusion. The idea is to push each vertex “outwards” (extrude), giving the 3D model a more inflated look. The concept of “outwards” comes from the fact that each vertex is moved along its normal direction.

This works very well for smooth surfaces, but can create some weird artefacts for models whose vertices are not properly welded. This effect was also explained in one of my very first tutorials, A Gentle Introduction to Shaders, where I showed how to extrude and intrude a 3D model.

Adding normal extrusion to a surface shader is easy. Each surface shader has a #pragma directive, which is used to provide additional pieces of information and commands. One of these is vertex:vert, which indicates that the function called vert will be used to process each vertex of the 3D model.
The edited shader looks like this:
```
#pragma surface surf Standard fullforwardshadows addshadow vertex:vert

...

float _Amount;

...

void vert(inout appdata_base v)
{
    v.vertex.xyz += v.normal * _Amount;
}
```
Since we are changing the position of the vertices, we also need to use addshadow if we want the model to correctly cast shadows on itself.
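For context, this is roughly how the complete shader might look once the vertex modifier is added to a freshly generated Standard Surface Shader. The shader name, property names and ranges below are placeholders of my own, not part of the original code:

```
Shader "Custom/NormalExtrusion"
{
    Properties
    {
        _Color ("Color", Color) = (1,1,1,1)
        _MainTex ("Albedo (RGB)", 2D) = "white" {}
        _Glossiness ("Smoothness", Range(0,1)) = 0.5
        _Metallic ("Metallic", Range(0,1)) = 0.0
        _Amount ("Extrusion Amount", Range(0,1)) = 0.1
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 200

        CGPROGRAM
        // addshadow regenerates the shadow caster pass using the displaced vertices
        #pragma surface surf Standard fullforwardshadows addshadow vertex:vert
        #pragma target 3.0

        sampler2D _MainTex;
        half _Glossiness;
        half _Metallic;
        fixed4 _Color;
        float _Amount;

        struct Input
        {
            float2 uv_MainTex;
        };

        // Vertex modifier: pushes each vertex along its normal direction
        void vert(inout appdata_base v)
        {
            v.vertex.xyz += v.normal * _Amount;
        }

        void surf(Input IN, inout SurfaceOutputStandard o)
        {
            fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
            o.Albedo = c.rgb;
            o.Metallic = _Metallic;
            o.Smoothness = _Glossiness;
            o.Alpha = c.a;
        }
        ENDCG
    }
    FallBack "Diffuse"
}
```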
❓ What is appdata_base?
We can see that we have added the vertex modifier function (vert), which takes as a parameter a structure called appdata_base. This structure is what stores the information about every single vertex of the 3D model. It contains not just the vertex position (v.vertex), but also other fields such as the normal direction (v.normal) and the texture information (v.texcoord) associated with it.
For certain applications, that is not enough and we might need other properties, such as the vertex colour (v.color) and the tangent direction (v.tangent). Vertex modifiers can be defined with a variety of other input structures, including appdata_tan and appdata_full, which provide more information at a small performance cost. You can read more about appdata (and its variants) on the Unity3D wiki.
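For reference, this is approximately how these structures are declared in Unity's UnityCG.cginc (paraphrased, with instancing macros omitted; the exact definition may vary between Unity versions):

```
struct appdata_base
{
    float4 vertex   : POSITION;   // vertex position in model space
    float3 normal   : NORMAL;     // normal direction
    float4 texcoord : TEXCOORD0;  // first UV channel
};

struct appdata_full
{
    float4 vertex    : POSITION;
    float4 tangent   : TANGENT;   // tangent direction
    float3 normal    : NORMAL;
    float4 texcoord  : TEXCOORD0;
    float4 texcoord1 : TEXCOORD1; // additional UV channels
    float4 texcoord2 : TEXCOORD2;
    float4 texcoord3 : TEXCOORD3;
    fixed4 color     : COLOR;     // vertex colour
};
```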
❓ How are values returned from vert?
The vertex function has no return value. If you are familiar with C#, you should also know that structures are passed by value, meaning that editing v.vertex would normally only affect a copy of v whose scope is limited to the function body.

However, v is also declared as inout, which means that it is used both as an input and as an output: any change we make to it is reflected in the actual variable that was passed to vert. The keywords inout and out are very often used in Cg, and they loosely correspond to ref and out in C#, respectively.
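As a quick illustration, in the hypothetical helper below the change to position is visible to the caller only because of the inout qualifier:

```
// Hypothetical helper, purely to illustrate inout semantics in Cg.
void raiseVertex(inout float3 position, float amount)
{
    // Without inout, this would modify a local copy
    // and the caller would never see the change.
    position.y += amount;
}
```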
Normal Extrusion With Textures
The code we have used in the section above works correctly, but is far from the effect we want to achieve. The reason is that we do not want to extrude all vertices by the same amount: we want the surface of our 3D model to match the valleys and peaks of the geographical region it represents.

Firstly, we need to somehow store and retrieve the information of how raised each point on the map is. We want, in a nutshell, the extrusion to be modulated by a texture which encodes the heights of our landscape. Such textures are often referred to as heightmaps, although it is not uncommon to see them called depthmaps, based on the context.

Once the height information is available, we can modulate the extrusion of a flat plane based on the heightmap. As seen in the diagram below, this allows controlling which areas will be raised and which ones will be lowered.

It is relatively easy to find a satellite image of the geographical area of your interest, and its associated heightmap. Below, you can see a satellite map of Mars (left) and its heightmap (right), which have been used in this tutorial:
I have covered the concept of depthmaps extensively in another series, titled Inside Facebook 3D Photos: Parallax Shaders.
For this tutorial, we will assume that the heightmap is stored as a grayscale image, in which black and white correspond to the lowest and highest altitudes, respectively. We also need these values to be scaled linearly, meaning that a given difference in colour corresponds to the same difference in height, regardless of where on the grayscale it occurs. When it comes to depthmaps, this is not always the case, since many of them store the depth information on a logarithmic scale.
Sampling a texture requires two pieces of information: the texture itself, and the UV coordinates of the point we want to sample. The latter can be accessed through the field texcoord stored in the appdata_base structure: that is the UV coordinate associated with the vertex currently being processed. Sampling textures in a surface function is done using tex2D, although tex2Dlod is required when we are in a vertex function.
In the snippet below, a texture called _HeightMap is used to modulate the amount of extrusion performed on each vertex:
```
sampler2D _HeightMap;

...

void vert(inout appdata_base v)
{
    fixed height = tex2Dlod(_HeightMap, float4(v.texcoord.xy, 0, 0)).r;
    v.vertex.xyz += v.normal * height * _Amount;
}
```
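For the snippet above to work, _HeightMap (like _Amount) also needs to be exposed in the shader's Properties block, so that a texture can be assigned from the Inspector. A minimal sketch, with display names and default values of my own choosing:

```
Properties
{
    _MainTex ("Albedo (RGB)", 2D) = "white" {}
    _HeightMap ("Height Map", 2D) = "black" {}
    _Amount ("Extrusion Amount", Range(0, 1)) = 0.1
}
```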
❓ Why can’t tex2D be used in a vertex function?
If you look at the shader code that Unity generates for a Standard Surface Shader, you will notice that it already contains an example of how to sample textures. In particular, it samples the main texture (called _MainTex) in the surface function (called surf) using a built-in function called tex2D.
Indeed, tex2D is the right function to sample pixels from a texture, whether it is used to store colours or heights. However, you might notice that tex2D cannot be used within a vertex function.
The reason is that tex2D does not simply read pixels from a texture. It also decides which version of the texture to use, based on the distance from the camera. Loosely speaking, this is known as mipmapping, and it allows smaller versions of the same texture to be used automatically at different distances.

In the surface function, the shader already knows which mipmap to use. That information might not yet be available in a vertex function, which is why tex2D cannot be used reliably there. The function tex2Dlod, however, allows for two extra parameters which, for the purpose of this tutorial, can be set to zero.
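Concretely, here is how the sampling coordinate is interpreted in the snippet above (for a 2D texture):

```
// tex2Dlod takes a float4 coordinate:
//   xy = the UV coordinates to sample,
//   z  = only relevant for 3D textures (zero here),
//   w  = the mipmap level to use (zero = full-resolution texture).
fixed height = tex2Dlod(_HeightMap, float4(v.texcoord.xy, 0, 0)).r;
```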
The result can be seen quite clearly below:
There is one small simplification that can be made in our case. The code seen so far is designed to work on any geometry; however, since what we really want is to use this effect on a flat plane, we can assume that our surface is completely flat.
Consequently, we can remove v.normal and replace it with float3(0, 1, 0):
```
void vert(inout appdata_base v)
{
    float3 normal = float3(0, 1, 0);
    fixed height = tex2Dlod(_HeightMap, float4(v.texcoord.xy, 0, 0)).r;
    v.vertex.xyz += normal * height * _Amount;
}
```
This was possible because all coordinates in appdata_base are stored in model space, meaning that they are relative to the centre and orientation of the 3D model. Translating, rotating and scaling an object using its transform in Unity changes the position, rotation and scale of the object, but leaves its original 3D model unaffected.
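As a side note, if a world-space position were ever needed inside vert, it could be reconstructed explicitly. This is not required for the effect described here, but the sketch below shows the standard way to do it in Unity's Cg:

```
// Model space -> world space, using Unity's built-in matrix.
// Not needed for this effect; shown only to clarify the distinction.
float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
```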
What’s Next…
In the next part of this online course, we will explore how to implement a scrolling effect, so that we can actually move the geometry around.
- Part 1: Interactive Map Shader: Vertex Displacement
- Part 2: Interactive Map Shader: Scrolling Effect
- Part 3: Interactive Map Shader: Terrain Shading
Unity Package Download
The full package for this tutorial is available on Patreon, and it includes all the assets necessary to reproduce the technique here presented.