
How to generate Gaussian distributed numbers

In a previous post I introduced the Gaussian distribution and how it commonly appears in the vast majority of natural phenomena. It can be used to dramatically improve several aspects of your game, such as procedural terrain generation, enemy health and attack power, etc. Despite being so ubiquitous, very few gaming frameworks offer functions to generate numbers that follow such a distribution. Unity developers, for instance, rely heavily on Random.Range, which generates uniformly distributed numbers (in blue). This post will show how to generate Gaussian distributed numbers (in red) in C#.

[Figure: uniformly distributed numbers (blue) versus Gaussian distributed numbers (red)]

I’ll be explaining the Maths behind it, but there is no need to understand it to use the function correctly. You can download the RandomGaussian Unity script here.

Step 1: From Gaussian to uniform

Many gaming frameworks only include functions to generate continuous, uniformly distributed numbers. In the case of Unity3D, for instance, we have Random.Range(min, max), which samples a random number between min and max. The problem is how to create a Gaussian distributed variable out of a uniformly distributed one.

Sample two Gaussian distributed values

Let’s imagine we already have two independent, normally distributed variables:

    \[X \sim \mathcal{N} \left(0,1 \right)\]

    \[Y \sim \mathcal{N} \left(0,1\right)\]

from which we sampled two values, x and y, respectively. Sampling several of these points in the Cartesian plane will generate a roundly shaped cloud centred at (0,0).

Calculate their joint probability

The probability of sampling a certain pair (x,y) is defined as the probability of sampling x from X, times the probability of sampling y from Y. This is called the joint probability and, since the samplings from X and Y are independent of each other:

    \[P(x,y) = P(x)P(y) =\]

    \[ =\frac{1}{\sqrt{2\pi }}e^{-{\frac{x^2}{2}}}\frac{1}{\sqrt{2\pi }}e^{-{\frac{y^2}{2}}} =\frac{1}{2\pi}e^{-{\frac{x^2+y^2}{2}}}\]
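As a quick numerical sanity check (a Python sketch, separate from the Unity script this post builds), the product of the two one-dimensional densities matches the joint expression for any point:

```python
import math

def normal_pdf(t):
    # Standard normal density: (1/sqrt(2*pi)) * exp(-t^2/2)
    return math.exp(-t * t / 2) / math.sqrt(2 * math.pi)

def joint_pdf(x, y):
    # Joint density written directly: (1/(2*pi)) * exp(-(x^2+y^2)/2)
    return math.exp(-(x * x + y * y) / 2) / (2 * math.pi)

# The two expressions agree, up to floating-point rounding, for any (x, y)
for x, y in [(0.0, 0.0), (1.0, -0.5), (2.3, 1.7)]:
    assert abs(normal_pdf(x) * normal_pdf(y) - joint_pdf(x, y)) < 1e-12
```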

Switch to polar coordinates

The point (x,y) can be represented in the plane using polar coordinates, with an angle \theta and its distance from the origin R:


    \[R=\sqrt{x^2+y^2}\]

    \[\theta=\arctan\left(\frac{y}{x}\right)\]

Now the original point (x,y) can be expressed as a function of R and \theta:

    \[x = R\cos\left(\theta\right)\]

    \[y = R\sin\left(\theta\right)\]
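This change of coordinates is invertible: converting a point to polar form and back recovers it exactly. A small Python sketch (note that, in code, the quadrant-aware atan2 is used in place of a plain arctangent, which alone cannot distinguish opposite quadrants):

```python
import math

x, y = -1.2, 0.8
R = math.hypot(x, y)       # sqrt(x^2 + y^2)
theta = math.atan2(y, x)   # arctan(y/x), but aware of the correct quadrant

# Going back to Cartesian coordinates recovers the original point
assert abs(R * math.cos(theta) - x) < 1e-12
assert abs(R * math.sin(theta) - y) < 1e-12
```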

Rewrite the joint probability

We can now rewrite the joint probability P(x,y):

    \[\frac{1}{2\pi }e^{-{\frac{x^2+y^2}{2}}}=\frac{1}{2\pi}e^{-{\frac{R^2}{2}}}=\left (\frac{1}{2\pi }  \right )\left (e^{-{\frac{R^2}{2}}}  \right )\]

which is the product of the two probability distributions:

    \[R^2 \sim Exp\left(\frac{1}{2}\right)\]

    \[\theta \sim Unif(0,2\pi) = 2\pi Unif(0,1)\]

Expanding the exponential distribution

While \theta is already in a form that can be expressed using a uniformly distributed variable, a little more work is necessary for R^2. An exponentially distributed variable can be obtained from a uniformly distributed one via inverse transform sampling:

    \[Exp(\lambda) \sim \frac{-\ln\left(Unif(0,1)\right)}{\lambda}\]

Since R^2 \sim Exp\left(\frac{1}{2}\right), it follows that:

    \[R \sim \sqrt{-2\ln\left( Unif(0,1)\right)}\]

Now both \theta and R, which are coordinates of a point generated from two independent Gaussian distributions, can be expressed using two uniformly distributed values.
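The inverse transform step can be checked empirically. A Monte Carlo sketch in Python: if R^2 = -2 ln(U) really follows Exp(1/2), its mean should be 1/λ = 2.

```python
import math
import random

random.seed(0)
N = 200_000
# Sample R^2 = -2 ln(U); the guard avoids the (vanishingly rare) log(0)
samples = [-2.0 * math.log(random.random() or 1e-12) for _ in range(N)]
mean = sum(samples) / N

# The mean of Exp(1/2) is 1 / (1/2) = 2
assert abs(mean - 2.0) < 0.05
```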

Step 2: From uniform to Gaussian

We can now reverse the procedure done in Step 1 to derive a simple algorithm:

  1. Generate two random numbers u_1,u_2 \sim Unif(0,1)
  2. Use them to create the radius R = \sqrt{-2\ln\left(u_1\right)} and the angle \theta = 2\pi u_2
  3. Convert (R, \theta) from polar to Cartesian coordinates: \left(R\cos\theta, R\sin\theta\right)
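The three steps above can be sketched in Python (a standalone illustration, not the Unity script this post provides). Both resulting coordinates are standard normal; the sketch keeps one and verifies its mean and variance:

```python
import math
import random

def box_muller(rng):
    # Step 1: two independent uniform samples in (0, 1)
    u1 = 1.0 - rng.random()   # shifted into (0, 1] to avoid log(0)
    u2 = rng.random()
    # Step 2: radius and angle
    r = math.sqrt(-2.0 * math.log(u1))
    theta = 2.0 * math.pi * u2
    # Step 3: back to Cartesian; both coordinates are N(0, 1)
    return r * math.cos(theta), r * math.sin(theta)

rng = random.Random(42)
xs = [box_muller(rng)[0] for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
assert abs(mean) < 0.02 and abs(var - 1.0) < 0.05
```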

This is known as the Box-Muller transform. The image below (from Wikipedia) shows how the uniformly distributed points from the unit square are re-mapped by the Box-Muller transform onto the Cartesian plane, in a Gaussian fashion.

[Figure: Box-Muller transform visualisation, from Wikipedia]

Step 3: The Marsaglia polar method

The Box-Muller transform has a problem: it uses trigonometric functions, which are notoriously slow. To avoid them, a slightly different technique exists, called the Marsaglia polar method. Despite being similar, it starts from a point uniformly distributed in the square (-1,+1)\times(-1,+1). The point must fall inside the unit circle and must not be the origin (0,0); if it doesn’t, another one has to be chosen.

public static float NextGaussian() {
    float v1, v2, s;
    do {
        // Pick a point uniformly distributed in the square (-1,+1) x (-1,+1)
        v1 = 2.0f * Random.Range(0f, 1f) - 1.0f;
        v2 = 2.0f * Random.Range(0f, 1f) - 1.0f;
        s = v1 * v1 + v2 * v2;
        // Reject it if it falls outside the unit circle, or if it is the origin
    } while (s >= 1.0f || s == 0f);

    s = Mathf.Sqrt((-2.0f * Mathf.Log(s)) / s);

    // v2 * s is a second, independent Gaussian sample that could also be returned
    return v1 * s;
}

Approximately 21% of the candidate points will be rejected with this method, since the unit circle covers only \pi/4 \approx 78.5\% of the square.
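That figure can be verified empirically. A quick Monte Carlo sketch in Python (not part of the Unity script) counts how often the rejection test in the loop above fires:

```python
import math
import random

# The unit circle covers pi/4 of the (-1,+1) x (-1,+1) square, so roughly
# 1 - pi/4 ~ 21.5% of the candidate points should be rejected
rng = random.Random(7)
trials = 100_000
rejected = 0
for _ in range(trials):
    v1 = 2.0 * rng.random() - 1.0
    v2 = 2.0 * rng.random() - 1.0
    s = v1 * v1 + v2 * v2
    if s >= 1.0 or s == 0.0:
        rejected += 1
rate = rejected / trials
assert abs(rate - (1.0 - math.pi / 4.0)) < 0.01
```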

Step 4: Mapping to arbitrary Gaussian curves

The algorithm described in Step 3 provides a way to sample from \mathcal{N} \left(0,1 \right). We can transform that into any arbitrary \mathcal{N} \left(\mu,\sigma^2 \right) like this:

public static float NextGaussian(float mean, float standard_deviation)
{
    return mean + NextGaussian() * standard_deviation;
}
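The mapping works because adding a constant shifts the mean, while multiplying by a constant scales the standard deviation. A Python sketch checks this numerically (using Python’s own rng.gauss for the standard-normal samples, purely for brevity):

```python
import random

rng = random.Random(3)
mean_target, std_target = 5.0, 2.0

# Map standard-normal samples z ~ N(0, 1) to x = mean + z * std ~ N(5, 4)
zs = [rng.gauss(0.0, 1.0) for _ in range(100_000)]
xs = [mean_target + z * std_target for z in zs]

m = sum(xs) / len(xs)
v = sum((x - m) ** 2 for x in xs) / len(xs)
assert abs(m - mean_target) < 0.05 and abs(v - std_target ** 2) < 0.1
```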

There is yet another problem: Gaussian distributions have the nasty habit of generating numbers that can be quite far from the mean. However, clamping a Gaussian variable between a min and a max can have quite catastrophic results. The risk is squashing the left and right tails, producing a rather bizarre distribution with three very likely values: the mean, the min and the max. The most common technique to avoid this is to take another sample if the current one falls outside the range:

public static float NextGaussian(float mean, float standard_deviation, float min, float max) {
    float x;
    do {
        // Keep sampling until the value falls inside [min, max]
        x = NextGaussian(mean, standard_deviation);
    } while (x < min || x > max);
    return x;
}

Another solution changes the parameters of the curve so that min and max lie 3.5 standard deviations away from the mean (an interval which contains more than 99.9% of the sampled points).
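That remapping can be sketched as follows (a Python illustration; the helper name gaussian_params_from_range is my own, not part of the original script):

```python
def gaussian_params_from_range(min_value, max_value, k=3.5):
    # Centre the curve between min and max, and choose sigma so that
    # min and max sit k standard deviations away from the mean
    mean = (min_value + max_value) / 2.0
    std = (max_value - min_value) / (2.0 * k)
    return mean, std

# For the range [0, 14]: mean = 7, and 3.5 standard deviations of 2 reach each end
mean, std = gaussian_params_from_range(0.0, 14.0)
assert mean == 7.0 and std == 2.0
```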

Conclusion

This tutorial shows how to sample Gaussian distributed numbers starting from uniformly distributed ones. You can download the complete RandomGaussian script for Unity here.


📝 Licensing

You are free to use, adapt and build upon this tutorial for your own projects (even commercially) as long as you credit me.

You are not allowed to redistribute the content of this tutorial on other platforms, especially the parts that are only available on Patreon.

If the knowledge you have gained had a significant impact on your project, a mention in the credit would be very appreciated. ❤️🧔🏻


  1. I would be interested to learn more about other random distributions – would you ever consider writing a blog about different distributions that exist with the maths involved, and their uses in games of course- i think that would be quite interesting.

    • Hey! This is a very interesting topic, and I do have a few more Maths-related posts in the making!
      However, I don’t have a precise timeline for them yet!
