
How to generate Gaussian distributed numbers


In a previous post I’ve introduced the Gaussian distribution and how it is commonly found in the vast majority of natural phenomena. It can be used to dramatically improve some aspects of your game, such as procedural terrain generation, enemy health and attack power, etc. Despite being so ubiquitous, very few gaming frameworks offer functions to generate numbers which follow such a distribution. Unity developers, for instance, rely heavily on Random.Range, which generates uniformly distributed numbers (in blue). This post will show how to generate Gaussian distributed numbers (in red) in C#.


I’ll be explaining the Maths behind it, but there is no need to understand it to use the function correctly. You can download the RandomGaussian Unity script here.

Step 1: From Gaussian to uniform

Many gaming frameworks only include functions to generate continuous uniformly distributed numbers. In the case of Unity3D, for instance, we have Random.Range(min, max), which samples a random number between min and max. The problem is to create a Gaussian distributed variable out of a uniformly distributed one.

Sample two Gaussian distributed values

Let’s imagine we already have two independent, normally distributed variables:

    \[X \sim \mathcal{N} \left(0,1 \right)\]

    \[Y \sim \mathcal{N} \left(0,1\right)\]

from which we sampled two values, x and y, respectively. Sampling several of these points in the Cartesian plane will generate a roundly shaped cloud centred at (0,0).

Calculate their joint probability

The probability of obtaining a certain (x,y) is defined as the probability of sampling x from X, times the probability of sampling y from Y. This is called the joint probability and, since samplings from X and Y are independent from each other:

    \[P(x,y) = P(x)P(y) =\]

    \[ =\frac{1}{\sqrt{2\pi }}e^{-{\frac{x^2}{2}}}\frac{1}{\sqrt{2\pi }}e^{-{\frac{y^2}{2}}} =\frac{1}{2\pi}e^{-{\frac{x^2+y^2}{2}}}\]

Switch to polar coordinates

The point (x,y) can be represented in the plane using polar coordinates, with an angle \theta and its distance from the origin R:

    \[R=\sqrt{x^2+y^2}\]

    \[\theta=\arctan\left (\frac{y}{x}\right )\]

Now the original point (x,y) can be expressed as a function of R and \theta:

    \[x = R \cos\left(\theta\right)\]

    \[y = R \sin\left(\theta\right)\]

Rewrite the joint probability

We can now rewrite the joint probability P(x,y):

    \[\frac{1}{2\pi }e^{-{\frac{x^2+y^2}{2}}}=\frac{1}{2\pi}e^{-{\frac{R^2}{2}}}=\left (\frac{1}{2\pi }  \right )\left (e^{-{\frac{R^2}{2}}}  \right )\]

which is the product of the two probability distributions:

    \[R^2 \sim Exp\left(\frac{1}{2}\right)\]

    \[\theta \sim Unif(0,2\pi) = 2\pi Unif(0,1)\]

Expanding the exponential distribution

While \theta is already in a form which can be expressed using a uniformly distributed variable, a little more work is necessary for R^2. Remembering that the exponential distribution can be sampled by inverting its cumulative distribution function, a value from Exp\left(\frac{1}{2}\right) can be generated as -2\log of a uniform value:

    \[R^2 \sim -2\log\left( Unif(0,1)\right)\]

    \[R \sim \sqrt{-2\log\left( Unif(0,1)\right)}\]

Now both \theta and R, which are coordinates of a point generated from two independent Gaussian distributions, can be expressed using two uniformly distributed values.
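As a quick numerical sanity check (mine, not part of the original derivation), we can verify that -2\log of a uniform value really behaves like Exp\left(\frac{1}{2}\right), whose mean is 1/\lambda = 2. Plain C# with System.Random stands in for Unity here:

```csharp
using System;

static class ExpCheck
{
    // Draws 'samples' values of -2 log(u), with u ~ Unif(0,1),
    // and returns their empirical mean. Exp(1/2) has mean 1/lambda = 2.
    public static double EmpiricalMean(int seed, int samples)
    {
        var rng = new Random(seed);
        double sum = 0.0;
        for (int i = 0; i < samples; i++)
        {
            double u = 1.0 - rng.NextDouble();   // in (0,1], avoids Log(0)
            sum += -2.0 * Math.Log(u);
        }
        return sum / samples;
    }
}
```

With a million samples the empirical mean should land very close to 2, matching R^2 \sim Exp\left(\frac{1}{2}\right).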

Step 2: From uniform to Gaussian

We can now reverse the procedure done in Step 1 to derive a simple algorithm:

  1. Generate two random numbers u_1,u_2 \sim Unif(0,1)
  2. Use them to create the radius R = \sqrt{-2\log\left(u_1\right)} and the angle \theta = 2\pi u_2
  3. Convert (R, \theta) from polar to Cartesian coordinates: (R \cos\theta, R \sin\theta)

This is known as the Box-Muller transform. The image below (from Wikipedia) shows how the uniformly distributed points from the unit square are re-mapped by the Box-Muller transform onto the Cartesian plane, in a Gaussian fashion.
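The three steps above can be sketched in plain C#. This is a minimal sketch, not the downloadable script: System.Random stands in for Unity's Random.value, and the method name is my own.

```csharp
using System;

static class BoxMuller
{
    // Returns a pair of independent samples from N(0,1)
    // using the Box-Muller transform.
    public static (double, double) NextPair(Random rng)
    {
        double u1 = 1.0 - rng.NextDouble();          // in (0,1], avoids Log(0)
        double u2 = rng.NextDouble();
        double r = Math.Sqrt(-2.0 * Math.Log(u1));   // radius
        double theta = 2.0 * Math.PI * u2;           // angle
        return (r * Math.Cos(theta), r * Math.Sin(theta));
    }
}
```

Note that each call conveniently yields two independent Gaussian values; a real implementation would typically cache the second one for the next call.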


Step 3: The Marsaglia polar method

The Box-Muller transform has a problem: it uses trigonometric functions, which are notoriously slow. To avoid them, a slightly different technique exists, called the Marsaglia polar method. Despite being similar, it starts from a point uniformly distributed in the square (-1,+1)\times(-1,+1). This point must fall within the unit circle and must not be the origin (0,0); if it doesn’t, another one has to be chosen.

Approximately 21% of the points will be rejected with this method, since the unit circle covers only \pi/4 \approx 79\% of the square.
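The rejection loop just described can be sketched like this (again a sketch, with names of my own, using System.Random in place of Unity's Random.value):

```csharp
using System;

static class MarsagliaPolar
{
    // Returns one sample from N(0,1) using the Marsaglia polar method.
    public static double Next(Random rng)
    {
        double x, y, s;
        do
        {
            x = 2.0 * rng.NextDouble() - 1.0;   // Unif(-1,+1)
            y = 2.0 * rng.NextDouble() - 1.0;
            s = x * x + y * y;
        }
        while (s >= 1.0 || s == 0.0);           // reject ~21% of the points
        // No trigonometry needed: x/sqrt(s) and y/sqrt(s) already play
        // the role of cos(theta) and sin(theta).
        return x * Math.Sqrt(-2.0 * Math.Log(s) / s);
    }
}
```

As with Box-Muller, y times the same scaling factor yields a second independent Gaussian value for free.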

Step 4: Mapping to arbitrary Gaussian curves

The algorithm described in Step 3 provides a way to sample from \mathcal{N} \left(0,1 \right). We can transform a sample z from it into a sample from any arbitrary \mathcal{N} \left(\mu,\sigma^2 \right) like this:

    \[x = \mu + \sigma z\]

There is yet another problem: Gaussian distributions have the nasty habit of generating numbers which can be quite far from the mean. However, clamping a Gaussian variable between a min and a max can have quite catastrophic results. The risk is squashing the left and right tails, producing a rather bizarre function with three very likely values: the mean, the min and the max. The most common technique to avoid this is to take another sample whenever a value falls outside its range.

Another solution changes the parameters of the curve so that min and max sit 3.5 standard deviations away from the mean (a range which contains more than 99.9% of the sampled points).
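Putting Steps 3 and 4 together, here is a hedged sketch of a range-limited Gaussian sampler using the resampling strategy (the class and method names are my own, not necessarily those of the downloadable RandomGaussian script):

```csharp
using System;

static class RandomGaussianSketch
{
    // Standard normal via the Marsaglia polar method (see Step 3).
    static double NextStandard(Random rng)
    {
        double x, y, s;
        do
        {
            x = 2.0 * rng.NextDouble() - 1.0;
            y = 2.0 * rng.NextDouble() - 1.0;
            s = x * x + y * y;
        } while (s >= 1.0 || s == 0.0);
        return x * Math.Sqrt(-2.0 * Math.Log(s) / s);
    }

    // Samples from N(mean, stdDev^2), resampling (not clamping!)
    // until the value falls inside [min, max], as discussed above.
    public static double Range(Random rng, double mean, double stdDev,
                               double min, double max)
    {
        double value;
        do
        {
            value = mean + stdDev * NextStandard(rng);   // x = mu + sigma * z
        } while (value < min || value > max);
        return value;
    }
}
```

Beware that a [min, max] range placed far into a tail makes the resampling loop very slow; keeping the bounds a few standard deviations around the mean avoids that.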


This tutorial shows how to sample Gaussian distributed numbers starting from uniformly distributed ones. You can download the complete RandomGaussian script for Unity here.



