
Arrays & Shaders in Unity 5.4+


This post shows how to use arrays with shaders in Unity 5.4+. Back in January I covered this topic in an article called Arrays & shaders: Heatmaps in Unity. My original approach relied on an undocumented feature that allowed developers to pass arrays to shaders. Since then, Unity 5.4 has introduced proper support in its API, and this tutorial replaces the previous article. If you have read the previous tutorial, you do not need to change your shader code and you can skip to Step 2.

Step 1. The Shader

Every shader has a section called Properties, which exposes certain variables in the Material Inspector. At the time of writing, Unity does not support an array type there. Consequently, you cannot access arrays directly from the inspector. It's worth noticing that Unity supports a type called 2DArray, but that is reserved for texture arrays; what we want is an array of numbers instead.

All arrays have to be declared as variables and initialised externally via a script. Arrays in shaders need to have a predefined length. If you don't know in advance how many items you'll need to store, allocate more space than you need and keep a variable (for instance, _ArrayLength) that indicates how many items are actually present.
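A minimal sketch of such a declaration, assuming a float array called _Array with room for 100 items (both names are illustrative):

```c
// Inside the CGPROGRAM block, not in the Properties section:
// a fixed-size array, plus a counter for the items actually in use
uniform float _Array[100];
uniform int _ArrayLength;
```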

In the example above, both variables have been decorated with the uniform keyword. This is because their values are changed from an external script, and we assume those changes do not happen in between frames.

You can now access your array in the shader code like any other array:
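For example, a function in the shader might sum up the entries actually in use (a sketch, assuming the _Array/_ArrayLength pair is declared as uniforms):

```c
// _ArrayLength acts as the logical size; the rest of _Array is ignored
float sum = 0;
for (int i = 0; i < _ArrayLength; i++)
    sum += _Array[i];
```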

Step 2. The Script

If you want to use your shader, you need to initialise the array using an external script. The new API of Unity 5.4+ provides the following methods: SetFloatArray, SetMatrixArray and SetVectorArray. As expected, they are used to initialise arrays of float, Matrix4x4 and Vector4, respectively. Here is a snippet showing how to use those functions correctly:
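A sketch of what this might look like, assuming the shader declares a float array named _Array and a counter named _ArrayLength:

```csharp
// Fill a C# array and pass it to the shader in one call
float[] array = new float[] { 1f, 2f, 3f, 4f, 5f };
material.SetFloatArray("_Array", array);
material.SetInt("_ArrayLength", array.Length);
```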

where material is the Unity material that uses your shader. You can drag it directly from the inspector, or retrieve it via code:
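For instance, a script attached to the same GameObject could retrieve it like this:

```csharp
// Fetch the material from the Renderer on this GameObject.
// Note: accessing .material creates a per-object copy of the material.
Material material = GetComponent<Renderer>().material;
```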

Unity 5.4 also supports global arrays. These are properties that are set once and are then shared by all shaders. They work in a similar way, with the signatures SetGlobalFloatArray, SetGlobalMatrixArray and SetGlobalVectorArray. However, they are static methods of the Shader class.
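A sketch of the global variant, reusing the illustrative _Array name:

```csharp
// Sets _Array for every shader that declares it, not just one material
Shader.SetGlobalFloatArray("_Array", array);
```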

Step 3. Limitations

If you need to pass other types of arrays (such as int, long, Vector3, …) you have to use the method that most closely matches your needs. For instance, you can fit int values in an array of floats. Similarly, if you want to provide Vector3s to your shader, you'll need to wrap them into Vector4s. You can assign a single Vector3 to a Vector4, as Unity will automatically convert it, leaving the last coordinate set to zero. However, you cannot assign a Vector3[] to a Vector4[].
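As a sketch, the wrapping can be done element by element, relying on Unity's implicit Vector3-to-Vector4 conversion ("_Points" is an illustrative property name):

```csharp
// Wrap a Vector3[] into a Vector4[] before passing it to the shader
Vector4[] wrapped = new Vector4[points.Length];
for (int i = 0; i < points.Length; i++)
    wrapped[i] = points[i]; // implicit conversion; w is set to 0
material.SetVectorArray("_Points", wrapped);
```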

The second consideration you have to keep in mind involves some poor design choices made by Unity. It seems that the first time you set an array (whether locally or globally), Unity fixes the size of the array itself. For instance, if you initialise an array defined in the shader as uniform float _Array[10]; with a C# array defined as float[] array = new float[5];, you will not be able to set more than 5 elements in your array. Whether this is a bug or a feature, it makes for some very nasty bugs. Until this is corrected, I advise you to initialise your arrays with the maximum size allowed, directly in the Awake function of a script:
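A minimal sketch of such an initialisation, assuming the shader declares uniform float _Array[100]; (the class name is illustrative):

```csharp
using UnityEngine;

public class ArrayInitialiser : MonoBehaviour
{
    public Material material;

    void Awake()
    {
        // Initialise with the maximum size declared in the shader (100 here),
        // so Unity locks the array at its full capacity
        material.SetFloatArray("_Array", new float[100]);
    }
}
```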

Some users reported that once the arrays have been initialised, you need to restart the editor to be able to reset their size. Well, now you’ve been warned…

📧 Stay updated

A new tutorial is released every week.

💖 Support this blog

This website exists thanks to the contributions of patrons on Patreon. If you think these posts have either helped or inspired you, please consider supporting this blog.





  1. I’m working with a somewhat large dataset.
    At first I noticed there’s an array size limit in Cg (it’s mentioned on the NVIDIA website, but I didn’t see the exact size limit) – could you elaborate on this subject?
    I’m able to have an array size of 2000 without compilation errors.
    But it seems I’m unable to read values of this array at places greater than 1023.
    Any help on this subject would be appreciated. Thanks for the great tutorials.

    • Hey!
      Yes, the size of arrays in shaders is definitely limited.
      I don’t think this is something you can change from Unity, unfortunately.
      However, you could split your render mesh into multiple quads, each one with a different material. You can check which points end up in which quad, and assign only those.
      As long as no quad has more than 1023 points in it, it should work!

  2. I copied your example but it does not seem to work. The array values are always just 0, even though I attempt to set them every "OnRenderImage".

    I was wondering, how do the variables look in the shader itself? Are you just declaring them in the SubShader, or are you using the "Properties" section? I didn’t think you’d have to, since you don’t with regular floats.

    Also, in regards to your final point about fixed array size, isn’t that just the way shaders work? I thought you could never dynamically allocate memory in a shader, only reinstantiate. Which is why you can’t use lists in a shader.

    But I’m a total noob so I could be recalling that incorrectly.

    Anyway, thanks a lot for the help.

    • Alright, just got it working by adding a second Color array, which actually is a Vector4 array, and copying all the Color elements into it as Vector4s on creation. I also removed the SetVector lines in each loop iteration, as that didn’t work with Unity 5.6. Instead, I’m setting the whole array at once at the end of the loop with SetVectorArray, providing both needed arrays:

      material.SetVectorArray("_Points", points);
      material.SetVectorArray("_Colors", coloursAsV4); // this is the Vector4 array I created; just copy the elements from the other Color array

  3. Hi Alan,

    Great work, very helpful!

    The shader works perfectly in the Editor; however, instead of a circular heatmap data visualization I see triangles on my Android device (Galaxy S6).

    I’ve tried changing the Graphics and Quality settings in Unity and enabled 32 bit display buffer in Build Settings, but the shader is still rendering triangles instead of circles. Any guidance you could give would be greatly appreciated. Thanks again!

    • Hey!
      This is really bizarre! I have not experienced this error.
      If you change your shader, can you get the entire quad to display correctly?
      Or does it still only draw triangles?

      • Thanks a lot for the quick response!

        Hmm, I’m not sure if I understand your question, so sorry if my reply doesn’t make sense. The entire quad displays correctly in the Unity Editor but nothing I’ve attempted fixes the triangle issue on my Android device.

        I’ve tried many different values for the Properties (radius & intensity), but the shader continues to render triangles. I’ve also tried various edits to the Shader code, like adding:
        #pragma fragmentoption ARB_precision_hint_nicest
        #include "UnityCG.cginc"

        But no luck… I wonder if changing either of these lines would work?
        half di = distance(vertOut.worldPos, _Points[i].xyz);
        half ri = _Properties[i].x;

        I know nothing about coding shaders, so I’m not sure if there are any edits that could be made to force it to render circles rather than triangles.

        Thanks again!

        • This is something that is very hard to debug, especially since I can’t debug it on an Android device at the moment.

          I want to understand where the problem is. It could be a problem with how arrays and shaders work on Android, or it could be a problem with floating-point precision on Android. It’s hard to tell!

          The best way to find the problem is to progressively strip the shader of its code, to find the minimum bit that works. If you change the shader so that it draws a solid colour, does it work? If you then re-enable the shader piece by piece, you can identify at which point it breaks.

  4. Here is an updated version that works with Unity 2017.3:

    // Alan Zucconi
    using UnityEngine;
    using System.Collections;

    public class Heatmap : MonoBehaviour
    {
        public Vector4[] positions;
        public float[] radiuses;
        public float[] intensities;
        Vector4[] properties;

        public Material material;

        public int count = 50;

        void Start ()
        {
            positions = new Vector4[count];
            radiuses = new float[count];
            intensities = new float[count];
            properties = new Vector4[count];

            for (int i = 0; i < positions.Length; i++)
            {
                positions[i] = new Vector4(Random.Range(-0.4f, +0.4f), Random.Range(-0.4f, +0.4f));
                radiuses[i] = Random.Range(0f, 0.25f);
                intensities[i] = Random.Range(-0.25f, 1f);
            }
        }

        void Update()
        {
            material.SetInt("_Points_Length", positions.Length);
            for (int i = 0; i < positions.Length; i++)
            {
                positions[i] += new Vector4(Random.Range(-0.1f, +0.1f), Random.Range(-0.1f, +0.1f), 0) * Time.deltaTime;
                properties[i] = new Vector4(radiuses[i], intensities[i], 0, 0);
            }

            material.SetVectorArray("_Points", positions);
            material.SetVectorArray("_Properties", properties);
        }
    }

  5. Hello, I am working on a FOVE (eye-tracking) HMD and I want to generate a heatmap using your technique. I have passed the rays from an eye to the shader as positions, but the problem is that every time I do it, I don’t really get anything. If you could please assist me with this, I would really appreciate it.
    Here is my code.

    using System.Collections;
    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.UI;

    public class EyesTrack : MonoBehaviour
    {
        public FoveInterface fove;
        Vector4[] pos;
        Vector4[] properties;
        public float[] radiuses;
        public float[] intensities;

        public Material material;

        // Use this for initialization
        void Start ()
        {
            int obj1 = (int) GetComponent ().bounds.size.magnitude;
            print ("magniture" + obj1);
            pos = new Vector4[obj1];
            radiuses = new float[obj1];
            intensities = new float[obj1];
            properties = new Vector4[obj1];
        }

        void Update ()
        {
            // position of the object
            material.SetInt("_Points_Length", pos.Length);
            FoveInterfaceBase.EyeRays eyeRay = fove.GetGazeRays ();
            Ray rays = new Ray ();
            rays = eyeRay.left;
            RaycastHit hit;
            if (Physics.Raycast (rays, out hit, Mathf.Infinity))
            {
                for (int i = 0; i < pos.Length; i++)
                {
                    pos [i] = (Vector4)hit.point * Time.deltaTime;
                    properties [i] = new Vector4 (radiuses [i], intensities [i], 0, 0);
                    print ("pos at the update " + pos [i]);
                }
                print ("hit points" + hit.point);
            }

            material.SetVectorArray("_Points", pos);
            material.SetVectorArray("_Properties", properties);
        }
    }
    • Forgot to mention: your example is working very well for me, and I am using Unity 2017.2.
      Thank you!

