To make my own rays, I have deconstructed the rays from the mobile temple demo. These are a bit less complicated than the blueprints found in the office demo.
The material is split into emissive colour, opacity, a vertex shader that stretches the model in the light direction, and a custom set of UVs to go with it.
The emissive colour is made by creating a scalar parameter to control the contrast of the dust and linearly interpolating it with an inverted version of a cloudy texture. This is then clamped and multiplied by a colour vector parameter. Vertex colour and a scalar parameter for edge brightness are then added together, clamped, and multiplied with the other set of nodes.
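The emissive chain can be sketched as plain math. This is my own reading of the node order, and all the names (`dust_contrast`, `edge_brightness`, and so on) are illustrative, not taken from the demo:

```python
def lerp(a, b, t):
    """Linear interpolation, as in UE's Lerp node."""
    return a + (b - a) * t

def clamp(x, lo=0.0, hi=1.0):
    """Constrain x to [lo, hi], as in UE's Clamp node."""
    return min(max(x, lo), hi)

def emissive(cloud, vertex_colour, dust_contrast, colour, edge_brightness):
    # Lerp toward the inverted cloudy texture by the contrast parameter,
    # clamp, then tint with the colour vector parameter
    dust = clamp(lerp(cloud, 1.0 - cloud, dust_contrast)) * colour
    # Vertex colour plus edge brightness, clamped, brightens the edges
    edge = clamp(vertex_colour + edge_brightness)
    return dust * edge
```

Each function here stands in for one material node, so the call chain mirrors the wire order in the graph.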
The opacity is created by dividing the pixel depth by a parameter that decides how far from the camera the effect should start to fade. This is then multiplied by the inverted vertex colour and depth-faded with a scalar parameter.
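As a rough sketch of that opacity chain (my reading; the real DepthFade node softens the ray where it intersects scene geometry, which I approximate here with a simple scale):

```python
def clamp01(x):
    return min(max(x, 0.0), 1.0)

def opacity(pixel_depth, fade_distance, vertex_colour, depth_fade):
    # Fade in with distance: approaches 1 as the pixel nears fade_distance
    distance_fade = clamp01(pixel_depth / fade_distance)
    # Inverted vertex colour hides the stretched end of the ray
    base = distance_fade * (1.0 - vertex_colour)
    # Stand-in for the DepthFade node's scalar parameter (assumption)
    return base * depth_fade
```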
WorldPositionOffset – Vertex Shader
This moves the vertices along the sun direction. Vertex colour is multiplied by a direction vector parameter, which is in turn multiplied by a length scalar parameter.
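The offset itself is just a per-vertex multiply; a minimal sketch (parameter names are my own):

```python
def world_position_offset(vertex_colour, sun_direction, length):
    # Vertex colour masks which vertices move (0 = stay, 1 = full offset);
    # length scales how far they slide along the sun direction
    return [vertex_colour * d * length for d in sun_direction]
```

A vertex painted white travels the full length along the sun vector, while a black vertex stays put, which is what stretches the plane into a ray.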
Custom UVs – For Vertex Shader
A scalar parameter for the ratio is appended to a constant of 1, which is then multiplied by the scale value and divided by the texture coordinate. This is then panned to create the shimmering, downward-drifting look. The time slot in the panner is fed by time multiplied by a scalar parameter for speed.
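One plausible reading of that chain as math (I am taking "divided by the texture coordinate" literally, and the panner as a simple time-based offset on V; names are illustrative):

```python
def custom_uv(texcoord, ratio, scale, time, speed):
    u, v = texcoord
    # Append(1, ratio) -> two-channel value, multiplied by scale,
    # then divided by the texture coordinate, per the description
    stretched = ((1.0 * scale) / u, (ratio * scale) / v)
    # Panner: offset V over time so the rays appear to shimmer downward
    return (stretched[0], stretched[1] + time * speed)
```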
After determining all of this, there were a few concepts I did not understand or nodes that I had not used before, so I did a bit of research into these.
“Each vertex in a mesh can optionally store a RGB color value, called vertex color. These can be used for a variety of interesting effects and shader inputs”
I had to have a chat with a programmer to help get this into my head, but essentially it is a bit like skinning: it assigns a colour value to each vertex, and that value can then be used in calculations that manipulate the vertices.
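The idea can be shown with a toy data structure (purely illustrative, not engine API): each vertex stores an RGB value alongside its position, and a shader can read that value as a mask.

```python
# Each vertex carries its own RGB colour alongside its position
vertices = [
    {"pos": (0.0, 0.0, 0.0), "colour": (1.0, 1.0, 1.0)},  # painted white
    {"pos": (0.0, 0.0, 1.0), "colour": (0.0, 0.0, 0.0)},  # painted black
]

# A shader can sample the colour per vertex and use one channel as a mask,
# which is how the ray material decides which vertices to stretch
masks = [v["colour"][0] for v in vertices]
```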
“The Clamp expression takes in value(s) and constrains them to a specified range, defined by a minimum and maximum value. A minimum value of 0.0 and maximum value of 0.5 means that the resulting value(s) will never be less than 0.0 and never greater than 0.5. “
The clamp takes whatever you input and converts it to a value within a specified range. You would use this to stop a value drifting out of a useful range, for example to keep a brightness value from blowing out past 1.
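The node boils down to a min/max pair; a sketch using the 0.0–0.5 range from the quoted documentation:

```python
def clamp(value, minimum=0.0, maximum=0.5):
    # Constrain value to [minimum, maximum], as the Clamp node does
    return min(max(value, minimum), maximum)
```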
“The CustomizedUV pins pass through the texture coordinates (UVs) in the mesh by default. Then when you place a Texcoord node in a pixel shader input (like BaseColor), you are still getting the mesh’s texture coordinates. However, if you do something in CustomizedUV0 and then use Texcoord 0 in the BaseColor input, you will get the modified UV value. Note that Texture sample nodes use TexCoord 0 by default. “
Custom UVs will modify the UV value on pixel shader inputs (like base colour or emissive colour) if a texture coordinate node set to 0 is used there.
“The AppendVector expression allows you to combine channels together to create a vector with more channels than the original. For example, you can take two individual Constants values and append them to make a two-channel Constant2Vector value. This can be useful for reordering the channels within a single texture or for combining multiple grayscale textures into one RGB color texture.”
Append essentially makes a vector out of whatever you put into it, producing a value with more channels than the inputs. You could use this to pack different greyscale textures together and treat the result like a packed mask (Pmask) texture.
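In plain terms, Append is just channel concatenation; a minimal sketch (mask names are my own, for illustration):

```python
def append(*channels):
    # Combine scalar channels into one vector, like the AppendVector node
    return tuple(channels)

# Two constants become a two-channel value, as in the ratio/constant setup
uv = append(1.0, 0.5)

# Three greyscale masks packed together, like a channel-packed mask texture
roughness, metallic, ao = 0.8, 0.0, 1.0
packed = append(roughness, metallic, ao)
```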
The mesh is very simple. Because the shape of the ray is derived from the vertex shader, it is just a curved plane with enough geometry to stretch nicely.
Time to make my own version now!
Mobile Temple Demo. 2014. [Digital Download]. PC. Epic Games.
Chadwick, E. 2014. Vertex Colour [online]. Polycount Wiki. Available from: http://oldwiki.polycount.com/VertexColor [Accessed 3 October 2014].
Unreal Engine Documentation. 2014. Math Expressions [online]. Available from: [Accessed 3 October 2014].
Unreal Engine Documentation. 2014. Customised Uvs [online]. Available from: [Accessed 3 October 2014].