Creating a grass shader in the Unity engine

Original author: Erik Roystan Ross
  • Translation

This tutorial shows how to write a geometry shader that generates blades of grass from the vertices of an input mesh, and how to use tessellation to control the density of the grass.

The article describes the step-by-step process of writing a grass shader in Unity. The shader takes an input mesh and generates a blade of grass from each of its vertices using a geometry shader. For variety and realism, the blades will have randomized dimensions and rotation, and they will be affected by wind. To control the density of the grass, we will use tessellation to subdivide the input mesh. The grass will also be able to cast and receive shadows.

The finished project is provided at the end of the article. The completed shader file contains extensive comments to aid understanding.

Requirements


To complete this tutorial, you will need a working knowledge of the Unity engine and a basic understanding of shader syntax and functionality.

Download the starter project (.zip).

Getting to work


Download the starter project and open it in the Unity editor. Open the Main scene, then open the Grass shader in your code editor.

This file contains a shader that outputs a solid white color, along with some functions we will use in this tutorial. You will notice that these functions, together with the vertex shader, are contained in a CGINCLUDE block placed outside the SubShader. Code in this block is automatically included in every pass of the shader; this will come in handy later, because our shader will have several passes.

We'll start by writing a geometry shader that generates a triangle from each vertex on the surface of our mesh.

1. Geometry Shaders


Geometry shaders are an optional part of the rendering pipeline. They run after the vertex shader (or the tessellation stage, if tessellation is used) and before the vertices are processed for the fragment shader.


The Direct3D 11 graphics pipeline. Note that in this diagram the fragment shader is called the pixel shader.

Geometry shaders take a single primitive as input and can generate zero, one, or many primitives. We will start by writing a geometry shader that takes a vertex (or point) as input and outputs a single triangle representing a blade of grass.

// Add inside the CGINCLUDE block.
struct geometryOutput
{
	float4 pos : SV_POSITION;
};
[maxvertexcount(3)]
void geo(triangle float4 IN[3] : SV_POSITION, inout TriangleStream<geometryOutput> triStream)
{
}
…
// Add inside the SubShader Pass, just below the #pragma fragment frag line.
#pragma geometry geo

The code above declares a geometry shader named geo with two parameters. The first, triangle float4 IN[3], states that the shader takes a single triangle (made up of three points) as input. The second, an inout TriangleStream<geometryOutput>, sets the shader up to output a stream of triangles, where each vertex uses the geometryOutput structure to carry its data.

We said above that the shader would take a single vertex and output a blade of grass. Why, then, are we taking a triangle as input?
It would be cheaper to take a point as input. That could be done as follows.

void geo(point vertexOutput IN[1], inout TriangleStream<geometryOutput> triStream)

However, because our input mesh (in this case GrassPlane10x10, located in the Mesh folder) has a triangle mesh topology, taking points would create a mismatch between the topology of the input mesh and the expected input primitive. DirectX HLSL permits this, but OpenGL does not, so an error would be displayed.

Note also the attribute in square brackets above the function declaration: [maxvertexcount(3)]. It tells the GPU that we will emit at most 3 vertices (though we are not required to emit that many). We also make the SubShader use the geometry shader by declaring it inside the Pass.

Our geometry shader doesn't do anything yet; to draw a triangle, add the following code inside it.

geometryOutput o;
o.pos = float4(0.5, 0, 0, 1);
triStream.Append(o);
o.pos = float4(-0.5, 0, 0, 1);
triStream.Append(o);
o.pos = float4(0, 1, 0, 1);
triStream.Append(o);


This produces some very strange results. When you move the camera, it becomes clear that the triangle is being rendered in screen space. This makes sense: because the geometry shader runs right before vertex processing, it takes over from the vertex shader the responsibility of outputting vertices in clip space. We'll update our code to reflect this.

// Update the return call in the vertex shader.
//return UnityObjectToClipPos(vertex);
return vertex;
…
// Update each assignment of o.pos in the geometry shader.
o.pos = UnityObjectToClipPos(float4(0.5, 0, 0, 1));
…
o.pos = UnityObjectToClipPos(float4(-0.5, 0, 0, 1));
…
o.pos = UnityObjectToClipPos(float4(0, 1, 0, 1));


Now our triangle renders correctly in the world, but only one seems to be created. In fact, one triangle is drawn for each vertex of our mesh, but the positions we assign to the triangle's vertices are constant (they do not change for each input vertex), so all the triangles end up stacked on top of one another.

We will fix this by making the output vertex positions offsets relative to the input point.

// Add to the top of the geometry shader.
float3 pos = IN[0];
…
// Update each assignment of o.pos.
o.pos = UnityObjectToClipPos(pos + float3(0.5, 0, 0));
…
o.pos = UnityObjectToClipPos(pos + float3(-0.5, 0, 0));
…
o.pos = UnityObjectToClipPos(pos + float3(0, 1, 0));


Why don't some vertices create a triangle?

Although we declared the input primitive to be a triangle, we emit a blade of grass from only one of the triangle's three points, discarding the other two. We could emit a blade from all three input points instead, but then adjacent triangles would redundantly create blades on top of each other.

Alternatively, you could solve this by feeding the geometry shader meshes whose topology type is points.

Triangles are now drawn correctly, with their base at the vertex that emits them. Before moving on, deactivate the GrassPlane object in the scene and activate the GrassBall object. We want the grass to generate correctly on different kinds of surfaces, so it's important to test on meshes of various shapes.


So far, all the triangles are emitted in a single direction rather than outward from the surface of the sphere. To solve this, we will construct the blades of grass in tangent space.

2. Tangent space


Ideally, we would like to construct each blade of grass (setting its width, height, curvature, and rotation) without having to consider the angle of the surface it is emitted from. Simply put, we will define the blade in a space local to the vertex emitting it, and then transform it to be local to the mesh. This space is called tangent space.


In tangent space, the X, Y, and Z axes are defined relative to the normal and position of the surface (in our case, of the vertex).

Like any other space, we can define a vertex's tangent space by three vectors: right, forward, and up. Using these vectors, we can construct a matrix to rotate the blade of grass from tangent space to local space.

We can access two of these vectors, the normal and the tangent, by adding new fields to the vertex input data.

// Add to the CGINCLUDE block.
struct vertexInput
{
	float4 vertex : POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
};
struct vertexOutput
{
	float4 vertex : SV_POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
};
…
// Modify the vertex shader.
vertexOutput vert(vertexInput v)
{
	vertexOutput o;
	o.vertex = v.vertex;
	o.normal = v.normal;
	o.tangent = v.tangent;
	return o;
}
…
// Modify the input for the geometry shader. Note that the SV_POSITION semantic is removed.
void geo(triangle vertexOutput IN[3], inout TriangleStream<geometryOutput> triStream)
…
// Modify the existing line declaring pos.
float3 pos = IN[0].vertex;

The third vector can be calculated by taking the cross product of the other two. A cross product returns a vector perpendicular to both of its inputs.

// Place in the geometry shader, below the line declaring float3 pos.		
float3 vNormal = IN[0].normal;
float4 vTangent = IN[0].tangent;
float3 vBinormal = cross(vNormal, vTangent) * vTangent.w;

Why is the result of the cross product multiplied by the tangent's w coordinate?
When a mesh is exported from a 3D editor, it usually has the binormals (also called bitangents) already stored in the mesh data. Instead of importing these binormals, Unity simply takes the direction of each binormal and stores it in the tangent's w coordinate. This saves memory while still allowing the correct binormal to be reconstructed. A detailed discussion of this topic can be found here.

With all three vectors in hand, we can construct a matrix to transform between tangent and local space. We will multiply each vertex of the blade of grass by this matrix before passing it to UnityObjectToClipPos, which expects a vertex in local space.

// Add below the lines declaring the three vectors.
float3x3 tangentToLocal = float3x3(
	vTangent.x, vBinormal.x, vNormal.x,
	vTangent.y, vBinormal.y, vNormal.y,
	vTangent.z, vBinormal.z, vNormal.z
	);
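The construction above can be checked numerically outside the shader. The sketch below (a minimal numpy version, with a hypothetical normal and tangent rather than real mesh data) builds the same column-layout matrix and confirms that the tangent-space axes map onto the vertex's local-space frame:

```python
import numpy as np

# Hypothetical vertex data; in the shader these come from the mesh.
normal = np.array([0.0, 1.0, 0.0])    # vertex normal in local space
tangent = np.array([1.0, 0.0, 0.0])   # tangent.xyz
tangent_w = -1.0                      # Unity stores the binormal sign in tangent.w

# Same as: cross(vNormal, vTangent) * vTangent.w
binormal = np.cross(normal, tangent) * tangent_w

# Columns are tangent, binormal, normal -- the same layout as the float3x3 above.
tangent_to_local = np.column_stack((tangent, binormal, normal))

# The tangent-space X axis maps onto the tangent, and Z maps onto the normal.
print(tangent_to_local @ np.array([1.0, 0.0, 0.0]))  # -> the tangent
print(tangent_to_local @ np.array([0.0, 0.0, 1.0]))  # -> the normal
```

This is why, later in the tutorial, "up" for a blade of grass is the tangent-space Z axis: multiplying (0, 0, 1) by this matrix yields the surface normal.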

Before using the matrix, we'll move the vertex output code into a function so we don't have to write the same lines over and over. This is known as the DRY principle: don't repeat yourself.

// Add to the CGINCLUDE block.
geometryOutput VertexOutput(float3 pos)
{
	geometryOutput o;
	o.pos = UnityObjectToClipPos(pos);
	return o;
}
…
// Remove the following from the geometry shader.
//geometryOutput o;
//o.pos = UnityObjectToClipPos(pos + float3(0.5, 0, 0));
//triStream.Append(o);
//o.pos = UnityObjectToClipPos(pos + float3(-0.5, 0, 0));
//triStream.Append(o);
//o.pos = UnityObjectToClipPos(pos + float3(0, 1, 0));
//triStream.Append(o);
// ...and replace it with the code below.
triStream.Append(VertexOutput(pos + float3(0.5, 0, 0)));
triStream.Append(VertexOutput(pos + float3(-0.5, 0, 0)));
triStream.Append(VertexOutput(pos + float3(0, 1, 0)));

Finally, we multiply the output vertices by the tangentToLocal matrix, correctly aligning them with the normal of their input point.

triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(0.5, 0, 0))));
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(-0.5, 0, 0))));
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(0, 1, 0))));


This is closer to what we want, but still not quite right. The problem is that we originally assigned the up direction to the Y axis; in tangent space, however, up typically lies along the Z axis. Let's make that change now.

// Modify the position of the third vertex being emitted.
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(0, 0, 1))));


3. Appearance of grass


To make the triangles look more like blades of grass, we need to add color and variation. We'll begin by adding a gradient running from the tip of the blade down to its base.

3.1 Color gradient


Our goal is to let the artist set two colors, a top and a bottom, and to interpolate between them from the tip of the blade down to its base. These colors are already defined in the shader file as _TopColor and _BottomColor. To sample them properly, we need to pass UV coordinates to the fragment shader.

// Add to the geometryOutput struct.
float2 uv : TEXCOORD0;
…
// Modify the VertexOutput function signature.
geometryOutput VertexOutput(float3 pos, float2 uv)
…
// Add to VertexOutput, just below the line assigning o.pos.
o.uv = uv;
…
// Modify the existing lines in the geometry shader.
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(0.5, 0, 0)), float2(0, 0)));
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(-0.5, 0, 0)), float2(1, 0)));
triStream.Append(VertexOutput(pos + mul(tangentToLocal, float3(0, 0, 1)), float2(0.5, 1)));

We assign UV coordinates to the triangular blade so that the two base vertices sit at the bottom left and bottom right, and the tip sits at the top center.


The UV coordinates of a blade's three vertices. Although we are coloring the blades with a simple gradient, this layout would also allow a texture to be mapped onto them.

We can now sample the top and bottom colors in the fragment shader using the UV, interpolating between them with lerp. We also need to modify the fragment shader's parameters to take a geometryOutput as input rather than just a float4 position.

// Modify the function signature of the fragment shader.
float4 frag (geometryOutput i, fixed facing : VFACE) : SV_Target
…
// Replace the existing return call.
//return float4(1, 1, 1, 1);
return lerp(_BottomColor, _TopColor, i.uv.y);
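The lerp here is a plain linear blend driven by uv.y. A quick sketch outside the shader (the color values below are hypothetical placeholders, not the material's real _BottomColor/_TopColor):

```python
def lerp(a, b, t):
    """Linear interpolation, matching HLSL's lerp: a + t * (b - a)."""
    return tuple(x + t * (y - x) for x, y in zip(a, b))

# Hypothetical RGBA colors standing in for _BottomColor and _TopColor.
bottom = (0.1, 0.4, 0.1, 1.0)
top = (0.5, 0.9, 0.3, 1.0)

print(lerp(bottom, top, 0.0))  # base of the blade -> bottom color
print(lerp(bottom, top, 1.0))  # tip of the blade -> top color
print(lerp(bottom, top, 0.5))  # halfway up -> an even blend
```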


3.2 Random blade direction


To add variety and give the grass a more natural look, we will make each blade face a random direction. To do this, we need to construct a rotation matrix that rotates the blade by a random amount around its up axis.

There are two functions in the shader file that will help us do this: rand, which generates a random number from a three-dimensional input, and AngleAxis3x3, which takes an angle (in radians) and an axis, and returns a matrix that rotates by that angle around the axis. The latter works just like the C# function Quaternion.AngleAxis (except that AngleAxis3x3 returns a matrix, not a quaternion).
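Axis-angle rotation can be sketched outside the shader as well. The following Python version uses Rodrigues' rotation formula; it illustrates the same kind of math AngleAxis3x3 performs, but it is a sketch, not the project file's exact code:

```python
import math

def angle_axis_3x3(angle, axis):
    """3x3 rotation matrix about a unit axis (Rodrigues' formula).
    A sketch of axis-angle rotation, not the shader's exact implementation."""
    x, y, z = axis
    c, s = math.cos(angle), math.sin(angle)
    t = 1.0 - c
    return [
        [t*x*x + c,   t*x*y - s*z, t*x*z + s*y],
        [t*x*y + s*z, t*y*y + c,   t*y*z - s*x],
        [t*x*z - s*y, t*y*z + s*x, t*z*z + c],
    ]

def mul(m, v):
    """Matrix-vector product, like HLSL's mul(float3x3, float3)."""
    return tuple(sum(m[i][j] * v[j] for j in range(3)) for i in range(3))

# Rotating (1, 0, 0) a quarter turn around Z (the blade's up axis)
# should give approximately (0, 1, 0).
print(mul(angle_axis_3x3(math.pi / 2, (0.0, 0.0, 1.0)), (1.0, 0.0, 0.0)))
```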

The rand function returns a number in the range 0…1; we multiply this by 2π (UNITY_TWO_PI) to cover the full range of angles.

// Add below the line declaring the tangentToLocal matrix.
float3x3 facingRotationMatrix = AngleAxis3x3(rand(pos) * UNITY_TWO_PI, float3(0, 0, 1));

We use the input position pos as the seed for the random rotation. This way, each blade of grass gets its own rotation that stays consistent from frame to frame.
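The exact rand implementation lives in the project's shader file; what matters is that it is a deterministic hash. As an illustration (using the classic sin-dot-frac hash constants common in shaders, an assumption rather than the project's actual code), the key property can be sketched like this:

```python
import math

def rand(seed):
    """Deterministic pseudo-random value in [0, 1) from a 3D seed.
    A classic shader-style hash; the rand in the project file may differ,
    but the key property is the same: equal input -> equal output."""
    d = seed[0] * 12.9898 + seed[1] * 78.233 + seed[2] * 53.539
    return (math.sin(d) * 43758.5453) % 1.0

pos = (1.25, 0.0, -3.5)
a, b = rand(pos), rand(pos)
print(a == b)          # True: same seed, same value, so each blade's
                       # rotation is stable from frame to frame
print(0.0 <= a < 1.0)  # True: the output stays in [0, 1)
```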

The rotation can be applied to the blade by multiplying this new matrix with tangentToLocal. Note that matrix multiplication is not commutative; the order of the operands matters.
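Non-commutativity is easy to demonstrate with two simple rotations (a numpy sketch, unrelated to the shader's specific matrices):

```python
import numpy as np

# Two simple rotations: 90 degrees about Z and 90 degrees about X.
rot_z = np.array([[0.0, -1.0, 0.0],
                  [1.0,  0.0, 0.0],
                  [0.0,  0.0, 1.0]])
rot_x = np.array([[1.0, 0.0,  0.0],
                  [0.0, 0.0, -1.0],
                  [0.0, 1.0,  0.0]])

# Composing them in different orders gives different matrices,
# which is why the operand order in mul() matters.
print(np.allclose(rot_z @ rot_x, rot_x @ rot_z))  # False
```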

// Add below the line declaring facingRotationMatrix.
float3x3 transformationMatrix = mul(tangentToLocal, facingRotationMatrix);
…
// Replace the multiplication matrix operand with our new transformationMatrix.
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(0.5, 0, 0)), float2(0, 0)));
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(-0.5, 0, 0)), float2(1, 0)));
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(0, 0, 1)), float2(0.5, 1)));


3.3 Random forward bending


If all the blades of grass stand perfectly upright, they will look identical. That may suit well-groomed grass, like a trimmed lawn, but grass in nature doesn't grow that way. We will create a new matrix to rotate the blade around the X axis, plus a property to control that rotation.

// Add as a new property.
_BendRotationRandom("Bend Rotation Random", Range(0, 1)) = 0.2
…
// Add to the CGINCLUDE block.
float _BendRotationRandom;
…
// Add to the geometry shader, below the line declaring facingRotationMatrix.
float3x3 bendRotationMatrix = AngleAxis3x3(rand(pos.zzx) * _BendRotationRandom * UNITY_PI * 0.5, float3(-1, 0, 0));

Once again we use the blade's position as the random seed, this time swizzling it to get a unique seed. We also multiply UNITY_PI by 0.5, which gives us a random range of 0…90 degrees.

We apply this rotation matrix by multiplying it in, again in the correct order.

// Modify the existing line.
float3x3 transformationMatrix = mul(mul(tangentToLocal, facingRotationMatrix), bendRotationMatrix);


3.4 Width and height


So far, each blade of grass is fixed at a width of 1 unit and a height of 1 unit. We will add properties to control the size, along with properties that add random variation.

// Add as new properties.
_BladeWidth("Blade Width", Float) = 0.05
_BladeWidthRandom("Blade Width Random", Float) = 0.02
_BladeHeight("Blade Height", Float) = 0.5
_BladeHeightRandom("Blade Height Random", Float) = 0.3
…
// Add to the CGINCLUDE block.
float _BladeHeight;
float _BladeHeightRandom;	
float _BladeWidth;
float _BladeWidthRandom;
…
// Add to the geometry shader, above the triStream.Append calls.
float height = (rand(pos.zyx) * 2 - 1) * _BladeHeightRandom + _BladeHeight;
float width = (rand(pos.xzy) * 2 - 1) * _BladeWidthRandom + _BladeWidth;
…
// Modify the existing positions with our new height and width.
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(width, 0, 0)), float2(0, 0)));
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(-width, 0, 0)), float2(1, 0)));
triStream.Append(VertexOutput(pos + mul(transformationMatrix, float3(0, 0, height)), float2(0.5, 1)));


The triangles now look much more like blades of grass, but there are far too few of them. There simply aren't enough vertices in the input mesh to create the impression of a densely overgrown field.

One solution would be to create a new, denser mesh, either through C# or in a 3D editor. That would work, but it would not let us control the grass density dynamically. Instead, we will subdivide the input mesh using tessellation.

4. Tessellation


Tessellation is an optional stage of the rendering pipeline that runs after the vertex shader and before the geometry shader (if there is one). Its job is to subdivide a single input surface into many primitives. Tessellation is implemented through two programmable stages: the hull and domain shaders.

Unity has a built-in tessellation implementation for surface shaders. Since we aren't using surface shaders, however, we'll have to implement our own hull and domain shaders. This article will not cover the implementation of tessellation in detail; instead, we'll use the existing file CustomTessellation.cginc. This file is adapted from Catlike Coding, which is an excellent source of information on implementing tessellation in Unity.

If you enable the TessellationExample object in the scene, you will see that it already has a material implementing tessellation. Changing the material's Tessellation Uniform property demonstrates the subdivision effect.


We will implement tessellation in the grass shader to control the subdivision of the plane, and therefore the number of blades of grass generated. First we need to include the CustomTessellation.cginc file; we reference it by its path relative to the shader.

// Add inside the CGINCLUDE block, below the other #include statements.
#include "Shaders/CustomTessellation.cginc"

If you open CustomTessellation.cginc, you will notice that the vertexInput and vertexOutput structures and the vertex shader are already defined there. There's no need to redefine them in our grass shader; they can be deleted.

/*struct vertexInput
{
	float4 vertex : POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
};
struct vertexOutput
{
	float4 vertex : SV_POSITION;
	float3 normal : NORMAL;
	float4 tangent : TANGENT;
};
vertexOutput vert(vertexInput v)
{
	vertexOutput o;
	o.vertex = v.vertex;
	o.normal = v.normal;
	o.tangent = v.tangent;
	return o;
}*/

Note that the vertex shader vert in CustomTessellation.cginc simply passes its input straight through to the tessellation stage; constructing the vertexOutput structure is instead handled by the tessVert function, which is called inside the domain shader.

Now we can add the hull and domain shaders to the grass shader. We'll also add a new property, _TessellationUniform, to control the subdivision amount; the variable matching this property is already declared in CustomTessellation.cginc.

// Add as a new property.			
_TessellationUniform("Tessellation Uniform", Range(1, 64)) = 1
…
// Add below the other #pragma statements in the SubShader Pass.
#pragma hull hull
#pragma domain domain

Adjusting the Tessellation Uniform property now controls the density of the grass. I found that a value of 5 gives good results.


5. The wind


We will implement the wind by sampling a distortion texture. This texture works like a normal map, except it has only two channels instead of three. We will use these two channels as the X and Y directions of the wind.


Before sampling the wind texture, we need to construct a UV coordinate. Rather than using texture coordinates assigned to the mesh, we will use the input point's position. That way, if there are several grass meshes in the world, they will all appear to be part of the same wind system. We also use the built-in shader variable _Time to scroll the wind texture across the grass surface.

// Add as new properties.
_WindDistortionMap("Wind Distortion Map", 2D) = "white" {}
_WindFrequency("Wind Frequency", Vector) = (0.05, 0.05, 0, 0)
…
// Add to the CGINCLUDE block.
sampler2D _WindDistortionMap;
float4 _WindDistortionMap_ST;
float2 _WindFrequency;
…
// Add to the geometry shader, just above the line declaring the transformationMatrix.
float2 uv = pos.xz * _WindDistortionMap_ST.xy + _WindDistortionMap_ST.zw + _WindFrequency * _Time.y;

We apply _WindDistortionMap's scale and offset to the position, and then offset it further by _Time.y, scaled by _WindFrequency. We'll now use this UV to sample the texture, and create a property to control the wind strength.

// Add as a new property.
_WindStrength("Wind Strength", Float) = 1
…
// Add to the CGINCLUDE block.
float _WindStrength;
…
// Add below the line declaring float2 uv.
float2 windSample = (tex2Dlod(_WindDistortionMap, float4(uv, 0, 0)).xy * 2 - 1) * _WindStrength;

Note that we remap the sampled value from the texture's 0…1 range to the range -1…1. Next, we can construct a normalized vector representing the wind direction.

// Add below the line declaring float2 windSample.
float3 wind = normalize(float3(windSample.x, windSample.y, 0));

Now we can construct a matrix that rotates around this vector, and multiply it into our transformationMatrix.

// Add below the line declaring float3 wind.
float3x3 windRotation = AngleAxis3x3(UNITY_PI * windSample, wind);
…
// Modify the existing line.
float3x3 transformationMatrix = mul(mul(mul(tangentToLocal, windRotation), facingRotationMatrix), bendRotationMatrix);

Finally, assign the Wind texture (located at the root of the project) to the Wind Distortion Map field of the grass material in the Unity editor. Also set the texture's Tiling to 0.01, 0.01.


If the grass is not animating in the Scene view, click the Toggle skybox, fog, and various other effects button to enable animated materials.

From a distance the grass looks right, but up close you'll notice that the entire blade rotates, so its base is no longer attached to the ground.


The base of the blade is no longer attached to the ground: it either intersects it (shown in red) or hangs above the ground plane (indicated by the green line).

We will fix this by defining a second transformation matrix that applies only to the two base vertices. This matrix will exclude windRotation and bendRotationMatrix, keeping the base attached to the surface.

// Add below the line declaring float3x3 transformationMatrix.
float3x3 transformationMatrixFacing = mul(tangentToLocal, facingRotationMatrix);
…
// Modify the existing lines outputting the base vertex positions.
triStream.Append(VertexOutput(pos + mul(transformationMatrixFacing, float3(width, 0, 0)), float2(0, 0)));
triStream.Append(VertexOutput(pos + mul(transformationMatrixFacing, float3(-width, 0, 0)), float2(1, 0)));

6. Curvature of blades of grass


Right now, each blade of grass is defined by a single triangle. At a distance that's not a problem, but up close the blades look rigid and geometric rather than organic and alive. We will fix this by building each blade from several triangles and bending it along a curve.

Each blade of grass will be divided into a number of segments. Each segment is a rectangle made of two triangles, except for the top segment, which is a single triangle forming the blade's tip.

So far we have emitted only three vertices, making a single triangle. If we emit more vertices, how does the geometry shader know which ones to join into triangles? The answer lies in the triangle strip data structure. The first three vertices form a triangle, and every vertex after that forms a new triangle with the previous two.


A subdivided blade of grass, represented as a triangle strip and created one vertex at a time. After the first three vertices, each new vertex forms a new triangle with the previous two vertices.
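The strip-to-triangles rule is simple enough to sketch directly (a small Python illustration of the indexing, independent of the shader):

```python
def strip_triangles(vertex_count):
    """Triangles formed by a triangle strip of `vertex_count` vertices:
    the first three vertices form a triangle, and every subsequent
    vertex forms a triangle with the previous two."""
    return [(i, i + 1, i + 2) for i in range(vertex_count - 2)]

# A blade with 3 segments is emitted as 2 * 3 + 1 = 7 strip vertices.
blade_segments = 3
vertices = 2 * blade_segments + 1
print(len(strip_triangles(vertices)))  # 5 triangles from 7 vertices
print(strip_triangles(4))              # [(0, 1, 2), (1, 2, 3)]
```

Note that a strip encodes n - 2 triangles with only n vertices, versus 3 vertices per triangle for a plain triangle list.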

This is not only more memory-efficient, but also makes it easy to build sequences of triangles in code. If we wanted to create several separate strips of triangles, we could call the TriangleStream's RestartStrip function. Before we can emit more vertices from the geometry shader, we need to increase maxvertexcount. We will use a #define to let the shader author control the number of segments, and calculate the number of emitted vertices from it.

// Add to the CGINCLUDE block.
#define BLADE_SEGMENTS 3
…
// Modify the existing line defining the maxvertexcount.
[maxvertexcount(BLADE_SEGMENTS * 2 + 1)]

We initially set the number of segments to 3, and update maxvertexcount to compute the vertex count from the number of segments.

To create a segmented blade of grass, we will use a for loop. Each iteration of the loop adds two vertices: a left and a right. After the loop completes, we add one final vertex at the tip of the blade.

Before we do that, it will be useful to move some of the code that computes the grass blade's vertex positions into a function, since we'll use it several times both inside and outside the loop. Add the following to the CGINCLUDE block:

geometryOutput GenerateGrassVertex(float3 vertexPosition, float width, float height, float2 uv, float3x3 transformMatrix)
{
	float3 tangentPoint = float3(width, 0, height);
	float3 localPosition = vertexPosition + mul(transformMatrix, tangentPoint);
	return VertexOutput(localPosition, uv);
}

This function does the same work as before: given a position, width, and height, it transforms the vertex correctly using the provided matrix and assigns it a UV coordinate, then passes the result to VertexOutput. We'll update the existing code to use this function.

// Update the existing code outputting the vertices.
triStream.Append(GenerateGrassVertex(pos, width, 0, float2(0, 0), transformationMatrixFacing));
triStream.Append(GenerateGrassVertex(pos, -width, 0, float2(1, 0), transformationMatrixFacing));
triStream.Append(GenerateGrassVertex(pos, 0, height, float2(0.5, 1), transformationMatrix));

With the function working correctly, we're ready to move the vertex generation code into a for loop. Add the following below the line declaring float width:

for (int i = 0; i < BLADE_SEGMENTS; i++)
{
	float t = i / (float)BLADE_SEGMENTS;
}

We declare a loop that runs once for each segment of the blade. Inside the loop we add a variable t, which holds a value in the range 0…1 indicating how far up the blade we are. We use this value to compute the segment's width and height at each iteration of the loop.

// Add below the line declaring float t.
float segmentHeight = height * t;
float segmentWidth = width * (1 - t);

As we move up the blade, the height increases and the width decreases. We can now add GenerateGrassVertex calls to the loop to append vertices to the triangle stream, plus one GenerateGrassVertex call outside the loop to create the tip of the blade.
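The loop's shape can be previewed outside the shader. The sketch below mirrors the segment loop in plain Python (2D offsets only; rotation, curvature, and the transform matrices are left out), using placeholder width and height values:

```python
def blade_vertices(width, height, segments=3):
    """Per-vertex (width offset, height) pairs for a segmented blade,
    mirroring the geometry shader's loop: each segment emits a left and
    a right vertex that narrow and rise with t, plus a single tip vertex."""
    verts = []
    for i in range(segments):
        t = i / segments
        seg_height = height * t
        seg_width = width * (1 - t)
        verts.append(( seg_width, seg_height))
        verts.append((-seg_width, seg_height))
    verts.append((0.0, height))  # the tip, emitted after the loop
    return verts

verts = blade_vertices(width=0.05, height=0.5)
print(len(verts))  # 2 * 3 + 1 = 7 vertices, matching maxvertexcount
print(verts[0])    # full width at the base: (0.05, 0.0)
print(verts[-1])   # zero width at the tip: (0.0, 0.5)
```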

// Add below the line declaring float segmentWidth.
float3x3 transformMatrix = i == 0 ? transformationMatrixFacing : transformationMatrix;
triStream.Append(GenerateGrassVertex(pos, segmentWidth, segmentHeight, float2(0, t), transformMatrix));
triStream.Append(GenerateGrassVertex(pos, -segmentWidth, segmentHeight, float2(1, t), transformMatrix));
…
// Add just below the loop to insert the vertex at the tip of the blade.
triStream.Append(GenerateGrassVertex(pos, 0, height, float2(0.5, 1), transformationMatrix));
…
// Remove the existing calls to triStream.Append.
//triStream.Append(GenerateGrassVertex(pos, width, 0, float2(0, 0), transformationMatrixFacing));
//triStream.Append(GenerateGrassVertex(pos, -width, 0, float2(1, 0), transformationMatrixFacing));
//triStream.Append(GenerateGrassVertex(pos, 0, height, float2(0.5, 1), transformationMatrix));

Take a look at the line declaring float3x3 transformMatrix: here we select one of the two transformation matrices, taking transformationMatrixFacing for the base vertices and transformationMatrix for all the others.


The blades are now divided into many segments, but the surface of each blade is still flat; the new triangles aren't being put to use yet. We will add curvature to the blade by offsetting its vertices along the Y axis. First, we need to modify GenerateGrassVertex so that it takes a Y offset, which we'll call forward.

// Update the function signature of GenerateGrassVertex.
geometryOutput GenerateGrassVertex(float3 vertexPosition, float width, float height, float forward, float2 uv, float3x3 transformMatrix)
…
// Modify the Y coordinate assignment of tangentPoint.
float3 tangentPoint = float3(width, forward, height);

To calculate each vertex's offset, we feed t into the pow function. Raising t to a power makes its effect on the forward offset nonlinear, turning the blade of grass into a curve.
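The effect of pow on the offsets is easy to see numerically (a sketch with placeholder values matching the default properties below):

```python
def forward_offsets(forward, curve, segments=3):
    """Forward (Y) offset per segment boundary: pow(t, curve) * forward.
    curve > 1 keeps the base nearly straight and bends the tip over."""
    return [(i / segments) ** curve * forward for i in range(segments + 1)]

# With curve = 2 the offsets grow quadratically rather than linearly,
# so most of the bend happens near the tip.
print(forward_offsets(forward=0.38, curve=2))
# Linear falloff (curve = 1) for comparison:
print(forward_offsets(forward=0.38, curve=1))
```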

// Add as new properties.
_BladeForward("Blade Forward Amount", Float) = 0.38
_BladeCurve("Blade Curvature Amount", Range(1, 4)) = 2
…
// Add to the CGINCLUDE block.
float _BladeForward;
float _BladeCurve;
…
// Add inside the geometry shader, below the line declaring float width.
float forward = rand(pos.yyz) * _BladeForward;
…
// Add inside the loop, below the line declaring segmentWidth.
float segmentForward = pow(t, _BladeCurve) * forward;
…
// Modify the GenerateGrassVertex calls inside the loop.
triStream.Append(GenerateGrassVertex(pos, segmentWidth, segmentHeight, segmentForward, float2(0, t), transformMatrix));
triStream.Append(GenerateGrassVertex(pos, -segmentWidth, segmentHeight, segmentForward, float2(1, t), transformMatrix));
…
// Modify the GenerateGrassVertex calls outside the loop.
triStream.Append(GenerateGrassVertex(pos, 0, height, forward, float2(0.5, 1), transformationMatrix));

That's a fairly large chunk of code, but all of it works much like what we did for the blade's width and height. Lower values of _BladeForward and _BladeCurve produce an orderly, well-groomed lawn, while larger values give the opposite effect.


7. Lighting and shadows


As the final step in completing the shader, we will add the ability to cast and receive shadows. We'll also add simple lighting from the main directional light.

7.1 Casting Shadows


To cast shadows in Unity, a shader needs a second pass. This pass is used by shadow-casting lights in the scene to render the grass's depth into their shadow maps. That means the geometry shader must run in the shadow pass as well, so that the blades of grass can cast shadows.

Since the geometry shader is written inside the CGINCLUDE block, we can use it in any pass in the file. Create a second pass that uses the same shaders as the first, except for the fragment shader: we will define a new one containing a macro that handles the shadow caster output.

// Add below the existing Pass.
Pass
{
	Tags
	{
		"LightMode" = "ShadowCaster"
	}
	CGPROGRAM
	#pragma vertex vert
	#pragma geometry geo
	#pragma fragment frag
	#pragma hull hull
	#pragma domain domain
	#pragma target 4.6
	#pragma multi_compile_shadowcaster
	float4 frag(geometryOutput i) : SV_Target
	{
		SHADOW_CASTER_FRAGMENT(i)
	}
	ENDCG
}

In addition to the new fragment shader, there are a couple of important differences in this pass. The LightMode tag is set to ShadowCaster rather than ForwardBase, which tells Unity that this pass should be used to render the object into shadow maps. There is also a multi_compile_shadowcaster preprocessor directive; it ensures that the shader compiles all the variants needed to cast shadows.

Activate the Fence game object in the scene; this gives us a surface onto which the blades of grass can cast shadows.


7.2 Getting Shadows


After Unity renders the shadow map from the point of view of a shadow-casting light, it runs a pass that "collects" the shadows into a screen-space texture. To sample this texture, we need to calculate the vertices' screen-space positions and pass them to the fragment shader.

// Add to the geometryOutput struct.
unityShadowCoord4 _ShadowCoord : TEXCOORD1;
…
// Add to the VertexOutput function, just above the return call.	
o._ShadowCoord = ComputeScreenPos(o.pos);

In the ForwardBase pass's fragment shader, we can use a macro to get a float value indicating whether the surface is in shadow. This value lies in the range 0…1, where 0 is fully shadowed and 1 is fully lit.

Why is the screen-space UV coordinate called _ShadowCoord? This doesn't follow our earlier naming conventions.
Many built-in Unity shaders make assumptions about the names of certain fields in shader structures (some even about the names of the structures themselves). The same applies to the SHADOW_ATTENUATION macro used below. If we look at this macro's source in AutoLight.cginc, we can see that the shadow coordinate must have this specific name.

#define SHADOW_ATTENUATION(a) unitySampleShadow(a._ShadowCoord)

If we wanted to use a different name for this coordinate, or needed to for some reason, we could simply copy this definition into our own shader and modify it.

// Add to the ForwardBase pass's fragment shader, replacing the existing return call.
return SHADOW_ATTENUATION(i);
//return lerp(_BottomColor, _TopColor, i.uv.y);

Finally, we need to set the shader up correctly to receive shadows. To do this, we add a preprocessor directive to the ForwardBase pass so that it compiles all the necessary shader variants.

// Add to the ForwardBase pass's preprocessor directives, below #pragma target 4.6.
#pragma multi_compile_fwdbase


Bringing the camera closer, we can see artifacts on the surfaces of the blades; they are caused by individual blades of grass casting shadows on themselves. We can fix this by applying a linear bias, offsetting the clip-space vertex positions slightly away from the screen. We will use a built-in Unity macro for this and wrap it in an #if directive so that the operation is only performed in the shadow pass.

// Add at the end of the VertexOutput function, just above the return call.
#if UNITY_PASS_SHADOWCASTER
	// Applying the bias prevents artifacts from appearing on the surface.
	o.pos = UnityApplyLinearShadowBias(o.pos);
#endif


After applying the linear shadow bias, the striped shadow artifacts disappear from the surfaces of the triangles.
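The self-shadowing problem can be sketched numerically. In the toy Python example below (an illustration, not Unity code, with a made-up quantization error), a surface compares its own depth against the value stored in the shadow map; limited precision can make the surface appear to lie behind itself, and pushing the stored depth away by a bias larger than the error removes the false self-shadowing:

```python
# Toy illustration of shadow acne and depth bias.
# A shadow map stores depth with limited precision; when a surface
# tests its own depth against the stored value, rounding error can
# make the surface appear to be behind itself, i.e. in shadow.

def in_shadow(surface_depth, stored_depth, bias=0.0):
    """True if the surface is considered shadowed when sampling.
    The bias pushes the caster's stored depth away, analogous to
    what a linear shadow bias does in clip space."""
    return surface_depth > stored_depth + bias

true_depth = 5.0
quantization_error = 0.001           # hypothetical shadow-map rounding
stored = true_depth - quantization_error

# Without a bias the surface shadows itself (the striped artifact):
print(in_shadow(true_depth, stored))
# With a bias larger than the error, the artifact disappears:
print(in_shadow(true_depth, stored, bias=0.005))
```

The trade-off is that too large a bias detaches shadows from their casters ("peter-panning"), which is why engines expose the bias as a tunable value.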

Why are there artifacts along the edges of the shaded blades of grass?

Even when multisample anti-aliasing (MSAA) is enabled, Unity does not apply anti-aliasing to the scene's depth texture, which is used to build the screen-space shadow map. Therefore, when the anti-aliased scene samples the non-anti-aliased shadow map, artifacts appear.

One solution is anti-aliasing applied as a post-processing effect, available in Unity's post-processing package. However, post-process anti-aliasing is sometimes not an option (for example, when working with virtual reality); alternative solutions are discussed in this Unity forums thread.

7.3 Lighting


We will implement lighting using a very simple and common diffuse lighting calculation:

I = N · L

where N is the surface normal, L is the normalized direction of the main directional light, and I is the computed light intensity. In this tutorial we will not implement indirect lighting.
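As a small illustration of this diffuse calculation (plain Python with a hand-rolled dot product, not any shader API), the intensity falls off as the angle between the normal and the light direction grows, and is clamped to zero for surfaces facing away from the light:

```python
import math

# Lambert diffuse: I = saturate(N . L), matching the NdotL term
# used in the fragment shader later in this section.

def saturate(x):
    """Clamp to the 0..1 range, like HLSL's saturate()."""
    return max(0.0, min(1.0, x))

def diffuse_intensity(normal, light_dir):
    """N and L are assumed to be normalized 3-vectors."""
    n_dot_l = sum(n * l for n, l in zip(normal, light_dir))
    return saturate(n_dot_l)

up = (0.0, 1.0, 0.0)
s = math.sin(math.radians(45.0))
c = math.cos(math.radians(45.0))

print(diffuse_intensity(up, (0.0, 1.0, 0.0)))            # light from directly above: 1.0
print(round(diffuse_intensity(up, (s, c, 0.0)), 3))      # light at 45 degrees: 0.707
print(diffuse_intensity(up, (0.0, -1.0, 0.0)))           # light from below: clamped to 0.0
```

The clamp is what keeps back-facing geometry black rather than negatively lit, which matters for the double-sided grass blades below.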

At the moment, the grass blade vertices have no normals assigned. As with the vertex positions, we will first define the normals in tangent space and then transform them into local space.

When Blade Curvature Amount is 1, all blade normals in tangent space point in the same direction: straight down the negative Y axis. As a first pass at the solution, we will calculate the normal assuming there is no curvature.

// Add to the GenerateGrassVertex function, below the line declaring tangentPoint.
float3 tangentNormal = float3(0, -1, 0);
float3 localNormal = mul(transformMatrix, tangentNormal);

tangentNormal, defined as pointing down the negative Y axis, is transformed by the same matrix we used to transform the tangent points into local space. We can now pass it to the VertexOutput function, and from there into the geometryOutput structure.

// Modify the return call in GenerateGrassVertex.
return VertexOutput(localPosition, uv, localNormal);
…
// Add to the geometryOutput struct.
float3 normal : NORMAL;
…
// Modify the existing function signature.
geometryOutput VertexOutput(float3 pos, float2 uv, float3 normal)
…
// Add to the VertexOutput function to pass the normal through to the fragment shader.
o.normal = UnityObjectToWorldNormal(normal);

Note that before outputting the normal, we transform it into world space; Unity supplies the direction of the main directional light to shaders in world space, so this transformation is necessary.
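The reason a dedicated helper like UnityObjectToWorldNormal exists, rather than transforming normals by the model matrix directly, is that under a non-uniform scale a directly transformed normal is no longer perpendicular to the surface; normals must be transformed by the inverse transpose of the matrix instead. A minimal Python sketch with a diagonal (non-uniform) scale matrix shows the difference:

```python
# Why normals need the inverse-transpose transform.
# Under a non-uniform scale, transforming a normal like a position
# direction breaks its perpendicularity to the surface; the inverse
# transpose restores it. For a diagonal scale matrix, the inverse
# transpose is simply the reciprocal of each diagonal entry.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

scale = (2.0, 1.0, 1.0)            # non-uniform object-to-world scale

tangent = (1.0, 1.0, 0.0)          # a direction lying in the surface
normal = (1.0, -1.0, 0.0)          # perpendicular to the tangent
assert dot(tangent, normal) == 0.0

# Transform the tangent like any direction:
tangent_ws = tuple(s * t for s, t in zip(scale, tangent))

# Naive: transform the normal the same way -- no longer perpendicular.
naive_normal = tuple(s * n for s, n in zip(scale, normal))
print(dot(tangent_ws, naive_normal))    # nonzero: 3.0

# Correct: use the inverse transpose (1/s on the diagonal).
fixed_normal = tuple(n / s for s, n in zip(scale, normal))
print(dot(tangent_ws, fixed_normal))    # 0.0: perpendicular again
```

For purely uniform scales and rotations the two transforms agree, which is why the bug only shows up on non-uniformly scaled objects.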

Now we can visualize the normals in the ForwardBase fragment shader to verify our work.

// Add to the ForwardBase fragment shader.
float3 normal = facing > 0 ? i.normal : -i.normal;
return float4(normal * 0.5 + 0.5, 1);
// Remove the existing return call.
//return SHADOW_ATTENUATION(i);

Since Cull is set to Off in our shader, both sides of each blade are rendered. To make the normal face the correct direction, we use the VFACE semantic parameter that we added to the fragment shader.

The fixed facing argument is positive when the front face of the surface is being rendered and negative for the back face. We use it in the code above to flip the normal when necessary.


When Blade Curvature Amount is greater than 1, the tangent-space Z position of each vertex is offset by the forward amount passed to GenerateGrassVertex. We will use this value to proportionally scale the Z component of the normal.

// Modify the existing line in GenerateGrassVertex.
float3 tangentNormal = normalize(float3(0, -1, forward));

Finally, add code to the fragment shader that combines the shadows, directional lighting, and ambient lighting. For a more detailed look at implementing custom lighting in shaders, see my tutorial on toon shaders.

// Add to the ForwardBase fragment shader, below the line declaring float3 normal.
float shadow = SHADOW_ATTENUATION(i);
float NdotL = saturate(saturate(dot(normal, _WorldSpaceLightPos0)) + _TranslucentGain) * shadow;
float3 ambient = ShadeSH9(float4(normal, 1));
float4 lightIntensity = NdotL * _LightColor0 + float4(ambient, 1);
float4 col = lerp(_BottomColor, _TopColor * lightIntensity, i.uv.y);
return col;
// Remove the existing return call.
//return float4(normal * 0.5 + 0.5, 1);
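The combination above can be paraphrased in Python to see how each term contributes. All colors and values below are made up for illustration (not the project's actual _TopColor/_BottomColor settings); colors are RGBA tuples:

```python
# Paraphrase of the final fragment color math:
#   NdotL          = saturate(saturate(N . L) + translucentGain) * shadow
#   lightIntensity = NdotL * lightColor + ambient
#   col            = lerp(bottomColor, topColor * lightIntensity, uv.y)

def saturate(x):
    return max(0.0, min(1.0, x))

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def scale(color, k):
    return tuple(k * c for c in color)

def add(a, b):
    return tuple(x + y for x, y in zip(a, b))

def mul(a, b):
    return tuple(x * y for x, y in zip(a, b))

def grass_color(n_dot_l, shadow, translucent_gain, uv_y,
                light_color, ambient, bottom, top):
    """Combines the shadow, diffuse and ambient terms as in the shader."""
    ndotl = saturate(saturate(n_dot_l) + translucent_gain) * shadow
    light_intensity = add(scale(light_color, ndotl), ambient)
    return lerp(bottom, mul(top, light_intensity), uv_y)

# Hypothetical example values:
white = (1.0, 1.0, 1.0, 1.0)
bottom = (0.1, 0.4, 0.1, 1.0)
top = (0.5, 0.9, 0.4, 1.0)
ambient = (0.2, 0.2, 0.2, 1.0)

# A fully lit blade: the tip (uv_y = 1) gets the lit top color,
# while the base (uv_y = 0) stays at the unlit bottom color.
print(grass_color(1.0, 1.0, 0.0, 1.0, white, ambient, bottom, top))
print(grass_color(1.0, 1.0, 0.0, 0.0, white, ambient, bottom, top))
```

Note that the gradient is driven by uv.y, so only the top color is multiplied by the light intensity; a fully shadowed blade (shadow = 0) still receives the ambient term at its tip.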


Conclusion


In this tutorial, the grass covers a small 10x10-unit area. For the shader to cover large open spaces while maintaining good performance, optimizations are needed. You could apply distance-based tessellation so that fewer blades are rendered far from the camera. In addition, at long distances, groups of blades could be drawn as a single textured quad instead of individual blades of grass.


The grass texture included in the Unity engine's Standard Assets package. Many blades of grass are drawn on a single quad, reducing the number of triangles in the scene.

Although geometry shaders cannot natively be used together with surface shaders, if you need the standard Unity lighting model to improve or extend lighting and shading, you can study this GitHub repository, which demonstrates a solution using deferred rendering and manually filling the G-buffers.

Shader source code in the GitHub repository

Addendum: interactivity


Without interactivity, graphical effects can seem static or lifeless to players. This tutorial is already very long, so I did not include a section on world objects interacting with the grass.

A naive implementation of interactive grass would consist of two components: something in the game world that passes data to the shader describing where interaction with the grass is occurring, and code in the shader that interprets this data.

An example of how this can be implemented with water is shown here. It could be adapted to grass: instead of drawing ripples where the character stands, you could bend the blades of grass downward to simulate the effect of footsteps.
