In this reprinted <a href="http://altdevblogaday.com/">#altdevblogaday</a> opinion piece, game programmer Simon Yeung describes how to generate vertex and pixel shader source code for different render passes by defining a surface shader.
August 3, 2012
In the last few weeks, I was busy rewriting my iPhone engine so that it can also run on the Windows platform (so that I can use Visual Studio instead of Xcode~) and, most importantly, so that I can play around with D3D11. During the rewrite, I wanted to improve the process of writing shaders so that I don't need to write similar shaders multiple times for each shader permutation (say, for each surface, I have to write a shader for static meshes, skinned meshes, instanced static meshes… multiplied by the number of render passes), and can instead focus on coding what the surface looks like. So I decided to write a shader generator that generates those shaders, similar to the surface shaders in Unity. I chose the surface shader approach instead of a graph-based approach like Unreal Engine's because, being a programmer, I feel more comfortable (and faster) writing code than dragging tree nodes around a GUI. In its current implementation, the shader generator can only generate vertex and pixel shaders for the light pre-pass renderer, which is the lighting model I used before.
To generate the target vertex and pixel shaders with the shader generator, we need to define what the surface looks like by writing a surface shader. In my version of the surface shader, I need to define three functions: a vertex function, a surface function, and a lighting function. The vertex function defines the vertex properties, such as position and texture coordinates.
VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
VTX_FUNC_OUTPUT output;
output.position = mul( float4(input.position, 1), worldViewProj );
output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
output.uv0 = input.uv0;
return output;
}
The surface function describes what the surface looks like by defining its diffuse color, glossiness, and surface normal.
SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
SUF_FUNC_OUTPUT output;
output.normal = input.normal;
output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
output.glossiness = glossiness;
return output;
}
Finally, the lighting function decides which lighting model is used to calculate the reflected color of the surface.
LIGHT_FUNC_OUTPUT lightFuncLPP(LIGHT_FUNC_INPUT input)
{
LIGHT_FUNC_OUTPUT output;
float4 lightColor = lightBuffer.Sample(samplerLinear, input.pxPos.xy * renderTargetSizeInv.xy );
output.color = float4(input.diffuse * lightColor.rgb, 1);
return output;
}
By defining the above functions, the writer of the surface shader only needs to fill in the output structure of each function, using the input structure together with some auxiliary functions and shader constants provided by the engine.
As you can see in the above code snippets, my surface shader just defines normal HLSL functions with fixed input and output structures. So to generate the vertex and pixel shaders, we just need to copy these functions into the target shader code, which then invokes the functions defined in the surface shader. Taking the above vertex function as an example, the generated vertex shader would look like:
#include "include.h"
struct VS_INPUT
{
float3 position : POSITION0;
float3 normal : NORMAL0;
float2 uv0 : UV0;
};
struct VS_OUTPUT
{
float4 position : SV_POSITION0;
float3 normal : NORMAL0;
float2 uv0 : UV0;
};
typedef VS_INPUT VTX_FUNC_INPUT;
typedef VS_OUTPUT VTX_FUNC_OUTPUT;
/********************* User Defined Content ********************/
VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
VTX_FUNC_OUTPUT output;
output.position = mul( float4(input.position, 1), worldViewProj );
output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
output.uv0 = input.uv0;
return output;
}
/******************** End User Defined Content *****************/
VS_OUTPUT main(VS_INPUT input)
{
return vtxFunc(input);
}
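As a rough illustration of this copy-and-wrap step, here is a minimal Python sketch (the names and template are my own, assuming the user-defined functions are carried around as plain strings; this is not the author's actual implementation):

```python
# Hypothetical sketch of the copy-and-wrap generation step: the user-defined
# vertex function, held as a plain string, is pasted between a fixed
# prologue (structs and typedefs) and epilogue (the main entry point).

VS_PROLOGUE = """#include "include.h"
struct VS_INPUT
{
    float3 position : POSITION0;
    float3 normal   : NORMAL0;
    float2 uv0      : UV0;
};
struct VS_OUTPUT
{
    float4 position : SV_POSITION0;
    float3 normal   : NORMAL0;
    float2 uv0      : UV0;
};
typedef VS_INPUT VTX_FUNC_INPUT;
typedef VS_OUTPUT VTX_FUNC_OUTPUT;
/********************* User Defined Content ********************/
"""

VS_EPILOGUE = """
/******************** End User Defined Content *****************/
VS_OUTPUT main(VS_INPUT input)
{
    return vtxFunc(input);
}
"""

def generate_vertex_shader(user_vertex_func):
    """Wrap the user's vtxFunc between the fixed prologue and epilogue."""
    return VS_PROLOGUE + user_vertex_func.strip() + VS_EPILOGUE
```

In a real generator, the struct members in the prologue would themselves be emitted per surface shader rather than hard-coded as they are here.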
During code generation, the shader generator needs to figure out which input and output structures to feed into the user-defined functions. This task is simple and can be accomplished with some string functions.
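For instance, the member names those structures must carry can be recovered by scanning the user function's source for accesses to its input and output variables. A minimal sketch in Python (a hypothetical helper of my own, assuming simple `var.member` accesses like those in the functions above):

```python
import re

def used_members(func_source, var):
    """Return the member names accessed through `var` in the given
    function source, in order of first use (e.g. input.normal -> normal)."""
    names = []
    for m in re.finditer(rf"\b{var}\.(\w+)", func_source):
        if m.group(1) not in names:
            names.append(m.group(1))
    return names
```

Running this over the vertex function above with `var="input"` would yield `position`, `normal`, and `uv0`, from which the `VS_INPUT` struct can be emitted.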
As I mentioned before, my shader generator is used for generating shaders for the light pre-pass renderer, which has two passes that need different shader inputs and outputs. For example, in the G-buffer pass the shaders are only interested in the surface normal data but not the diffuse color, while the data needed by the second geometry pass are the opposite. However, all the surface information (surface normal and diffuse color) is defined in the surface function inside the surface shader. If we simply generate shaders as in the last section, we will generate some redundant code that cannot be optimized away by the shader compiler. For example, the pixel shader in the G-buffer pass may need to sample the diffuse texture, which requires the texture coordinates input from the vertex shader; but since the diffuse color is not actually needed in this pass, the compiler may not be able to figure out that we don't need the texture coordinates output in the vertex shader. Of course, we could force the writer to put #if preprocessor directives inside the surface function for each particular render pass to eliminate the useless outputs, but this would complicate the surface shader authoring process: writing a surface shader to describe what the surface looks like ideally shouldn't involve worrying about the outputs of a particular render pass. So the problem is to figure out which output data are actually needed in a given pass, and to eliminate the outputs that are not. For example, suppose we are generating shaders for the G-buffer pass from this surface function:
SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
SUF_FUNC_OUTPUT output;
output.normal = input.normal;
output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
output.glossiness = glossiness;
return output;
}
We only want to keep the variables output.normal and output.glossiness; the variable output.diffuse, and the variables referenced only by output.diffuse (diffuseTex, samplerLinear, input.uv0), are to be eliminated. To find such variable dependencies, we need to teach the shader generator to understand HLSL grammar and find all the assignment statements and branching conditions, from which the variable dependencies can be derived. To do this, we need to generate an abstract syntax tree from the shader source code. Of course we could write our own LALR parser to achieve this, but I chose to use lex & yacc (or flex & bison) to generate the parse tree. Luckily we are working with a subset of HLSL syntax (we only need to define functions and don't need pointers), and HLSL syntax is similar to the C language, so modifying the ANSI C grammar rules for lex & yacc does the job. Here are my modified grammar rules used to generate the parse tree. By traversing the parse tree, the variable dependencies can be obtained, so we know which variables need to be eliminated; we eliminate them by taking out their assignment statements, and the compiler does the rest. Below is the simplified pixel shader generated for the previous example:
#include "include.h"
cbuffer _materialParam : register( MATERIAL_CONSTANT_BUFFER_SLOT_0 )
{
float glossiness;
};
Texture2D diffuseTex: register( MATERIAL_SHADER_RESOURCE_SLOT_0 );
struct PS_INPUT
{
float4 position : SV_POSITION0;
float3 normal : NORMAL0;
};
struct PS_OUTPUT
{
float4 gBuffer : SV_Target0;
};
struct SUF_FUNC_OUTPUT
{
float3 normal;
float glossiness;
};
typedef PS_INPUT SUF_FUNC_INPUT;
/********************* User Defined Content ********************/
SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
SUF_FUNC_OUTPUT output;
output.normal = input.normal;
;
output.glossiness = glossiness;
return output;
}
/******************** End User Defined Content *****************/
PS_OUTPUT main(PS_INPUT input)
{
SUF_FUNC_OUTPUT sufOut= sufFunc(input);
PS_OUTPUT output;
output.gBuffer= normalToGBuffer(sufOut.normal, sufOut.glossiness);
return output;
}
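The elimination pass itself can be illustrated with a deliberately simplified sketch. The real generator works on a yacc parse tree and also removes the variables a dropped statement depended on; this version only drops assignments to unneeded output members, which is enough to show the idea (all names here are my own):

```python
import re

def eliminate_outputs(func_body, unused_outputs):
    """Simplified stand-in for the parse-tree pass: drop any assignment
    whose left-hand side is an output member not needed by this render
    pass, leaving an empty statement in its place (as in the generated
    pixel shader shown above)."""
    kept = []
    for line in func_body.splitlines():
        m = re.match(r"\s*output\.(\w+)\s*=", line)
        if m and m.group(1) in unused_outputs:
            kept.append(";")  # assignment removed; empty statement remains
        else:
            kept.append(line)
    return "\n".join(kept)
```

Applied to the surface function above with `unused_outputs={"diffuse"}`, the `output.diffuse = diffuseTex.Sample(...)` line disappears, and once nothing references `diffuseTex`, `samplerLinear`, or `input.uv0`, those declarations can be dropped from the generated shader as well.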
Since I use lex & yacc to parse the surface shader, I can extend the surface shader syntax by adding more grammar rules, so that the writer of the surface shader can declare which shader constants and textures their surface functions need, and the generator emits the corresponding constant buffers and shader resources in the generated source code. My surface shader syntax also permits users to define their own structs and functions besides the three main functions (the vertex, surface, and lighting functions), and these are copied into the generated source code as well. Here is a sample of what my surface shader looks like:
RenderType{
opaque;
};
ShaderConstant
{
float glossiness: ui_slider_0_255_Glossiness;
};
TextureResource
{
Texture2D diffuseTex;
};
VTX_FUNC_OUTPUT vtxFunc(VTX_FUNC_INPUT input)
{
VTX_FUNC_OUTPUT output;
output.position = mul( float4(input.position, 1), worldViewProj );
output.normal = mul( worldInv, float4(input.normal, 0) ).xyz;
output.uv0 = input.uv0;
return output;
}
SUF_FUNC_OUTPUT sufFunc(SUF_FUNC_INPUT input)
{
SUF_FUNC_OUTPUT output;
output.normal = input.normal;
output.diffuse = diffuseTex.Sample( samplerLinear, input.uv0 ).rgb;
output.glossiness = glossiness;
return output;
}
LIGHT_FUNC_OUTPUT lightFuncLPP(LIGHT_FUNC_INPUT input)
{
LIGHT_FUNC_OUTPUT output;
float4 lightColor = lightBuffer.Sample(samplerLinear, input.pxPos.xy * renderTargetSizeInv.xy );
output.color = float4(input.diffuse * lightColor.rgb, 1);
return output;
}
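To show how such metadata blocks might be separated from the HLSL-like function definitions, here is a rough sketch (hypothetical names, assuming the simple brace-delimited block syntax in the sample above; the real generator handles this with extra lex & yacc grammar rules instead):

```python
import re

# Matches a bare identifier followed by a brace-delimited body, e.g.
# "ShaderConstant { ... };" -- good enough for the simple metadata
# blocks above, since function definitions have a parameter list
# between the name and the opening brace and so do not match.
BLOCK_RE = re.compile(r"(\w+)\s*\{([^}]*)\}\s*;?")

def extract_meta_blocks(source, block_names):
    """Split a surface shader into its metadata blocks (RenderType,
    ShaderConstant, TextureResource) and the remaining function text."""
    blocks, rest = {}, source
    for m in BLOCK_RE.finditer(source):
        name, body = m.group(1), m.group(2).strip()
        if name in block_names:
            blocks[name] = body
            rest = rest.replace(m.group(0), "", 1)
    return blocks, rest
```

The extracted `ShaderConstant` and `TextureResource` bodies would then drive the emission of the cbuffer and Texture2D declarations seen in the generated pixel shader earlier, while `rest` is handed to the HLSL-subset parser.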
Conclusions

This post described how I generate vertex and pixel shader source code for different render passes by defining a surface shader, which lets me avoid writing similar shaders multiple times and stop worrying about the particular shader inputs and outputs of each render pass. Currently, the shader generator can only generate vertex and pixel shaders in HLSL for static meshes in the light pre-pass renderer. The shader generator is still a work in progress: generating shader source code for the forward pass is incomplete, and domain, hull, and geometry shaders are not implemented. GLSL support is also missing, but it could (in theory…) be added by building a more sophisticated abstract syntax tree while parsing the surface shader, or by defining some new grammar rules in the surface shader (using lex & yacc) to make generating both HLSL and GLSL source code easier. But these are left for the future, as I still need to rewrite my engine and get it running again…
[1] Unity – Surface Shader Examples: http://docs.unity3d.com/Documentation/Components/SL-SurfaceShaderExamples.html
[2] Lex & Yacc Tutorial: http://epaperpress.com/lexandyacc/
[3] ANSI C grammar, Lex specification: http://www.lysator.liu.se/c/ANSI-C-grammar-l.html
[4] ANSI C Yacc grammar: http://www.lysator.liu.se/c/ANSI-C-grammar-y.html
[5] http://www.ibm.com/developerworks/opensource/library/l-flexbison/index.html
[6] http://www.gamedev.net/topic/200275-yaccbison-locations/

[This piece was reprinted from #AltDevBlogADay, a shared blog initiative started by @mike_acton devoted to giving game developers of all disciplines a place to motivate each other to write regularly about their personal game development passions.]