You can recreate some of Unity Pro's features in Unity Free, given the time and know-how. Here, I demonstrate how you can do Level of Detail meshes, limited RenderTextures/Post-Processing, and LightProbes.
Unity Pro is a wonderful piece of software, but that initial price tag of $1,500 US is somewhat high. And if you’re working on a mobile title, that number gets multiplied by the number of platforms you’re targeting, which, let’s face it, is a tad daunting in these days of indie market saturation and publishing-by-popularity-contest.
That means that Unity Pro might not be a prudent investment for you. However, you can still create a feature-rich, polished game with Unity Free if you keep your expectations reasonable and your design document conservative. Heck, you can also recreate many of Pro’s features in Free if you’ve got the time and the know-how!
Unity Pro’s top-shelf features come in three varieties:
1. Making Your Life Easier
2. Available in Free but optimized in Pro
3. Completely unavailable in Free
Category (1) includes stuff that Pro gives you but can be custom-built in Free, like pathfinding. Category (2) includes stuff that can be roughly approximated in Free but can only be used to its true extent in Pro, like RenderTextures. And Category (3) includes stuff that Unity Free will flat-out not let you use, like native plugins or the Stencil Buffer.
That leaves Categories 1 and 2 as fair game. Here are a few examples of what you can pull off:
Level of Detail Meshes
LODs are a simple matter of an object storing information on multiple meshes. To start, you need a MeshFilter to store mesh info and a MeshRenderer or SkinnedMeshRenderer to display it. After that, creating a simple MyLOD component to judge distance to the camera and swap between meshes is all you need to get the job done.
Component Architecture
References to each Mesh desired
References to each distance level desired
Do a distance check (or a cheaper squared-distance check via Vector3’s sqrMagnitude) in the Update loop to toggle which Mesh the MeshFilter is using
Consider using ExecuteInEditMode to see realtime LOD changes while in the Editor (see the sketch below)
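Here’s a minimal sketch of that component. The names (MyLOD, meshes, distances) and the two-array setup are placeholders, but the core idea is just a squared-distance check and a mesh swap:

using UnityEngine;

// Minimal sketch of the MyLOD component described above. The names and the
// array-based setup are placeholders; a production version would want more
// validation and probably some hysteresis between levels.
[ExecuteInEditMode]
public class MyLOD : MonoBehaviour
{
    public Mesh[] meshes;       // ordered near-to-far, one per detail level
    public float[] distances;   // distance threshold for each detail level

    private MeshFilter meshFilter;

    void Update()
    {
        if (meshFilter == null)
            meshFilter = GetComponent<MeshFilter>();
        if (meshFilter == null || Camera.main == null ||
            meshes == null || meshes.Length == 0 || distances == null)
            return;

        // Squared-distance check: cheaper than Vector3.Distance because it
        // skips the square root.
        float sqrDist = (Camera.main.transform.position - transform.position).sqrMagnitude;

        int level = meshes.Length - 1;
        for (int i = 0; i < distances.Length && i < meshes.Length; i++)
        {
            if (sqrDist < distances[i] * distances[i])
            {
                level = i;
                break;
            }
        }

        if (meshFilter.sharedMesh != meshes[level])
            meshFilter.sharedMesh = meshes[level];
    }
}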
Limitations
Unity Pro can automatically bake lighting on all meshes assigned to an LOD. To duplicate that in Unity Free, you’ll have to place multiple mesh GameObjects in the scene, bake their lighting individually, and then toggle the GameObjects’ visibility at runtime. Tedious but doable.
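If you go that route, the runtime side is even simpler than the mesh swap, since you’re just toggling pre-baked GameObjects on and off. A rough sketch, assuming each detail level is a pre-placed object with its lighting already baked:

using UnityEngine;

// Rough sketch of the baked-lighting variant: each detail level is a
// pre-placed, individually baked GameObject, and we toggle visibility
// instead of swapping meshes.
public class MyBakedLOD : MonoBehaviour
{
    public GameObject[] levels;   // ordered near-to-far, lighting pre-baked
    public float[] distances;     // distance threshold for each level

    void Update()
    {
        if (Camera.main == null || levels == null || levels.Length == 0)
            return;

        float sqrDist = (Camera.main.transform.position - transform.position).sqrMagnitude;

        int active = levels.Length - 1;
        for (int i = 0; i < distances.Length && i < levels.Length; i++)
        {
            if (sqrDist < distances[i] * distances[i]) { active = i; break; }
        }

        for (int i = 0; i < levels.Length; i++)
            levels[i].SetActive(i == active);
    }
}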
RenderTextures/Post-Processing Effects
Unity Free’s lack of RenderTextures is a bitter pill to swallow for veteran programmers accustomed to complete freedom in this category. And although they can be approximated in Free, the approximation isn’t optimized, and you probably won’t get decent realtime performance out of it.
But they are possible, and will leave you with a nice Texture2D that you can do some post-processing on.
Component Architecture
Create a Coroutine named “RenderToTexture”
Immediately yield the coroutine with a new WaitForEndOfFrame(). This’ll ensure that the screen has finished rendering and you can do a full capture.
Create a new Texture2D and set it to your screen’s resolution
Run ReadPixels on your new Texture2D to copy the screen into it, then call Apply to upload the pixels to the GPU
End the Coroutine and return to wherever you started
Run a custom post-processing shader on your new Texture2D (these steps are sketched below)
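Put together, those steps look something like this. The component and field names are mine:

using System.Collections;
using UnityEngine;

// Sketch of the capture steps above. The pixels live on the CPU until
// Apply() uploads them, which is exactly why this hitches at runtime.
public class ScreenCapturer : MonoBehaviour
{
    public Texture2D captured;   // the most recent screen grab

    public IEnumerator RenderToTexture()
    {
        // Wait until the frame has finished rendering so we get a full capture.
        yield return new WaitForEndOfFrame();

        captured = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);

        // Copy the back buffer into the texture, then upload it to the GPU.
        captured.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
        captured.Apply();

        // From here you can run your post-processing on 'captured'.
    }
}

Kick it off with StartCoroutine(RenderToTexture()) from wherever you need a capture; execution resumes at the yield point once the frame has finished rendering.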
Limitations
Pro RenderTextures run on the GPU. When combined with fairly cheap, optimized shaders, Pro can create consistent realtime effects at a steady framerate. On the other hand, Free's ReadPixels is a CPU-driven function, meaning that you’ll see a noticeable hiccup when it runs. That hiccup will turn into a stutter-stop slideshow if you try running it in realtime.
Also, ReadPixels can only be called from the main thread: Unity will refuse to run it from a separate one, preventing you from hiding the slowdown via multithreading. And even if you could, multithreading wouldn’t change the fact that ReadPixels is a CPU-driven function and can’t run as quickly as a GPU-driven one.
However… remember when I said that your design document should be conservatively specced? Well, consider using this workaround in a non-realtime situation. Perhaps you might want to have your screen go to grayscale when you pause your game. If you run ReadPixels a single time to capture your screen upon pausing, run a grayscale shader on the resulting Texture2D, and then display that texture over your in-game screen, the act of pausing will hide the ReadPixels hiccup.
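For illustration, here’s one way that pause capture might look. Note that this sketch does the grayscale conversion on the CPU with GetPixels instead of a shader, purely to keep it self-contained:

using System.Collections;
using UnityEngine;

// Illustrative pause capture: grab the frame once, desaturate it, and let
// your pause UI draw the result. The CPU hiccup hides behind the pause.
public class PauseGrayscale : MonoBehaviour
{
    public Texture2D pauseBackground;   // hand this to your pause UI

    public IEnumerator CaptureForPause()
    {
        yield return new WaitForEndOfFrame();

        pauseBackground = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        pauseBackground.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);

        // CPU grayscale pass using standard luminance weights.
        Color[] pixels = pauseBackground.GetPixels();
        for (int i = 0; i < pixels.Length; i++)
        {
            float lum = pixels[i].r * 0.299f + pixels[i].g * 0.587f + pixels[i].b * 0.114f;
            pixels[i] = new Color(lum, lum, lum);
        }
        pauseBackground.SetPixels(pixels);
        pauseBackground.Apply();

        // Now display pauseBackground over the scene and pause the game,
        // e.g. Time.timeScale = 0f.
    }
}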
I firmly believe that RenderTexture-via-ReadPixels can still be effective if you get creative.
LightProbes
If you’re working on a mobile game, you’ll need to wring every last drop of performance out of Unity. That means that realtime pixel lighting will probably have to go bye-bye, leaving your game to the mercy of vertex lighting, all of which is handled by Unity behind the scenes.
That’s where LightProbes come in. They’re kind of like baked lighting for dynamic objects: discrete samples of lighting taken at various points in the scene and then cheaply applied to non-static objects based on proximity. They optimize performance and cut down on lighting errors. Fun!
The not fun part? LightProbes aren't available in Free, and creating your own custom LightProbe system means overriding Unity’s lighting system and replacing it with your own via custom shaders and careful programming. Abandoning a perfectly good in-engine lighting system is a huge pain, but if you’re so inclined, here’s what you need to consider:
Theory
Although you almost never see the code behind Unity’s vanilla lighting system, it’s available for perusal. (Beast’s lighting system, however, is not.) All dynamic object lighting is handled by Unity’s shaders, and most shaders (both vanilla and custom) fall back on default lighting algorithms in Unity’s code. Here’s something you see quite often in a surface shader:
CGPROGRAM
#include "UnityCG.cginc"
#pragma surface surf Lambert
The “Lambert” bit instructs Unity to use default Lambert pixel lighting code, which is actually a function buried inside Lighting.cginc that looks like this:
inline fixed4 LightingLambert (SurfaceOutput s, fixed3 lightDir, fixed atten)
{
    fixed diff = max (0, dot (s.Normal, lightDir));
    fixed4 c;
    c.rgb = s.Albedo * _LightColor0.rgb * (diff * atten * 2);
    c.a = s.Alpha;
    return c;
}
The surface shader handles the inputs for you, naturally.
But maybe you don’t want to rely on surface shaders. For complete control, you’d want to write your own vertex and fragment shaders, in which case vertex lighting would be handled in the vertex shader and pixel lighting in the fragment shader. Either way, you’d use custom functions or one of the many that Unity already provides in UnityCG.cginc.
In fact, the ShadeVertexLights function that ships in UnityCG.cginc is a great place to start. It looks like this:
float3 ShadeVertexLights (float4 vertex, float3 normal)
{
    float3 viewpos = mul (UNITY_MATRIX_MV, vertex).xyz;
    float3 viewN = mul ((float3x3)UNITY_MATRIX_IT_MV, normal);
    float3 lightColor = UNITY_LIGHTMODEL_AMBIENT.xyz;
    for (int i = 0; i < 4; i++) {
        float3 toLight = unity_LightPosition[i].xyz - viewpos.xyz * unity_LightPosition[i].w;
        float lengthSq = dot(toLight, toLight);
        float atten = 1.0 / (1.0 + lengthSq * unity_LightAtten[i].z);
        float diff = max (0, dot (viewN, normalize(toLight)));
        lightColor += unity_LightColor[i].rgb * (diff * atten);
    }
    return lightColor;
}
You run that from the vertex shader, plug in the vertex and its normal, and voila! Lighting information from four lights comes back, calculated by a pretty standard lighting function. The kicker, you’ll notice, is that Unity handles population of the unity_LightPosition and unity_LightAtten (attenuation, i.e. falloff) arrays on its own. So if you’re going to write a custom system, you have to design shaders with new functions that use new light-info arrays, and populate those arrays yourself.
If you’re insane enough to still think that’s something you want or need to do...
Architecture
Write your own "custom.cginc" include file for your custom shaders to rely on:
- Variables that your C# scripts will need to populate:
  - Ambient light color
  - Directional light color
  - Directional light position
  - Directional light normal
  - Point light colors [4 or 8 or whatever]
  - Point light positions [4 or 8 or whatever]
- Functions:
  - PixelShade4Lights, PixelShade8Lights
  - VertexShade4Lights, VertexShade8Lights
  - Maybe Toon lighting, etc.

RegisterLight component (apply to dynamic objects that you want to use the system):
- Link to the object’s Renderer (MeshRenderer or SkinnedMeshRenderer)
- Option to register individual materials or all materials attached to the Renderer
- Sample lighting from the nearest MyLightProbe (below)
- Assign light positions, colors, and normals on a per-material basis (this means you can’t use any of the Shader.SetGlobal functions)

MyLightProbe component (sketched below, along with RegisterLight):
- Compile all RegisterLight scripts and add them to a database
- Compile all Lights (and the ambient light) and add them to a database
- Rank the top lights by attenuation at range to the probe’s origin
- Recompute all values if a light moves (optional)
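To make the RegisterLight/MyLightProbe split concrete, here’s a hypothetical sketch of the script side. Everything in it is invented for illustration: the class names, the probe’s fields, and shader properties like "_MyLightColor0" would all have to match whatever your custom .cginc actually declares.

using UnityEngine;

// Hypothetical stand-in for the probe side. A real version would scan the
// scene's Lights, rank the strongest by attenuation at its own position,
// and recompute when a light moves.
public class MyLightProbe : MonoBehaviour
{
    public Color ambient;
    public Color[] lightColors = new Color[4];
    public Vector4[] lightPositions = new Vector4[4];
}

// Hypothetical RegisterLight component: pushes the nearest probe's values
// into this object's materials every frame.
public class RegisterLight : MonoBehaviour
{
    public MyLightProbe probe;   // nearest probe; could also be found at runtime

    private Renderer rend;       // MeshRenderer or SkinnedMeshRenderer

    void Awake()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        if (probe == null || rend == null)
            return;

        // Per-material assignment: Shader.SetGlobal* would leak one object's
        // lighting onto every other object, so each material gets its own
        // copies. (Accessing .materials instantiates per-object material
        // copies, which is what we want here.)
        foreach (Material mat in rend.materials)
        {
            mat.SetColor("_MyAmbient", probe.ambient);
            for (int i = 0; i < probe.lightColors.Length; i++)
            {
                mat.SetColor("_MyLightColor" + i, probe.lightColors[i]);
                mat.SetVector("_MyLightPosition" + i, probe.lightPositions[i]);
            }
        }
    }
}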
Limitations
Aside from the fact that scrapping a perfectly good lighting system is a tremendous headache? Well, this method is also incapable of reading baked lighting and can’t tell if a custom LightProbe is sitting in a shadow.
Also, you may very well be dead before you can successfully implement all this.
Making Your Life Easier
The fact that you can recreate or approximate a lot of Unity Pro’s features in Unity Free isn’t much of a secret. After all, the Asset Store does booming business selling perfectly good systems that you either don’t know how to create or don’t want to spend the time creating. So if you’re ever stymied by a Pro feature that’s unavailable in Free, look through the Asset Store first to see if anyone has approximated it.