Before You Start

I put here a brief, direct and very rustic introduction to shaders. While it can be useful, there's a ton of better references. Off the top of my head, I really recommend the official ShaderLab documentation from Unity as a major reference. You can also find shader material on Polycount (with an artist's approach).

There's some useful stuff on the Amplify Shader Editor page, Unity's Shader Graph Example Library, the Shader Forge Wiki and the Shader Forge tutorial site.


Like myself, most programmers are confident in their coding skills and prefer the sense of control of their code to node-based editors. As a programmer, one of my strengths is the instrumental use of 3D math; for that reason, even being a workflow explorer, it took a while before I started to use node-based editors.


Most programmers spend too much time debugging and trying to find what's wrong in their code. In a rustic context such as shader coding, these tasks cost a huge amount of time, a cost I was aware of before starting to use node-based shader editors. In fact, the precision of this approach not only sped up my shader creation but increased the possibility of tests and the overall quality/complexity of each shader. Don't worry: you can always apply some hand optimizations to the shader a node-based editor generates.

It doesn't matter whether you are an awesome programmer or a non-technical artist: you can get faster results, in a very appropriate way, using this sort of solution.

What's a shader?

Initially, a shader was a way to code lights and shades; however, this definition has expanded hugely in recent years, motivated by the increase in GPU processing power and in shader code flexibility, which allows much more to happen. So the best definition of a shader now is: a way to communicate with the GPU. Generic, but accurate.

The secret of shader programming is to look at everything as data. It doesn't matter if it's a color, a value, a position, a magnitude, a direction, a distance...

What does a shader look like?

There are different sorts of parts, each one responsible for a task; however, for the sake of simplicity, we can start by thinking only about vertex and fragment shaders. Unity also adds the "surface" shader, which is just a smart wrapper for the other two. You can learn more about them in the Unity manual and on Polycount.
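To make the two stages concrete, here is a minimal sketch of a vertex + fragment shader in Unity's ShaderLab. The shader and property names ("Custom/MinimalUnlit", "_Color") are illustrative, not from any specific project:

```shaderlab
// Minimal unlit shader: one vertex stage, one fragment stage.
Shader "Custom/MinimalUnlit" {
    Properties {
        _Color ("Tint", Color) = (1,1,1,1)
    }
    SubShader {
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;

            struct appdata { float4 vertex : POSITION; };
            struct v2f     { float4 pos    : SV_POSITION; };

            // Vertex shader: runs once per vertex,
            // transforms object space into clip space.
            v2f vert (appdata v) {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                return o;
            }

            // Fragment shader: runs once per pixel,
            // outputs the final color.
            fixed4 frag (v2f i) : SV_Target {
                return _Color;
            }
            ENDCG
        }
    }
}
```

The vertex stage decides where geometry lands on screen; the fragment stage decides what color each covered pixel gets.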


Less precision, less real-time processing, less per-pixel work and more per-vertex work. That's the rule of thumb for good shaders. You can see some tips on this here.
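As a sketch of that rule of thumb, here is the CGPROGRAM body of a forward-base pass that evaluates a simple Lambert diffuse term per vertex instead of per pixel; the rasterizer interpolates the result for free. It assumes `#include "UnityCG.cginc"` and `#include "Lighting.cginc"` and a single directional light:

```shaderlab
struct v2f {
    float4 pos  : SV_POSITION;
    fixed3 diff : COLOR0; // lighting computed per vertex, interpolated across the triangle
};

v2f vert (appdata_base v) {
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);
    half3 worldNormal = UnityObjectToWorldNormal(v.normal);
    // Lambert term evaluated once per vertex instead of once per pixel.
    half nl = max(0, dot(worldNormal, _WorldSpaceLightPos0.xyz));
    o.diff = nl * _LightColor0.rgb;
    return o;
}

fixed4 frag (v2f i) : SV_Target {
    // The fragment shader just reads the interpolated result.
    return fixed4(i.diff, 1);
}
```

A mesh usually has far fewer vertices than covered pixels, so moving work to the vertex stage is a direct win on low-end hardware, at the cost of some precision on large, low-poly surfaces.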

To understand how the GPU works, I really recommend Render Hell 2.0 by Simon Schreibt.

If you just want some guidelines...

Use the profiler, including the GPU profiler (Frame Debugger in Unity), on your target device.

Define the maximum number of batches and polys before art production starts.

On mobile (including mobile VR), avoid overdraw (transparency) as much as you can.

On consoles, transparency is OK to use.

Create fallbacks for your shaders. (You should avoid using fallbacks while still coding.)

Control your shader LOD.

Control your render order and double-check it using the Frame Debugger over and over.

Avoid per-pixel calculations on low-end devices. Try per-vertex alternatives.

Avoid multiple passes in your shaders.

If it can be baked or processed beforehand, don't compute it in real time.
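Two of the guidelines above, shader LOD and fallbacks, live directly in the ShaderLab file. A minimal sketch (the shader name is illustrative; "Mobile/Diffuse" is one of Unity's built-in shaders):

```shaderlab
Shader "Custom/LODFallbackExample" {
    SubShader {
        // Unity only uses this SubShader when its LOD value is at or
        // below the active Shader.maximumLOD / Shader.globalMaximumLOD.
        LOD 200
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"
            float4 vert (float4 v : POSITION) : SV_POSITION {
                return UnityObjectToClipPos(v);
            }
            fixed4 frag () : SV_Target { return fixed4(1,1,1,1); }
            ENDCG
        }
    }
    // If no SubShader can run on the device, Unity tries this shader instead.
    Fallback "Mobile/Diffuse"
}
```

Lowering the maximum LOD at runtime, or failing to match any SubShader on weak hardware, is exactly when these two lines earn their keep.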


We can pass several types of input to shaders. Luckily, Shader Forge exposes several ways to do that in a very organized manner.

Probably the first inputs that come to mind are the ones exposed in Unity's Inspector. While you can have several input types through code, Shader Forge limits your options to the ones below (Properties).

Besides the ones exposed by the Inspector, you can create inputs inside the shader, or take them from the geometry, the scene, or other external components.
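In raw ShaderLab, Inspector-exposed inputs are declared in a Properties block. A sketch of the common property types (the names starting with underscore are illustrative):

```shaderlab
Properties {
    _MainTex ("Texture",        2D)          = "white" {}
    _Color   ("Color",          Color)       = (1,1,1,1)
    _Slider  ("Value (Slider)", Range(0, 1)) = 0.5
    _Value   ("Value (Float)",  Float)       = 1.0
    _Vector  ("Vector 4",       Vector)      = (0,0,0,0)
}
// Inside the CGPROGRAM block, each property is matched by a variable
// of a compatible type, declared with the same name:
//   sampler2D _MainTex;
//   fixed4    _Color;
//   float     _Slider;
//   float     _Value;
//   float4    _Vector;
```

Node-based editors like Shader Forge generate this block for you from the nodes you mark as exposed.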