Hiiiii!!

I'm looking for a shader that's compatible with Unity terrain that would have a flat shader look like so:

The theory is simply to eliminate the "smoothing" effect of the usual shaders by giving each polygon a single normal and flooding the whole face with that one colour, instead of the usual colour smoothly interpolated from each edge/vertex.

I have no idea how that would work, and googling hasn't come up with anything...

Is this impossibly difficult?

•
Nah, this is incredibly easy actually. I do shaders for a living, not Unity though, so I'll look around and see how it's done. Essentially all you need to do is take the dot product of the surface normal with the direction vector to your light source, and multiply that with your colour. Optionally add an ambient term before the multiplication, to prevent pitch-black areas.

Edit: You'll have to make sure though that you have per-triangle normals, and not per-vertex! With per-vertex normals, the normals of the triangles around each vertex are averaged, so for one triangle there will be 3 normals specified, one at each corner. The rasterizer then interpolates between them to create a smooth effect. HOWEVER, with per-triangle normals you get one flat normal across the entire surface, hence a flat colour!

Something like this could do the trick:
```Shader "Example/Diffuse Texture" {
    Properties {
        _MainTex ("Texture", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType" = "Opaque" }
        CGPROGRAM
        #pragma surface surf SimpleLambert

        half4 LightingSimpleLambert (SurfaceOutput s, half3 lightDir, half atten) {
            half NdotL = dot (s.Normal, lightDir);
            half4 c;
            c.rgb = s.Albedo * _LightColor0.rgb * (NdotL * atten);
            c.a = s.Alpha;
            return c;
        }

        struct Input {
            float2 uv_MainTex;
        };

        sampler2D _MainTex;

        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D (_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}```

And in fact, an ordinary diffuse shader will do the trick. Let's just find out how to do those per triangle normals :)

EDIT, AGAIN!!!

I'm not sure how easy it will be to modify your unity terrain to have per face normals. It might in fact be easier to generate your own terrain mesh from a height map if this is the case.
•
EDIT: Never mind the error! I got it working, I'd copied some HTML along with the code, dumb me... but it doesn't do flat shading... It's somewhat flat, but not really:

Yes, I've read in places that generating your own mesh terrain is the way to go, but then we need a way to move vertices at runtime... Because we need that too :/
• Ah, well that was a copy-paste, so not sure what's wrong. Anyway, if you do go the generating own mesh route, it's really simple. Have a look at the docs here: http://docs.unity3d.com/ScriptReference/Mesh.html

You'll have something like this attached to your game object containing the Mesh and the MeshFilter:
```using UnityEngine;

public class ExampleClass : MonoBehaviour {
    public Vector3[] newVertices;
    public Vector2[] newUV;
    public int[] newTriangles;

    void Update() {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        mesh.Clear();
        mesh.vertices = newVertices;
        mesh.uv = newUV;
        mesh.triangles = newTriangles;
    }
}```

The trick now is just to generate the vertices, triangles, and normals. You'll need to modify that code to have normals, and I'm sure you can leave out the UVs, unless you want to texture your terrain in addition to shading it. I'm quickly figuring it out again and then I'll post some code. I think you specify an array of vertices, so something like:

```newVertices[0] = new Vector3(0, 0, 0);
newVertices[1] = new Vector3(1, 0, 0);
newVertices[2] = new Vector3(0, 0, 1);
newVertices[3] = new Vector3(1, 0, 1);```

Those will give you two triangles, which line up to make a flat square. You will need to tell Unity which vertices to use for each triangle though, and that's where mesh.triangles comes in. The triangle array's length is a multiple of 3: three indices per triangle. In this example we have 2 triangles, so the triangles array will be 6 long. Each element of this array points to one of the vertices in the vertices array. You can re-use vertices between triangles, but keep in mind that shared vertices also share normals, which matters for flat shading.

So it will be something like:

```newTriangles[0] = 0;
newTriangles[1] = 1;
newTriangles[2] = 2;
newTriangles[3] = 1;
newTriangles[4] = 3;
newTriangles[5] = 2;```

I think you want to draw it out on paper and make sure you index them consistently. The winding order (i.e. 0, 1, 2 vs 2, 1, 0) determines whether the surface faces to the front or to the back; in Unity, a triangle is front-facing when its indices run clockwise as seen from the visible side. If it faces away, it gets culled and won't be rendered.

I think at this point you can specify normals, either per vertex or per triangle. I'll see if I can figure it out. So I might come back to you and admit that all of these were lies :D
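Putting the above together, here's a hedged sketch (the class and its naming are mine, not from this thread) of a component that builds that two-triangle square so it comes out flat shaded; the vertices are deliberately duplicated per triangle so no normals get averaged:

```using UnityEngine;

// Builds the 1x1 two-triangle square discussed above as a flat-shaded mesh.
// Each triangle owns its three vertices (6 total instead of 4), so
// RecalculateNormals() produces one flat normal per face.
[RequireComponent(typeof(MeshFilter))]
public class FlatQuad : MonoBehaviour {
    void Start() {
        Mesh mesh = new Mesh();

        mesh.vertices = new Vector3[] {
            new Vector3(0, 0, 0), new Vector3(0, 0, 1), new Vector3(1, 0, 0), // triangle 1
            new Vector3(1, 0, 0), new Vector3(0, 0, 1), new Vector3(1, 0, 1)  // triangle 2
        };

        // Indices wound clockwise as seen from above, so both faces point up (+y)
        mesh.triangles = new int[] { 0, 1, 2, 3, 4, 5 };

        mesh.RecalculateNormals(); // nothing is shared, so every face stays flat
        GetComponent<MeshFilter>().mesh = mesh;
    }
}```

You'd also need a MeshRenderer with a material on the same object to actually see it.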
• PS. I had a project where I accidentally had flat shaded terrain, where I really wanted smooth. I don't have that project anymore, but I think I can reproduce it :D
• The thing is I'm not coding this myself, and we've already gone down the route of using terrains, if we can get away with still using Unity terrain that would be the best, because otherwise we'd have to write a whole custom terrain thing ourselves... On something that's already stupidly overscoped :/

Thanks so much for your time!
• I got you this GLSL shader:
```Shader "GLSL flat shader" {
    SubShader {
        Pass {
            GLSLPROGRAM

            flat varying vec4 color;

            #ifdef VERTEX
            void main()
            {
                gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
                color = gl_ModelViewProjectionMatrix * vec4(gl_Normal, 1.0);
            }
            #endif

            #ifdef FRAGMENT
            void main()
            {
                gl_FragColor = color; // set the output fragment color
            }
            #endif

            ENDGLSL
        }
    }
}```
•
There might be some limitations to using GLSL (the OpenGL shading language) in Unity, but if you aren't going to set per-face normals, you need to use the flat qualifier in your shader:
`flat varying vec4 color;`

This shader does not regard colour or anything. We can expand it to do this though!
• So Ahmed said basically that whether a model is smooth shaded or flat shaded is dependent on the model, and nothing you do in the shader would be able to fix that :/

The GLSL shader... I have no idea what it does; it just gives me the Unity error pink, but I'm sure you knew that already :P
Haha, it worked for me. It might be platform-dependent. Are you running Windows? I'm on Linux. I think Unity uses only DirectX on Windows, which will give you the error pink. Until Unity CG scripts support the flat modifier, it will stay model-dependent!
• Yeah I saw that Nick, apparently we can't use paid stuff because this should be completely open-source. (Oh this is ggj)

@Denzil I'm on OSX, so... It should be platform independent? But I guess I really don't know what I'm talking about :/
• @Tuism Ah well there goes that theory :P Anyway, I don't see a simple solution to this! Sorry!
@tuism yeah, Unity does not support the flat shading flag. We had a similar problem and went the route of creating a custom mesh.

You mention you need to move verts at runtime, if you generate your own mesh you should be able to do that, though?
•
Yeah, generating our own mesh would allow that, but it also wouldn't have all the terrain features we wanted, which Unity's terrain already provides.

Anyway, on this, @elyaradine was a mythical unicorn machine and wrote a shader for us that allowed the exact look we needed. Omg he's a genius.

• Awesome!

@elyaradine More or less, what was the trick? :)
•
You can calculate your normals based on adjacent pixels using ddx() and ddy(). These wouldn't get smoothed (they're per pixel, not per vertex).

```struct v2f {
    ...
    float3 worldPos : TEXCOORD1;
    ...
};```

```v2f vert(appdata_full v)
{
    v2f o;
    // _Object2World is the pre-Unity-5.4 name for unity_ObjectToWorld
    o.worldPos = mul(_Object2World, v.vertex);
    ...
}```

```fixed4 frag(v2f i) : SV_Target
{
    // screen-space derivatives of the interpolated world position
    float3 x = ddx(i.worldPos);
    float3 y = ddy(i.worldPos);

    float3 norm = -normalize(cross(x, y));
    ...
}```

I just stuck the negative in there because it looked upside down. :P (It was a bit inconsistent for me on DirectX where in the Game view while the game was not running it would show one direction, but in my Scene view and in a running Game view it would show another direction. I guess it doesn't matter as long as the Scene and running Game views sync up, but just so you know.)

From there you've got your per pixel normal that's the same as if you had hard edges, and you can take it from there with lighting/fresnel/whatever.
• Oh by the way, is there anything about this shader that could bug out between windows and osx machines? We put this project on a windows computer and it looked fine in editor view, but on game view it seemed that the shader wasn't working at all, reverting back to the directional light's colour. Didn't test it on a second windows machine yet.
• Yeah, did you try it while the game was running? :/
• ... Omg how did I not try that. I'm sure @UberGeoff did (It was on his machine that there was that problem). Lemme ping him about it.
• @Elyaradine Clever little trick, didn't even think about that! Thanks for the share!
•
Tuism said:
... Omg how did I not try that. I'm sure @UberGeoff did (It was on his machine that there was that problem). Lemme ping him about it.
Yeah, maybe test it on a default sphere too, just to see if it's working but just upside down (in which case, remove the negative I guess? But then also see what platform he's using, whether it's behaving differently because it's OpenGL or DX9/11/12 and maybe add some checks for whether the negative should be there or not).
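On the negative/flipping question: one plausible culprit (an assumption on my part, not something I've verified on every platform) is that ddy changes sign between OpenGL-style and Direct3D-style screen coordinates, since their vertical origins differ. Unity exposes the UNITY_UV_STARTS_AT_TOP macro for exactly this kind of check, so a sketch might be:

```float3 norm = normalize(cross(ddy(i.worldPos), ddx(i.worldPos)));
#if UNITY_UV_STARTS_AT_TOP
    norm = -norm; // D3D-style platforms rasterize top-down, so ddy flips sign
#endif```

Treat the exact sign convention as something to verify per platform rather than gospel.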
Denzil said:
@Elyaradine Clever little trick, didn't even think about that! Thanks for the share!
It was just learning through Googling stuff. Which, to be honest, is probably more than 80% of what I know of anything... XD
• Quick question (Unity shaders still confuse me): But is there a reason you're not just calculating the normal for a triangle in a vertex function and then passing that normal to the fragments?

I'm also not 100% sure what passing the world position vector (I assume it's a vector?) into the derivative functions achieves, don't those just take floats between 0 and 1?
•
Yeah, you ordinarily grab the normal in the vertex shader and pass it to the fragment shader. But when you do that with meshes in general you get an interpolated/smoothed normal across your triangle, which is not the look they wanted in this game. (You ordinarily work around this by changing your input mesh to have hard edges (duplicated/split normals) for the same look.) But since Unity's terrain doesn't do hard edges out-of-the-box, calculating the normal in the fragment shader is a workaround so that all pixels on the same triangle have the same normal.
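That split-normals workaround can also be done in script, which might help for generated meshes. A sketch with my own naming, not from this thread (and note it multiplies your vertex count):

```using UnityEngine;

// Gives every triangle its own three vertices so RecalculateNormals()
// can no longer average normals across neighbouring faces.
public static class MeshFlattener {
    public static void MakeFlatShaded(Mesh mesh) {
        Vector3[] oldVerts = mesh.vertices;
        int[] oldTris = mesh.triangles;

        Vector3[] newVerts = new Vector3[oldTris.Length];
        int[] newTris = new int[oldTris.Length];
        for (int i = 0; i < oldTris.Length; i++) {
            newVerts[i] = oldVerts[oldTris[i]]; // copy instead of share
            newTris[i] = i;
        }

        mesh.Clear(); // drop stale normals/UVs sized to the old vertex count
        mesh.vertices = newVerts;
        mesh.triangles = newTris;
        mesh.RecalculateNormals(); // one flat normal per triangle
    }
}```

A fuller version would copy UVs and colours the same way, and keep in mind that older Unity versions cap a mesh at 65535 vertices.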

To be honest, I'm not really sure how the derivative functions work myself. I mean, I've read a bit about them, and I can make a guess, but I'm hardly an authority. My understanding is that they just take "some value" that you're checking based on adjacent screen-space pixels. That value could be a position, or normal, or anything, and I don't think it even matters what space it's in (as long as you know what you're doing with the result, of course); it can be a scalar or a vector (or even a matrix?), and the functions just return the difference between the left/right and top/bottom pixels' values for ddx and ddy respectively. It feels like it could be a way to do certain things you'd normally do in a postprocess/image shader, but on actual geometry.
• Yeah, you ordinarily grab the normal in the vertex shader and pass it to the fragment shader. But when you do that with meshes in general you get an interpolated/smoothed normal across your triangle, which is not the look they wanted in this game. (You ordinarily work around this by changing your input mesh to have hard edges (duplicated/split normals) for the same look.) But since Unity's terrain doesn't do hard edges out-of-the-box, calculating the normal in the fragment shader is a workaround so that all pixels on the same triangle have the same normal.
Hmm. Actually I meant that you'd get the other two vertices that make up the current triangle being rendered, calculate a perpendicular normal to that triangle's plane and pass that through for all of them so it wouldn't interpolate across 3 different normals. I assume that this is what flat shading modes do automatically (the ones that don't need vertex duplication). Is there a reason this couldn't be done in a vertex shader? I assume there's probably no way to get the other vertices on the same triangle?
To be honest, I'm not really sure how the derivative functions work myself. I mean, I've read a bit about them, and I can make a guess, but I'm hardly an authority. My understanding is that they just take "some value" that you're checking based on adjacent screen-space pixels. That value could be a position, or normal, or anything, and I don't think it even matters what space it's in (as long as you know what you're doing with the result, of course); it can be a scalar or a vector (or even a matrix?), and the functions just return the difference between the left/right and top/bottom pixels' values for ddx and ddy respectively. It feels like it could be a way to do certain things you'd normally do in a postprocess/image shader, but on actual geometry.
Ah, I see. So the differential functions are returning a vector because you're passing that vector through in the fragment data, then you're asking for the difference across the interpolated (?) vector that used to be a world position? That gives you pretty small vectors along the world-space plane of the current triangle and you get a normal from there. Smart. From my own googling I see people keep asking questions about how the differential functions handle edge-of-triangle cases, answer: Nobody seems to know reliably, shrug.

@Tuism: Have you seen any weird seams with this? What does it look like when it goes wrong?
• Haven't seen any weird seams with this at all, no. When it goes wrong (was running the project in Unity editor on a Windows machine, no other variable has changed as far as I can tell) it looks right in the editor view, but in game view it just looks like a wash of the directional light colour (in this case it was a bit yellow) with zero shading. Just a plain colour, but shadows are still cast/shown.
•
@dislekcia
Actually I meant that you'd get the other two vertices that make up the current triangle being rendered, calculate a perpendicular normal to that triangle's plane and pass that through for all of them so it wouldn't interpolate across 3 different normals.
This is not possible in a vertex shader. Perhaps in a geometry shader, but I don't think so; I don't have much experience with those. In a vertex shader, each vertex is processed 100% independently, without any knowledge of any other vertices. It's just how the pipeline works. And the reason the pipeline works this way is so that all your GPU cores can process the thousands, and possibly millions, of vertices concurrently.

The ddx and ddy functions buffer the values for neighbouring pixels at the stage of the pipeline where they're called, and work out pure differences, as @Elyaradine mentioned. Across one triangle, the derivative of position (in any space) is constant over the x and y fragment positions, which is why the "real" normal can be accurately reconstructed from it. And yes, you could get artifacts at the edges of triangles, because the derivative changes there. The question is just whether the change is subtle or not! I suspect the seams might just end up being a mix between the colours of both connecting triangles, effectively hiding it (kinda like anti-aliasing).
•
@dislekcia: Yeah, you can't get the other verts in the triangle. Well, not easily. (I also don't have geometry shader experience. There are some super-horrible workarounds that I've tried before that kind of work, but that were really painful to use, like embedding the relationship between them in the UVs or vertex colours or something, and in pretty much every case I'd end up with a skyrocketing vert count.)

OpenGL apparently does have the "flat" flag you can set to make triangles rasterize with the same normal. Or, in the old fixed-function pipeline, you can use:
```glShadeModel(GL_FLAT);```

I don't know if that would work in the Unity implementation of those shaders though. (I haven't tried, and I generally avoid doing things that don't work for both OpenGL and DirectX.)

Apparently DirectX has an equivalent (interpolation modifiers), but it only exists from shader model 4 up (DX10/11 in Unity, afaik).
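For what it's worth, the HLSL modifier in question is nointerpolation; on SM4+ targets you can mark a v2f field so it isn't interpolated across the triangle. A hedged sketch (I haven't tried this on Unity terrain):

```struct v2f {
    float4 pos : SV_POSITION;
    // nointerpolation: the whole triangle takes this value from one
    // ("provoking") vertex, with no blending across the face
    nointerpolation float3 normal : TEXCOORD0;
};```

You'd still need a per-face-correct normal on the provoking vertex, which for terrain is the same duplication problem as before.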
• Looks like this on Windows:

DX11 running on the latest NVIDIA drivers

So I would like to upload this Unity project to GitHub, for community consumption. However, the project is quite large. What is the best way to source-control a Unity project's files?

Should I drop the "metadata" folder (Geos\Library\metadata)? Is that folder needed? It is over 100mb alone.

Thx
• @UberGeoff: Does it look that way when the game's busy running as well? And on a default Unity sphere?

Because I'm also running DX11, and it looks messed up in my Game view until the game's running. <_<

--
You only really need to upload the Assets folder. The Library folder is kind of more for caching, afaik, and is recreated when you import the project for the first time.
• Yeah - all messed up in game view even when the game is running. So sad. Not sure what to do. :(

• Got this error now:

Shader error in 'Unlit/HardEdge': cannot map expression to pixel shader instruction set at line 65 (on d3d11_9x)

Compiling Fragment program with DIRECTIONAL SHADOWS_OFF LIGHTMAP_OFF DIRLIGHTMAP_OFF DYNAMICLIGHTMAP_OFF
Platform defines: UNITY_NO_LINEAR_COLORSPACE UNITY_ENABLE_REFLECTION_BUFFERS UNITY_PBS_USE_BRDF3
• And:

Shader error in 'Unlit/HardEdge': cannot map expression to pixel shader instruction set at line 65 (on d3d9)

Compiling Fragment program with DIRECTIONAL SHADOWS_OFF LIGHTMAP_OFF DIRLIGHTMAP_OFF DYNAMICLIGHTMAP_OFF
Platform defines: UNITY_ENABLE_REFLECTION_BUFFERS UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BOX_PROJECTION UNITY_SPECCUBE_BLENDING
• Ok. Figured it out.

Switched "Rendering" in project settings from "Forward" to "Deferred".

Looks like it is all happiness now.
• Yup, skip the entire Library folder. But be sure to add all the .meta files, as that is where Unity stores import info for each Asset.
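.gitignore-wise, something minimal like this should cover it (assuming the default Unity folder layout):

```# Unity regenerates these on import; don't commit them
Library/
Temp/
obj/

# Do commit Assets/ and ProjectSettings/, including all the .meta files```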
• UberGeoff said:
Ok. Figured it out.

Switched "Rendering" in project settings from "Forward" to "Deferred".

Looks like it is all happiness now.
Hey guys, I'm trying to do the exact same thing you were trying to do here, but I just seem to have gone wrong somewhere. Any chance you can help me out, please?
• @Xyfer: What exactly is "going wrong"?
• @Xyfer: What exactly is "going wrong"?
Well, the thing is, since I'm quite new to shaders I'm not exactly sure how or where to apply the code you've kindly provided above in my own file...
•
I've just gone ahead and added it to the default unlit shader so you can see it as an example. (This really basic shader just blends between two colours depending on the direction of the face's normal.)

```Shader "Unlit/Faceted"
{
    Properties
    {
        _BaseCol("Base colour", Color) = (1,1,1,1)
        _TopCol("Top colour", Color) = (1,1,1,1)
    }
    SubShader
    {
        Tags { "RenderType"="Opaque" }
        LOD 100

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            // make fog work
            #pragma multi_compile_fog

            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                UNITY_FOG_COORDS(1)
                float4 vertex : SV_POSITION;
                float3 worldPos : TEXCOORD2;
            };

            sampler2D _MainTex;
            float4 _MainTex_ST;
            fixed4 _TopCol, _BaseCol;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                o.worldPos = mul(unity_ObjectToWorld, v.vertex);
                UNITY_TRANSFER_FOG(o,o.vertex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                float3 x = ddx(i.worldPos);
                float3 y = ddy(i.worldPos);

                float3 norm = -normalize(cross(x,y));

                // Assume basic light shining from above
                float l = saturate(dot(norm, float3(0,1,0)));
                fixed4 col = lerp(_BaseCol, _TopCol, l);

                // apply fog
                UNITY_APPLY_FOG(i.fogCoord, col);
                return col;
            }
            ENDCG
        }
    }
}```
Thanks! But I seem to be plagued with bad luck :( I've applied it and I'm getting warnings that say "Shader is not supported on this GPU (none of subshaders/fallbacks are suitable)" and "Shader warning in 'Unlit/Faceted': Both vertex and fragment programs must be present in a shader snippet. Excluding it from compilation. at line 15". For the first one, from looking around online it seems it may be because I'm on a Mac and it doesn't like that? No clue on the second one... and it's also showing pink in the preview and the build.
Oh, ffs. The code I pasted might have broken because the forum software turned the "pragma" portions into HTML links. Maybe it's that, because as far as I know this should cross-compile to OpenGL/Mac just fine.