[Shader Help] Material alpha fade based on distance from camera

in Questions and Answers
Dearest Forum-aspected Shader Wizzards,

I'm looking to have a forcefield material that fades from opaque to transparent based on the pixel's distance from the camera. In other words, I'd like to read the ol' depth buffer for my frame, and set the material's alpha to an inverse depth value. Google pokings suggest that I render a depth texture from the camera and sample that for the effect. Problem is, Unity Free means I'm not able to use deferred rendering or render-to-texture. Grah. With that option gone, I can't for the life of me figure out how to get access to that juicy, juicy depth buffer.

I'm also aware that I'd need to do a double depth/draw pass in order to have a surface with a transparent material show up in the depth buffer. Important, but once known, irrelevant, since I still can't get access to that depth data.

If anyone could point me in the right direction, or has any alternative suggestions to get the same (or similar) effect using the forward renderer, I'd be grateful.

(An alternative hack might be to have a material-specific point light stuck to the camera that the shader samples to affect alpha instead of colour. Feasible?)

Comments

  • Why not just use a script that changes the material's alpha based on the distance to the camera, instead of doing it inside a shader?
  • Good question. The answer: scale and aesthetics. This material is going to be mapped onto several huge planes that enclose a large terrain to keep the player from wandering off the map. I'd prefer that these planes only be visible in the area immediately around the player/camera, rather than having an enormous textured wall fading in and filling large portions of the screen.
  • STOP WITH THE OVER-THINKING GAZZAS!!!!! :)

You don't need the depth buffer at all. In the vertex shader, just calculate the length of the vector from the camera position to the vertex and scale that between some arbitrary min/max distance values (i.e. map 1 meter to 0 alpha, and 200 meters to 1 alpha, or whatever). Calculate that alpha per vertex, put it in your output to be interpolated, and then use it in the fragment shader to modulate your final alpha value.

So you'll likely want to use Unity's _WorldSpaceCameraPos value, then transform your vertex position into world space before you subtract and take the magnitude. You should be able to use _Object2World for that matrix multiplication, and then Bob's your uncle.
    Thanked by 1Gazza_N
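In plain Python, the distance-to-alpha mapping described above is just a clamped linear ramp (the 1 m / 200 m thresholds below are only the example numbers from the comment):

```python
def distance_alpha(view_dist, min_vis, max_vis):
    # Linear falloff between the two thresholds, clamped to [0, 1]
    # (this is what saturate() does in the shader).
    falloff = max(0.0, min(1.0, (view_dist - min_vis) / (max_vis - min_vis)))
    # Invert so the wall is opaque up close and fades out with distance.
    return 1.0 - falloff

distance_alpha(1, 1, 200)      # -> 1.0 (fully opaque at the near threshold)
distance_alpha(100.5, 1, 200)  # -> 0.5 (halfway between the thresholds)
distance_alpha(500, 1, 200)    # -> 0.0 (fully faded beyond the far threshold)
```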
  • I'm rather wobbly on the vertex shader side of things, so this simply didn't occur to me. I'll give it a shot! Cheers, AngryMoose! \:D/
  • @AngryMoose: Quick one - I'm looking at biiiiiiiiiiiiiiiiiiiiiiiiiiig planes for this one. Your method would likely work fine for smaller objects, but when the vertices are 20000 world units apart, won't it break down a tad?
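A quick numeric sketch of the worry: with the camera parked halfway along one enormous quad edge, interpolating per-vertex distances gives a completely wrong answer (toy numbers, plain Python):

```python
import math

# A single huge quad edge: two vertices 20000 units apart,
# camera sitting right at the midpoint of the edge.
camera = (0.0, 0.0, 0.0)
v0, v1 = (-10000.0, 0.0, 0.0), (10000.0, 0.0, 0.0)

# Per-vertex distances (what the vertex shader computes)...
d0, d1 = math.dist(camera, v0), math.dist(camera, v1)  # both 10000

# ...linearly interpolated at the midpoint (what the fragment shader sees):
interpolated_mid = (d0 + d1) / 2  # 10000 -> wall fully faded out

# True camera distance at the midpoint:
true_mid = math.dist(camera, (0.0, 0.0, 0.0))  # 0 -> should be fully opaque
```

So yes: with only four corner vertices, the whole wall fades out even when the camera is touching it.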
  • Here's the guts of a shader that should pretty much do what you're looking for. It may or may not compile; I just pieced it together quickly from something I did in the past.

    struct v2f
    {
        float4 pos      : POSITION;   // clip-space position wants full float precision
        fixed4 color    : COLOR0;
        half2  uv       : TEXCOORD0;
    };

    sampler2D   _MainTex;
    half        _MinVisDistance;
    half        _MaxVisDistance;

    v2f vert (appdata_full v)
    {
        v2f o;
        o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
        o.uv = v.texcoord.xy;
        o.color = v.color;

        // Distance falloff: 0 at _MinVisDistance, 1 at _MaxVisDistance
        half3 viewDirW = _WorldSpaceCameraPos - mul(_Object2World, v.vertex).xyz;
        half viewDist = length(viewDirW);
        half falloff = saturate((viewDist - _MinVisDistance) / (_MaxVisDistance - _MinVisDistance));
        o.color.a *= (1.0 - falloff);
        return o;
    }

    fixed4 frag (v2f i) : COLOR
    {
        fixed4 color = tex2D(_MainTex, i.uv) * i.color;
        return color;
    }
    Thanked by 2Gazza_N Elyaradine
  • Why so big? 20k world units is just... WHY SO BIG!!!>!!>>!??111@1 1 one

If you're modelling 1 unit = 1 m, that's 20 km... that's... how can you even render that far? Even if 1 u = 10 m, that's still 2 km and daaaaamn big.

Can't you use multiple objects? Or at least a Unity plane, which is gridded with lots of verts, instead of just a quad?
    Thanked by 1Gazza_N
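For a rough sense of how quickly subdivision tames the interpolation error, here's a back-of-the-envelope check in Python (edge size, segment counts, and camera height are all hypothetical):

```python
import math

def worst_midpoint_error(half_width, n_segments, cam_height):
    # Split the edge [-half_width, +half_width] into n_segments pieces and
    # measure, at each segment midpoint, how far the linearly interpolated
    # camera distance drifts from the true distance. The camera hovers
    # cam_height units above the centre of the edge.
    dist = lambda x: math.hypot(x, cam_height)
    seg = 2.0 * half_width / n_segments
    worst = 0.0
    for i in range(n_segments):
        x0 = -half_width + i * seg
        x1 = x0 + seg
        interpolated = (dist(x0) + dist(x1)) / 2.0   # per-vertex, interpolated
        true = dist((x0 + x1) / 2.0)                 # actual distance
        worst = max(worst, interpolated - true)
    return worst

worst_midpoint_error(10000, 1, 10)    # one giant quad: off by ~9990 units
worst_midpoint_error(10000, 100, 10)  # 100 segments: off by under 5 units
```

Even a modest grid of verts gets the fade visually indistinguishable from the per-pixel answer.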
  • Whoooo! Code! \o/

    As for the enormo-huge planes? Heh. Functionally, they're your Massive Invisible Walls To Keep Players And Projectiles On The Terrain (and distances aren't quite 20000 units, but will deffo exceed the clipping plane). I want localised visibility, so that the player can see the wall's there with some sort of glowy forcefield texture when they're up close, but not so that the texture is stretching aaaaaaall the way across the plane. Think a point light attached to the camera that "illuminates" the alpha channel as an approximation of the effect I'd prefer. Super precision's not essential.

    I'll try it out with a subdivided plane mesh as you suggest though. That'll probably do the trick just fine. :)
  • @AngryMoose: It's working like an absolute charm. Thanks much for your help. :D
    Thanked by 1AngryMoose