[Shader Help] Material alpha fade based on distance from camera
Dearest Forum-aspected Shader Wizards,
I'm looking to have a forcefield material that fades from opaque to transparent based on the pixel's distance from the camera. In other words, I'd like to read the ol' depth buffer for my frame, and set the material's alpha to an inverse depth value. Google pokings suggest that I render a depth texture from the camera and sample that for the effect. Problem is, Unity free means I'm not able to use deferred rendering or render-to-texture. Grah. With that option gone, I can't for the life of me figure out how to get access to that juicy, juicy depth buffer.
I'm also aware that I'd need to do a double depth/draw pass in order to have a surface with a transparent material show up in the depth buffer. Important, but once known, irrelevant, since I still can't get access to that depth data.
If anyone could point me in the right direction, or has any alternative suggestions to get the same (or similar) effect using the forward renderer, I'd be grateful.
(An alternative hack might be to have a material-specific point light stuck to the camera that the shader samples to affect alpha instead of colour. Feasible?)
Comments
You don't need the depth buffer at all. In the vertex shader, just calculate the length of the vector from the camera position to the vertex and scale that between some arbitrary min/max distance values (i.e. map 1 meter to 0 alpha and 200 meters to 1 alpha, or whatever). Calculate that alpha per vertex, write it into your output so it gets interpolated, and use it in the fragment shader to modulate your final alpha value.
So you'll likely want to use Unity's _WorldSpaceCameraPos value, and transform your vertex position into world space before you subtract and take the magnitude. You should be able to use _Object2World to do that matrix multiplication, and Bob's your uncle.
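Off the top of my head, something like the below should do it. Treat it as a rough sketch rather than a drop-in shader: the shader name and the _FadeStart / _FadeEnd properties are placeholders I've made up, and it uses the legacy _Object2World and _WorldSpaceCameraPos built-ins mentioned above:

```
Shader "Custom/DistanceFade" {
    Properties {
        _Color ("Tint", Color) = (1,1,1,1)
        _FadeStart ("Fade Start (fully opaque at)", Float) = 1.0
        _FadeEnd ("Fade End (fully transparent at)", Float) = 200.0
    }
    SubShader {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        Blend SrcAlpha OneMinusSrcAlpha
        ZWrite Off

        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            fixed4 _Color;
            float _FadeStart;
            float _FadeEnd;

            struct v2f {
                float4 pos   : SV_POSITION;
                float  alpha : TEXCOORD0;
            };

            v2f vert (appdata_base v) {
                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);

                // Vertex position in world space, then distance to the camera.
                float3 worldPos = mul(_Object2World, v.vertex).xyz;
                float dist = length(worldPos - _WorldSpaceCameraPos.xyz);

                // Map [_FadeStart, _FadeEnd] onto alpha: opaque up close, transparent far away.
                o.alpha = 1.0 - saturate((dist - _FadeStart) / (_FadeEnd - _FadeStart));
                return o;
            }

            fixed4 frag (v2f i) : COLOR {
                return fixed4(_Color.rgb, _Color.a * i.alpha);
            }
            ENDCG
        }
    }
}
```

The mapping here goes opaque up close and transparent far away, which sounds like what you're after; drop the "1.0 -" if you want it the other way round.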
If you're modelling 1 unit = 1 m, that's 20 km... that's... how can you render so far? Even if 1 u = 10 m, that's still 2 km and daaaaamn big.
Can't you use multiple objects? Or at least a Unity plane, which is gridded with lots of verts, instead of just a quad?
As for the enormo-huge planes? Heh. Functionally, they're your Massive Invisible Walls To Keep Players And Projectiles On The Terrain (and distances aren't quite 20000 units, but will deffo exceed the clipping plane). I want localised visibility, so that the player can see the wall's there with some sort of glowy forcefield texture when they're up close, but not so that the texture is stretching aaaaaaall the way across the plane. Think a point light attached to the camera that "illuminates" the alpha channel as an approximation of the effect I'd prefer. Super precision's not essential.
I'll try it out with a subdivided plane mesh as you suggest though. That'll probably do the trick just fine. :)
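One more thought in case the subdivided plane still looks blotchy: since the alpha gets interpolated linearly between vertices, a huge, barely-tessellated wall won't fade nicely near the camera. You can sidestep that by doing the distance calculation per pixel instead: pass the world position through to the fragment shader and compute the fade there. Rough, untested sketch (same SubShader/Pass, Blend and property setup as the shader above; names are still placeholders):

```
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"

fixed4 _Color;
float _FadeStart;
float _FadeEnd;

struct v2f {
    float4 pos      : SV_POSITION;
    float3 worldPos : TEXCOORD0;
};

v2f vert (appdata_base v) {
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    // Pass the world-space position through; it gets interpolated per pixel.
    o.worldPos = mul(_Object2World, v.vertex).xyz;
    return o;
}

fixed4 frag (v2f i) : COLOR {
    // Per-pixel distance to the camera, so even a coarse quad fades correctly up close.
    float dist = length(i.worldPos - _WorldSpaceCameraPos.xyz);
    float alpha = 1.0 - saturate((dist - _FadeStart) / (_FadeEnd - _FadeStart));
    return fixed4(_Color.rgb, _Color.a * alpha);
}
ENDCG
```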