Using a displacement map on-the-fly (shader or function)?
Anyone know offhand any shader that takes a displacement map? Seems like it wouldn't be hard for a shader to derive a normal map from that, or whatever other map(s) it needs.
Short of that, anyone have a handy function for converting height maps to normal maps? I know this is done by the texture importer, but I need it at runtime. Thanks.
All the parallax shaders take a displacement map, but that's not really your question. You really need the second thing.
Come on DaveA, pony up, answer your own question for the greater good now... :)
Are you really going to make me write it even though you already know the solution? (For each pixel, take the deltas to the next pixel across and the one above it, then construct a vector orthogonal to both. If there is no next pixel, i.e. at the edge, take the pixel's delta with itself, which is zero.)
Don't be lazy, you do it! :p
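Spelled out, that recipe might look something like this on the CPU (just a sketch: the function name, strength parameter, and plain-RGB packing are my own, the height map must be a readable Texture2D, and per-pixel GetPixel/SetPixel is slow on large textures):

function HeightToNormalMap (height : Texture2D, strength : float) : Texture2D {
	var result = new Texture2D(height.width, height.height, TextureFormat.ARGB32, false);
	for (var y = 0; y < height.height; y++) {
		for (var x = 0; x < height.width; x++) {
			//deltas to the next pixel across and the one above; at the
			//border the "next" pixel is the pixel itself, so the delta is zero
			var xNext = Mathf.Min(x + 1, height.width - 1);
			var yNext = Mathf.Min(y + 1, height.height - 1);
			var dx = (height.GetPixel(xNext, y).grayscale - height.GetPixel(x, y).grayscale) * strength;
			var dy = (height.GetPixel(x, yNext).grayscale - height.GetPixel(x, y).grayscale) * strength;
			//the cross product of the tangents (1,0,dx) and (0,1,dy) is
			//(-dx,-dy,1): a vector orthogonal to both
			var n = Vector3(-dx, -dy, 1).normalized;
			//pack from [-1,1] into the [0,1] color range
			result.SetPixel(x, y, Color(n.x * 0.5 + 0.5, n.y * 0.5 + 0.5, n.z * 0.5 + 0.5, 1));
		}
	}
	result.Apply();
	return result;
}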
Ok, I've thought about it, and I still think it's a bad idea to do in a shader, but I have an idea. Firstly, let me give a quick explanation of vertex and fragment shaders. The vertex shader is called per vertex on the object. Generally, it positions the vertex based on its position and rotation relative to the camera's position, rotation, and view frustum.
//that's this line:
output.vertex = mul(UNITY_MATRIX_MVP, input.vertex);
You also do per-vertex lighting, and anything else that you can do with vertex information such as vertex colors or extrusion along normals.
The fragment/pixel shader runs per fragment; slightly simplified, that's approximately per visible pixel. Texturing and per-pixel lighting are done in the fragment shader. Normal mapping is also done here, as well as texture combining such as splat mapping. Values sent from the vertex shader to the pixel shader are interpolated based on their closeness to each vertex.
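To make that concrete, here is a bare-bones Cg vertex/fragment pair (a generic illustration, not code from this thread): whatever the vertex shader writes into the struct, like the UV below, reaches the fragment shader interpolated across the triangle.

struct appdata {
	float4 vertex : POSITION;
	float2 uv : TEXCOORD0;
};

struct v2f {
	float4 vertex : POSITION;
	float2 uv : TEXCOORD0;
};

sampler2D _MainTex;

//runs once per vertex
v2f vert (appdata input) {
	v2f output;
	output.vertex = mul(UNITY_MATRIX_MVP, input.vertex);
	output.uv = input.uv; //interpolated on its way to the fragment shader
	return output;
}

//runs (roughly) once per visible pixel; input.uv arrives interpolated
float4 frag (v2f input) : COLOR {
	return tex2D(_MainTex, input.uv);
}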
It sounds like it will be too expensive if you run it that many times. Now, let me get to my idea; I think Joshua and Warwick have been hinting at this. Have you tried a render texture? You could copy the source texture to a render texture through Graphics.Blit() with a material that converts your height map to a normal map. This would let you move the conversion out of your object's shader and control how frequently it's called. But this is only relevant once you write the shader:
var heightMap : Texture2D;
var normalMap : RenderTexture;
var converterMat : Material;
//This material has a shader that converts the height map to a normal map.

function Update () {
	if (Application.frameCount % 15 == 0) {
		//only update the normal map every 15 frames
		Graphics.Blit(heightMap, normalMap, converterMat);
	}
}
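For reference, a sketch of what the shader on converterMat could look like (the shader itself isn't in the thread, so the name, _Strength parameter, and plain-RGB packing are my own choices): a fragment program doing the same finite-difference math as the CPU version, stepping one texel via Unity's _MainTex_TexelSize, with the source texture fed through _MainTex by Graphics.Blit.

Shader "Hidden/HeightToNormal" {
	Properties {
		_MainTex ("Height Map", 2D) = "white" {}
		_Strength ("Strength", Float) = 1.0
	}
	SubShader {
		Pass {
			CGPROGRAM
			#pragma vertex vert_img
			#pragma fragment frag
			#include "UnityCG.cginc"

			sampler2D _MainTex;
			float4 _MainTex_TexelSize; //filled in by Unity: (1/width, 1/height, width, height)
			float _Strength;

			float4 frag (v2f_img i) : COLOR {
				//height deltas to the next texel across and the one above
				float h  = tex2D(_MainTex, i.uv).r;
				float dx = tex2D(_MainTex, i.uv + float2(_MainTex_TexelSize.x, 0)).r - h;
				float dy = tex2D(_MainTex, i.uv + float2(0, _MainTex_TexelSize.y)).r - h;
				//same construction as the CPU version, packed into [0,1]
				float3 n = normalize(float3(-dx * _Strength, -dy * _Strength, 1));
				return float4(n * 0.5 + 0.5, 1);
			}
			ENDCG
		}
	}
}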
Yes, what Peter G says. (I call such things filter shaders, no idea where I got that name from. It's how you do "SetPixel" stuff but in realtime, and it's how the Image Effects like blur, SSAO, FXAA3, etc. are implemented. Technically it's the same as any other rendering; it just happens to render a quad over the whole output, and the shader of the material of that quad just happens to do something interesting with its input texture and parameters.)