Can an image effect work on vertices displaced by other shaders?
I am trying to use a vertex displacement shader which moves the vertices of an object in the x-z plane. I also have an edge-detection effect on the camera (Roberts Cross on the depth-normals texture). The problem is that the edge detection seems to ignore the vertex displacement and draws edges where the objects' edges were originally. I have tried changing the Queue tag of the shader, but that didn't help. It's not a problem with my particular displacement shader - I've also tried several other vertex shaders (which I didn't write myself), and the same thing happens with the edge detection. It seems to me that the edge detection is happening before the vertex deformation, but I couldn't find any way of controlling the order. Could anybody please help?
Well, I don't know exactly what your setup or your shader code looks like, but if it's an image effect then I would presume it needs normals and depth. That information has to be written to a normal/depth buffer, otherwise the image shader won't have any information to work with.
I guess if you do your own displacement, Unity won't know which normals you used (unless perhaps you implemented it as a surface shader?). If you wrote your own vertex and fragment shader then you're basically not using the surface system, so I would presume you'd have to update the depth/normal buffer with replacement shaders so that your image effect has the information it needs.
It's a bit vague, sorry, but chipping in with some thoughts.
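To illustrate why the buffer matters: a depth-normals image effect only ever reads what was written into `_CameraDepthNormalsTexture`. A minimal sketch of the per-pixel read, assuming the helpers from UnityCG.cginc (this is illustrative, not the asker's actual effect code):

```hlsl
// What a depth-normals image effect typically samples per pixel.
// _CameraDepthNormalsTexture is only filled when the camera's
// depthTextureMode includes DepthNormals; DecodeDepthNormal is
// declared in UnityCG.cginc.
sampler2D _CameraDepthNormalsTexture;

fixed4 frag (v2f_img i) : SV_Target
{
    float depth;
    float3 normal;
    DecodeDepthNormal(tex2D(_CameraDepthNormalsTexture, i.uv), depth, normal);
    // An edge detector compares these values against neighbouring samples.
    // It never sees your displaced vertices - only whatever geometry was
    // rendered into this buffer.
    return fixed4(normal * 0.5 + 0.5, depth);
}
```

So if the pass that fills that buffer doesn't apply the displacement, the edges land at the undisplaced positions, which matches the symptom described.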
Thanks for the reply. I used a surface shader, but to displace vertices I had to add a vertex program. Anything simple will exhibit this; for example (just the CG part):
CGPROGRAM
#pragma surface surf Standard fullforwardshadows vertex:vert

sampler2D _MainTex;

struct Input {
    float2 uv_MainTex;
};

half _Glossiness;
half _Metallic;
fixed4 _Color;
float _Amount;

void vert (inout appdata_full v) {
    float4 worldPos = mul(_Object2World, v.vertex);
    worldPos.x += _Amount * sin(worldPos.x);
    worldPos.z += _Amount * cos(worldPos.z);
    v.vertex = mul(_World2Object, worldPos);
}

void surf (Input IN, inout SurfaceOutputStandard o) {
    fixed4 c = tex2D(_MainTex, IN.uv_MainTex) * _Color;
    o.Albedo = c.rgb;
    o.Metallic = _Metallic;
    o.Smoothness = _Glossiness;
    o.Alpha = c.a;
}
ENDCG
This just shifts vertices in the x-z plane. I'm not explicitly setting depth or normals; in this case the normals are unchanged, and I was expecting the compiled shader to take care of depth. I don't know how to explicitly write to the depth buffer from a surface shader.
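One thing worth trying for the depth side specifically: the `addshadow` directive makes Unity regenerate the shadow-caster pass from the surface shader itself, so the `vert` modifier is applied there too, and the camera depth texture (which is rendered via that pass in `DepthTextureMode.Depth`) then sees the displaced geometry. A sketch, assuming the same shader as above:

```hlsl
// addshadow regenerates the ShadowCaster pass with vert() applied, so the
// camera depth texture picks up the displacement. Note: this does NOT fix
// the separate DepthNormals texture that a Roberts Cross depth+normals
// edge detector reads - that buffer is rendered by Unity's internal
// replacement shader, which knows nothing about this vertex modifier.
#pragma surface surf Standard fullforwardshadows vertex:vert addshadow
```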
Well, beats me. I would assume the image shader uses the same depth buffer as the rendered geometry. However, if Unity generates the depth + normal buffers in a separate pass, it needs at least the normals to be output to a render buffer. Unfortunately I don't know offhand whether Unity handles this through the surface shaders (I would have assumed so) or whether the geometry is rendered with another shader. That shader would then obviously need to deform the mesh exactly like your vertex program, or it would output wrong depth values.
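For what it's worth, Unity does render the depth-normals texture with a separate internal replacement shader (keyed on the RenderType tag), which is why the displacement never reaches it. One possible workaround is to render that buffer yourself with a replacement shader that repeats the displacement. A rough, untested sketch, using the era's built-in macros from UnityCG.cginc (the shader name is a placeholder; in practice you'd also want a distinct RenderType tag on the displaced object so only it gets this treatment):

```hlsl
// Hypothetical replacement shader that outputs encoded depth+normals for
// displaced geometry, mimicking Unity's Internal-DepthNormalsTexture pass.
// Would be hooked up from a script via camera.SetReplacementShader(shader, "RenderType").
Shader "Hidden/DisplacedDepthNormals" {
    SubShader {
        Tags { "RenderType"="Opaque" }
        Pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float _Amount; // must match the displacing material's property

            struct v2f {
                float4 pos : SV_POSITION;
                float4 nz  : TEXCOORD0; // view-space normal in xyz, 0..1 depth in w
            };

            v2f vert (appdata_base v) {
                // Repeat the EXACT displacement of the visible shader,
                // otherwise this buffer holds wrong depth values.
                float4 worldPos = mul(_Object2World, v.vertex);
                worldPos.x += _Amount * sin(worldPos.x);
                worldPos.z += _Amount * cos(worldPos.z);
                v.vertex = mul(_World2Object, worldPos);

                v2f o;
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.nz.xyz = COMPUTE_VIEW_NORMAL;
                o.nz.w = COMPUTE_DEPTH_01;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target {
                return EncodeDepthNormal(i.nz.w, i.nz.xyz);
            }
            ENDCG
        }
    }
}
```

This mirrors how Unity's own depth-normals pass encodes its output; the only change from the internal version is repeating the sin/cos displacement in the vertex program.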
Meh, I'll leave this to someone with more/fresher experience with image effects.