This shader works in editor but only partially in build
This is a simple shader that should make an object grow thinner as the camera gets farther away. It starts thinning at 1.5 meters and becomes basically paper thin at 20. Everything works great in the editor, so I committed it, only to find out it fails miserably in a build.
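Concretely, the thinning curve is just a reversed smoothstep over camera distance. A minimal sketch of the mapping (the helper name here is only for illustration; the full shader below does the same thing inline):

// Maps camera distance to a thickness factor: 1.0 at or inside 1.5 m,
// 0.0 at or beyond 20 m, easing smoothly in between (about 0.5 near 10.75 m).
float ThinningFactor(float dist)
{
    return smoothstep(20.0, 1.5, dist);
}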
First: in the build, objects only shrink when my camera moves away from the world origin, not away from the object, and they are at full thickness whenever I am within 1.5 meters of (0,0,0). It's as if the build is completely ignoring my transformation to world space.
Second: some, but not all, of my GameObjects move toward world z = 0 when my camera moves away from the origin. For those GameObjects, it is as if the v.vertex.z *= distFactor line is operating on world coordinates rather than on object-space coordinates. Just to make certain, I output the vertices of my mesh to my log file, and sure enough they are in object space, not world space, so how this ends up operating in world space is beyond me. Once again, this only happens in the build; in the editor, everything works as expected.
Here's my shader code.
SubShader
{
    Tags { "RenderType"="Opaque" }
    LOD 200

    CGPROGRAM
    // Physically based Standard lighting model, and enable shadows on all light types
    #pragma surface surf Standard fullforwardshadows vertex:vert

    // Use shader model 3.0 target, to get nicer looking lighting
    #pragma target 3.0

    sampler2D _MainTex;

    struct Input
    {
        float2 uv_MainTex;
    };

    half _Glossiness;
    half _Metallic;
    fixed4 _Color;
    void vert(inout appdata_full v)
    {
        // Use the full float4 position (w = 1) so the matrix's translation column is applied;
        // passing only .xyz truncates the matrix and drops the translation.
        float3 worldPos = mul(unity_ObjectToWorld, v.vertex).xyz;
        float dist = length(worldPos - _WorldSpaceCameraPos);
        // 1 (full thickness) within 1.5 m of the camera, easing down to 0 (paper thin) by 20 m
        float distFactor = smoothstep(20, 1.5, dist);
        v.vertex.z *= distFactor;
    }
    // Add instancing support for this shader. You need to check 'Enable Instancing' on materials that use the shader.
    // See https://docs.unity3d.com/Manual/GPUInstancing.html for more information about instancing.
    // #pragma instancing_options assumeuniformscaling
    UNITY_INSTANCING_BUFFER_START(Props)
        // put more per-instance properties here
    UNITY_INSTANCING_BUFFER_END(Props)

    void surf (Input IN, inout SurfaceOutputStandard o)
    {
        // Albedo comes from a texture tinted by color
        fixed4 c = tex2D (_MainTex, IN.uv_MainTex) * _Color;
        o.Albedo = c.rgb;
        // Metallic and smoothness come from slider variables
        o.Metallic = _Metallic;
        o.Smoothness = _Glossiness;
        o.Alpha = c.a;
    }
    ENDCG
}
FallBack "Diffuse"
Answer by joeheacom · Apr 01, 2020 at 04:41 PM
I have figured this out. The problems are caused by static/dynamic batching, which transforms the vertex data into world space before combining the meshes.
Though this is great for performance, it can break vertex shaders like this one, because the shader has no way of knowing whether v.vertex is in object space or world space. There also doesn't seem to be a shader variable or variant that tells you which space the vertex is in. I would appreciate it if anyone has further insight.
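One workaround that should apply here (not mentioned above) is Unity's "DisableBatching" SubShader tag, which opts the shader out of static and dynamic batching so v.vertex stays in object space and unity_ObjectToWorld remains valid. A minimal sketch against the SubShader from the question:

SubShader
{
    // Opt out of static/dynamic batching for any material using this shader,
    // so the vertex function always sees object-space positions.
    Tags { "RenderType"="Opaque" "DisableBatching"="True" }
    // ... rest of the SubShader unchanged
}

The trade-off is losing the batching optimization for those objects, so it is best limited to materials that actually rely on object space in the vertex function.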