Custom shader not working on Samsung S6
I'm having problems with a custom shader, but only on the Samsung S6 (and possibly some iOS devices; I can't check those right now for a number of reasons, but everything works fine on other Android devices).
Since it's a rather complicated shader (it maps sections of a texture along a given x range of UVs and has a lot of variants; its purpose is to paint the area of effect of an attack), I've tried to find the "minimal expression" that reproduces the error, to pinpoint the cause (at first I thought it would be the usage of ddx and ddy, but it turns out that's not the case).
So, this small shader already fails on the device (of course, there are a lot of unused variables and #pragma directives, but that's beside the point):
Shader "Custom/Tiled hints"
{
Properties
{
_MainTex ("Texture", 2D) = "white" {}
_Color ("Color", Color) = (1,1,1,0)
[HideInInspector] length ("length", Float) = 1
[HideInInspector] aspectRatio ("aspect ratio", Float) = 1
[HideInInspector] lMap ("begin", Float) = 0
[HideInInspector] rMap ("end", Float) = 1
[HideInInspector] lCut ("slice left", Range(0, 1)) = 0.25
[HideInInspector] rCut ("slice right", Range(0, 1)) = 0.75
[HideInInspector] lRemove ("remove left", Range(0, 1)) = 0
[HideInInspector] rRemove ("remove right", Range(0, 1)) = 1
[HideInInspector] lHeadTile ("head tile", Range(0, 1)) = 0
[HideInInspector] lHeadMapTile ("head tile", Float) = 0
}
SubShader
{
Tags { "QUEUE"="Transparent-2" "IGNOREPROJECTOR"="true" "RenderType"="Transparent" }
ZWrite Off
ZTest On
Cull Back
Blend SrcAlpha OneMinusSrcAlpha
AlphaTest Greater .01
Fog { Mode Off }
LOD 100
Offset -1, -1
Pass
{
CGPROGRAM
#pragma vertex vert
#pragma fragment frag
#pragma multi_compile_fog
#pragma multi_compile _ TILED_HEAD
#pragma multi_compile TILED_MIDDLE_LEFT TILED_MIDDLE_RIGHT
#pragma multi_compile _ TILED_HEAD_WITH_CUT
#pragma multi_compile _ DEBUG_SHADER
#pragma multi_compile _ UVS_01
#pragma multi_compile _ PING_PONG_FMOD
#pragma target 3.0
#include "UnityCG.cginc"
struct appdata
{
float4 vertex : POSITION;
float2 uv : TEXCOORD0;
};
struct v2f
{
float2 uv : TEXCOORD0;
UNITY_FOG_COORDS(1)
float4 vertex : SV_POSITION;
};
//Parameters
float length;
float aspectRatio;
float lMap;
float rMap;
float lCut;
float rCut;
float lRemove;
float rRemove;
float lHeadTile;
float lHeadMapTile;
float4 _Color;
sampler2D _MainTex;
float4 _MainTex_ST;
v2f vert (appdata v)
{
v2f o;
UNITY_INITIALIZE_OUTPUT(v2f,o);
o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
o.uv = TRANSFORM_TEX(v.uv, _MainTex);
UNITY_TRANSFER_FOG(o,o.vertex);
return o;
}
float ilerp(float a, float b, float lerped)
{
return saturate((lerped - a) / (b - a));
}
fixed4 frag (v2f i) : SV_Target
{
float2 orUV = i.uv;
bool mirror = lMap > rMap;
float lMapScaled = lMap / length;
float rMapScaled = rMap / length;
float lMapSide = min(lMapScaled , rMapScaled);
float rMapSide = max(lMapScaled , rMapScaled);
return ilerp(lMapSide, rMapSide, mirror ? 1 - orUV.x : orUV.x);
}
ENDCG
}
}
CustomEditor "TiledHintShaderInspector"
}
Smaller versions of the fragment shader seem to work just fine. I'm on Unity 5.3.6 (it's a rather big project, so updating is a headache).
The variable length is not 0 (I set it from code to nonzero values), and lMap and rMap are also set from code to different values. This is also true before changing the values (i.e. the material already has these values set).
Any help? I don't really know what is going on. I've read that the Mali chips the Samsung S6 uses are known to give headaches, but we can't really drop it from the list of supported devices. Thanks!
Answer by alvaro_perez_omnidrone · Mar 27, 2017 at 09:51 AM
I finally found what the problem was! After comparing the compiled code of the shader I posted against a version that replaced length with a known constant value (which worked), I noticed that length got renamed to xlat_varlength on some platforms. I didn't know length was a standard HLSL function. It seems that on most platforms the variable just gets translated and the shader works anyway, but on this device the renaming doesn't play well with the way Unity handles the shader variables.
So, changing length to some other name solved the problem completely.
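For reference, the fix looks roughly like this (a minimal sketch; _Length is an arbitrary replacement name used here for illustration, the only requirement is that it doesn't collide with an HLSL intrinsic like length()):

Properties
{
    _MainTex ("Texture", 2D) = "white" {}
    // renamed from "length" so it no longer shadows the HLSL intrinsic length()
    [HideInInspector] _Length ("length", Float) = 1
    // remaining properties unchanged
}

// ...and inside the CGPROGRAM block:
float _Length;                      // was: float length;

float lMapScaled = lMap / _Length;  // was: lMap / length;
float rMapScaled = rMap / _Length;  // was: rMap / length;

If the value is driven from C# (e.g. via material.SetFloat), that call has to be updated to the new property name as well.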
How do you compare the compiled shader code on different platforms? I'm having a weird Samsung problem too...