High GPU cost of a color parameter
I have a very simple shader that only reads a texture and multiplies it by a color; it's given below. In my scene I'm using it to draw a few 2D objects, covering the screen maybe twice over. The problem is that, on my iPhone 4 at least, GPU time is over 30 ms/frame. However, if I remove the multiplication by the _Color parameter at the end, GPU time drops to around 11 ms.
This seems quite weird to me. How can multiplying by a simple color variable eat so much GPU time? I've also tried replacing the _Color variable with a literal float4(0.5, 0.5, 0.5, 0.5): GPU time remained around 11 ms.
Here is my shader:
Shader "Custom/MultiResSpriteShader" {
    Properties {
        _Color ("Main Color", Color) = (1, 1, 1, 1)
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "Queue"="Transparent" "IgnoreProjector"="True" "RenderType"="Transparent" }
        ZWrite Off Lighting Off Cull Off Fog { Mode Off } Blend SrcAlpha OneMinusSrcAlpha
        LOD 110
        Pass {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #pragma fragmentoption ARB_precision_hint_fastest
            #include "UnityCG.cginc"

            uniform float _Color;
            uniform sampler2D _MainTex;

            float4 frag(v2f_img i) : COLOR {
                return tex2D(_MainTex, i.uv) * _Color;
            }
            ENDCG
        }
    }
}
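For reference, the constant-color test described above amounts to replacing the fragment function with something like the following (I'm assuming the literal is a float4, since Cg has no four-argument float constructor):

    // Comparison variant: multiply by a literal color instead of
    // the _Color uniform. With this version, GPU time stayed
    // around 11 ms/frame on the same scene.
    float4 frag(v2f_img i) : COLOR {
        return tex2D(_MainTex, i.uv) * float4(0.5, 0.5, 0.5, 0.5);
    }

So the cost appears only when the multiplier comes from the uniform, not when it is a compile-time constant.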
Any thoughts would be appreciated.