Built-in ShaderLab "_Time.y" and "Time.time" are not equal?
Hi!
I am trying to make an animation in a shader using the built-in "_Time" property. Let's say the animation is a fade-out of the output color (white to black) based on a start time T0 = Time.time and some future end time T1 = Time.time + 10, both passed into the shader. In the shader I do a simple linear interpolation:
fixed f = clamp( (_Time.y - T0) / (T1 - T0), 0.0, 1.0 );
fixed4 color = fixed4(1,1,1,1) * (1-f) + fixed4(0,0,0,0) * f;
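For context, a minimal complete shader along these lines might look like the sketch below; the property names `_T0` and `_T1` and the shader name are illustrative, not from the original post:

```shaderlab
Shader "Custom/TimedFade"
{
    Properties
    {
        _T0 ("Fade Start Time", Float) = 0
        _T1 ("Fade End Time", Float) = 10
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float _T0;
            float _T1;

            float4 vert (float4 vertex : POSITION) : SV_POSITION
            {
                return UnityObjectToClipPos(vertex);
            }

            fixed4 frag () : SV_Target
            {
                // _Time.y is shader time in seconds (per the answers below,
                // it tracks Time.timeSinceLevelLoad, not Time.time).
                fixed f = clamp((_Time.y - _T0) / (_T1 - _T0), 0.0, 1.0);
                // Equivalent to the manual (1-f)/f blend in the question.
                return lerp(fixed4(1,1,1,1), fixed4(0,0,0,0), f);
            }
            ENDCG
        }
    }
}
```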
At first it all seemed to work, but some time ago I noticed that "_Time.y" is not equal to Time.time. The animation always plays with a delay, and it seems the longer the scene has been playing, the longer the delay becomes.
So, the question is: how does the "_Time" value correspond to Time.time?
Try to reproduce some large delta between _Time.y and Time.time and tell us how you did it too 8)
I often use these values and AFAIK it's the same value.
Answer by bad3p · Aug 21, 2012 at 03:04 PM
_Time.y = Time.timeSinceLevelLoad
Solution:
material.SetFloat( "_Time0", Time.timeSinceLevelLoad );
material.SetFloat( "_Time1", Time.timeSinceLevelLoad + FadeTime );
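Put together, this fix might look like the following MonoBehaviour sketch; the shader property names `_Time0`/`_Time1` match the two lines above, while the class and field names are illustrative:

```csharp
using UnityEngine;

public class FadeStarter : MonoBehaviour
{
    public Material material;  // material whose shader reads _Time0 / _Time1
    public float FadeTime = 10f;

    public void StartFade()
    {
        // _Time.y in the shader tracks Time.timeSinceLevelLoad,
        // so the start/end times must be taken from the same clock.
        material.SetFloat("_Time0", Time.timeSinceLevelLoad);
        material.SetFloat("_Time1", Time.timeSinceLevelLoad + FadeTime);
    }
}
```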
Thank you. After hours of debugging, moving things from fragment to vertex shaders, wondering why iOS and Android differ, etc. Another strike for information that should be in the docs (I mean, if you have the Unity source code, this is pretty obvious...) but isn't, and instead is buried somewhere on the web.
From the docs: "Time (t/20, t, t*2, t*3), use to animate things inside the shaders.".
Useless.
Answer by aldonaletto · Aug 15, 2012 at 06:13 PM
I would not mix Time.time and _Time.y: the docs don't say they are the same, so this feels unreliable to me - they may be independent variables and accumulate some difference over time. I would pass Time.time to a shader variable in Update with material.SetFloat(), as you are probably already doing with the variables T0 and T1, or use _Time.y to set T0 and T1 (not always possible, depending on your logic).
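A minimal sketch of that suggestion, assuming a per-material shader variable named `_GameTime` (the name is illustrative):

```csharp
using UnityEngine;

public class PassTime : MonoBehaviour
{
    public Material material;

    void Update()
    {
        // Drive the shader from the same clock as the C# logic,
        // instead of relying on the built-in _Time.
        material.SetFloat("_GameTime", Time.time);
    }
}
```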
Answer by monotoan · Oct 19, 2017 at 07:39 PM
A slightly stronger solution to this issue is to run one script that passes Time.time (or whichever time value you need) into a GLOBAL shader variable. This solution enables you to access game time from any shader or material without having to worry about individually setting time variables on any of those materials.
For example:
Shader.SetGlobalFloat("_GameTime", Time.time);
Then, in any shader, you can declare and use
float _GameTime;
Just remember: global shader variables won't be updated properly if you also declare them in the shader's "Properties { }" block (that makes them per-material properties, which override the global value), so make sure you only put this declaration in the SubShader { } / CGPROGRAM section.
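Put together, the script side of this approach might look like the following sketch; the class name is illustrative:

```csharp
using UnityEngine;

public class GlobalShaderTime : MonoBehaviour
{
    void Update()
    {
        // One script updates a single global; every shader that declares
        // "float _GameTime;" inside its CGPROGRAM block (not in the
        // Properties block) picks this value up automatically.
        Shader.SetGlobalFloat("_GameTime", Time.time);
    }
}
```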
Just to confirm, this works great. Thanks! The two are definitely not always equal; for example, I've found that the Unity splash screen (or something else in builds) delays the shader time start by about 5 seconds compared to Time.time.
I've also tried this approach and, so far, so good. I wanted to be able to adjust the rate at which time passes, but then I had to recalculate an offset to make the transition smooth, because otherwise changing the multiplier makes _Time * _Speed jump at each change. I now recalculate the offset each time I change _Speed and send the calculated time to the global shader variable, keeping a handful of shaders aligned.
Answer by adamgryu · Sep 10, 2018 at 08:56 PM
You can also get the shader's _Time value from a C# script by calling this:
Shader.GetGlobalVector("_Time")
This way, you don't have to modify your shaders to use a custom time variable. You can just use the shader time in your C# script.
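A short sketch of reading the built-in value from C#; per the docs quoted above, _Time packs (t/20, t, t*2, t*3), so the shader's seconds value is the y component:

```csharp
using UnityEngine;

public class ReadShaderTime : MonoBehaviour
{
    void Update()
    {
        // _Time is (t/20, t, t*2, t*3); .y is shader time in seconds.
        float shaderTime = Shader.GetGlobalVector("_Time").y;
        Debug.Log(shaderTime);
    }
}
```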
Answer by GrayedFox · Jan 03, 2021 at 02:46 AM
Rather than using a time variable, I would put that logic into a coroutine that drives the animation property of your shader. I go into more detail in this answer, but I think the same logic should apply: http://answers.unity.com/answers/1801454/view.html
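As a sketch of that idea (the property name `_Fade` and the duration are illustrative, not taken from the linked answer):

```csharp
using System.Collections;
using UnityEngine;

public class FadeCoroutine : MonoBehaviour
{
    public Material material;
    public float duration = 10f;

    public IEnumerator Fade()
    {
        // Drive the shader property directly from the coroutine,
        // so the shader never needs to know about any clock at all.
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            material.SetFloat("_Fade", t / duration);
            yield return null;
        }
        material.SetFloat("_Fade", 1f);
    }
}
```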