I am completely misunderstanding what the point of Time.deltaTime is.
I am trying to time how long my character is in the air so that I can play the correct landing animation (soft or hard). I have a public airTime float at the top of my script, which starts at 0; when my character first lifts off the ground, I run airTime += Time.deltaTime; and finally, before the landing animation, I check if (airTime >= 2f) and play the hard landing animation.
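For reference, the setup I'm describing looks roughly like this (a sketch, not my exact script; the CharacterController grounding check and the animator trigger name are placeholders). Note the accumulation has to happen every frame in Update while airborne for airTime to count up in real time:

```csharp
using UnityEngine;

public class AirTimeTracker : MonoBehaviour
{
    public float airTime = 0f;              // shows up in the Inspector
    public CharacterController controller;  // placeholder: however you detect the ground
    public Animator animator;               // placeholder: your Animator component

    void Update()
    {
        if (!controller.isGrounded)
        {
            // Runs every frame while in the air, so airTime grows by
            // the length of each frame and adds up to real seconds.
            airTime += Time.deltaTime;
        }
        else
        {
            if (airTime >= 2f)
            {
                // "HardLanding" is a hypothetical trigger name.
                animator.SetTrigger("HardLanding");
            }
            airTime = 0f; // reset for the next jump
        }
    }
}
```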
None of that really matters, though, because when I watch the public airTime field in the Inspector, it only counts up by 0.02 seconds per jump. I read that deltaTime tells you how long the last frame took to render, but other people use it to make timers, stopwatches, and countdown clocks, and I just don't get how a timer that only counts by 0.02 seconds is even useful in the slightest. I am sure I am missing something huge, but I've followed about six tutorials and it always ends up counting by 0.02, no matter whether I change the float, rename the variable, or try counting down instead of up.
All I wanted was a simple way to tell how long my character is in the air after pressing space, and it's starting to make me believe outside circumstances are messing with how I'm using deltaTime, because if all it does is count how long the last frame took to render, why use it? Sorry for being ignorant, but I do want to learn, so I will go through any tutorial or YouTube video somebody recommends (I've watched a few, but maybe you know a different one). Thank you.