Using Time efficiently regardless of current frame rate.
Hello!
I hope this is a simple fix for someone out there. I've looked around and tried a few things, but nothing seems to give me the correct result.
The main issue is using a coroutine like this:
IEnumerator Countdown()
{
    float elapsedTime = 0f;
    float timeToComplete = 2f;

    while (elapsedTime < timeToComplete)
    {
        elapsedTime += Time.deltaTime;
        yield return null;
    }

    // Do stuff
}
I use this function to run all my timer countdown code, but I've noticed that if I get a drop in frame rate, the time it takes to complete is slightly longer than it should be. In my head this makes sense, because Time.deltaTime returns how long the previous frame took, so if the game is running slowly the loop takes longer to finish. So I tried replacing Time.deltaTime with Time.time instead:
IEnumerator Countdown()
{
    float endTime = Time.time + 2f;

    while (Time.time < endTime)
    {
        yield return null;
    }

    // Do stuff
}
This, however, gives me the same behaviour as using Time.deltaTime. Isn't there a time function that will give me the same result every time, regardless of the frame rate? I just want to count to 'X' seconds and have something happen at the end, no matter what the frame rate is.
Any help would be greatly appreciated!!
The problem is that the interval between frames (Time.deltaTime) is (a) finite and (b) not fixed. So it's extremely unlikely that at the end of the final frame (the one where elapsedTime is no longer less than timeToComplete), elapsedTime will end up exactly equal to timeToComplete. You'll always overshoot.
Chances are you'll be best off refactoring things so that it no longer matters that the process takes a fraction of a second more than timeToComplete to finish. By the way, are you sure it is a problem? How much of an overshoot are you actually getting?
But an alternative you could try is using FixedUpdate instead of a coroutine; that way the interval becomes predictable.
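A minimal sketch of that FixedUpdate idea (the class and method names here are made up for illustration, not from the thread): each physics tick advances the countdown by the constant Time.fixedDeltaTime, so the step size no longer varies with frame rate.

```csharp
using UnityEngine;

// Hypothetical example: a countdown stepped in FixedUpdate.
// Each physics tick advances by the fixed timestep, so the
// interval per step is predictable regardless of frame rate.
public class DashTimer : MonoBehaviour
{
    float remaining;

    public void StartCountdown(float seconds)
    {
        remaining = seconds;
    }

    void FixedUpdate()
    {
        if (remaining <= 0f) return;

        remaining -= Time.fixedDeltaTime; // constant step each tick
        if (remaining <= 0f)
        {
            // Do stuff
        }
    }
}
```

Note that this still overshoots by up to one fixed timestep, but that bound is constant rather than dependent on the frame time.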
I'm not really concerned with the timing always hitting the 'timeToComplete' mark exactly, since it will always overshoot a little. The main problem is the noticeable difference in how far the character travels when the game slows down. For example, I'm using this to slow my character down after "Dashing". At a normal frame rate you slow down very quickly, but at a low frame rate you slow down much more slowly, which makes it look like you're dashing much further than you should be. I've even tried messing around with 'Time.time' using 'yield return new WaitForFixedUpdate()', but with no luck; I always get a different result when the frame rate is low.
I even tried just recording the differences in time just to see if they were close enough:
float startTime = Time.time;
float endTime = Time.time + delay;
Debug.Log("Start Time : " + startTime + " > " + endTime);
while (Time.time < endTime)
{
    yield return null;
}
Debug.Log("Elapsed : " + (Time.time - startTime) + " (expected " + delay + ")");
The difference in time is about twice as large when the frame rate is low, which is very odd. The only thing I can think now is that coroutines are just bad to use when you have a low frame rate.
Are you saying the overshoot is doubled when the frame rate is slow? On the face of it, that doesn't seem particularly surprising or excessive. What's the size of the difference?
It sounds to me like the countdown might be a red herring, and the issue might be to do with what you're doing during/after this countdown (for example, how you're slowing the character down) rather than with the countdown itself. For example, if one were to reduce something's speed by a fixed amount every frame, rather than by an amount that depends on deltaTime, then one would expect the deceleration to correlate with the frame rate.
To eliminate frame rate differences, you may just need to make sure that (the extent of) whatever you're doing in "DoStuff" is dependent on how long it is since you last "did stuff".
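One way to sketch that idea (illustrative only; 'currentSpeed' and the method name are hypothetical, not from the thread): drive the slow-down from elapsed time rather than per-frame steps, so a slow frame simply jumps further along the same curve instead of decelerating less.

```csharp
using System.Collections;
using UnityEngine;

// Illustrative only: interpolate speed as a function of elapsed
// time, so low frame rates take bigger steps along the same curve.
IEnumerator SlowDown(float dashSpeed, float normalSpeed, float duration)
{
    float startTime = Time.time;

    while (Time.time - startTime < duration)
    {
        float t = (Time.time - startTime) / duration; // 0 -> 1 over 'duration'
        currentSpeed = Mathf.Lerp(dashSpeed, normalSpeed, t);
        yield return null;
    }

    currentSpeed = normalSpeed; // snap exactly to the target at the end
}
```

With this shape, the dash distance depends on the duration rather than on how many frames happened to fit inside it.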
Generally speaking, if timing is that critical, I'd have thought that using FixedUpdate is probably the way to go.
@Statement I simply call the coroutine to start slowing down the character.
@BonfireBoy I'm going to try what you said: work out how long it has been since the timer started and apply that to how long it should take to slow down. That sounds like a reasonable way to accommodate the frame rate issue.