How can I calculate the frame rate? I have some misunderstandings with deltaTime.
I feel a little embarrassed asking a simple question such as this so far along in my projects. But oddly, I did a search and came up empty. Thought for sure someone would have asked this question before.
Is there a simple formula for figuring out the frame rate so I can display it in my standalone game? I understand how Time.deltaTime works, insofar as it measures time rather than frames, and how to use it for framerate-independent animation/simulation speeds on all platforms. But sometimes I have trouble figuring out what to do with it, so I have no idea what kind of equation to plug it into to count how many frames have passed in the last second, even with a full understanding that every time Update() runs, a frame has passed. I think I'm just confused.
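(To show where my head is at: since deltaTime is seconds-per-frame, I'd guess frames-per-second is just its reciprocal, something like the snippet below, though I don't know if that's the "right" way, since the value jumps around from frame to frame. The class name is just mine for illustration.)

```csharp
using UnityEngine;

public class InstantFPS : MonoBehaviour
{
    void Update()
    {
        // deltaTime is the length of the last frame in seconds,
        // so its reciprocal is an instantaneous frames-per-second estimate.
        float fps = 1f / Time.deltaTime;
        Debug.Log("Approximate FPS this frame: " + fps);
    }
}
```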
Perhaps as an aside to help me understand deltaTime a bit better (I could just ask for code and be on my way, but I also want to get to the root of my misunderstanding), here's part two of my question: I saw a script that DaveA posted a week ago for someone who wanted to detect double-tapping. As I recall, the code did little more than check Time.deltaTime inside Update(). But how does that work if the check runs every single frame? Shouldn't it never detect a double-tap if the same check code runs fresh and new on every Update()?
I feel like maybe once I understand more practical applications of deltaTime and why it works, I might be able to clean up some of my scripts and invent my own solutions to certain problems, like my inability to calculate the frame rate on my own. :)
Answer by Eric5h5 · Feb 01, 2011 at 09:05 PM
http://www.unifycommunity.com/wiki/index.php?title=FramesPerSecond
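The gist of that wiki script is to count how many times Update() runs while accumulating deltaTime; once a full second's worth has added up, the frame count is your FPS. A minimal sketch of the idea (not the wiki code verbatim; names are mine):

```csharp
using UnityEngine;

public class FPSCounter : MonoBehaviour
{
    int frames = 0;      // Update() calls since the last readout
    float elapsed = 0f;  // accumulated deltaTime since the last readout
    float fps = 0f;

    void Update()
    {
        frames++;
        elapsed += Time.deltaTime;

        // A full second of deltaTimes has accumulated, so the number
        // of frames counted in that window is the frames-per-second.
        if (elapsed >= 1f)
        {
            fps = frames / elapsed;
            frames = 0;
            elapsed = 0f;
        }
    }

    void OnGUI()
    {
        GUILayout.Label("FPS: " + fps.ToString("F1"));
    }
}
```

Averaging over a whole second like this smooths out the jitter you'd get from taking 1/Time.deltaTime on any single frame.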
Without seeing the double-tap code, it's impossible to say. Time.deltaTime, as implied by the name, is simply the amount of time that has passed since the last frame.
Sorry, I wasn't asking about the specifics of his double-tap code. I was just curious how the logic of it works.
If it takes, say, one second to render 10 frames, is it safe to say that the deltaTime for each frame is going to be about 0.1 (seconds)? I'm just trying to figure out how to guesstimate what values it will return, because I don't really understand what it does under the hood.
I don't know how the logic of it works without seeing the code. :) If it takes one second to render 10 frames, each frame might be 0.1 seconds, but it's also possible that you had nine frames that took 0.01 and one frame that took 0.91, because of some heavy processing for whatever reason. Generally it's unwise to make assumptions.
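That said, the usual shape of a double-tap check is a timer stored in a member variable: Update() runs fresh every frame, but fields on the script persist between frames, so adding deltaTime to one each frame gives you a running stopwatch. A rough sketch of that pattern (a guess at the general idea, not DaveA's actual code; the 0.3-second window is a made-up value):

```csharp
using UnityEngine;

public class DoubleTapDetector : MonoBehaviour
{
    public float doubleTapWindow = 0.3f; // max gap between taps, in seconds

    // This field persists across frames even though Update() re-runs each frame.
    float timeSinceLastTap = Mathf.Infinity;

    void Update()
    {
        // Accumulate the time elapsed since the previous frame.
        timeSinceLastTap += Time.deltaTime;

        if (Input.GetMouseButtonDown(0))
        {
            if (timeSinceLastTap <= doubleTapWindow)
            {
                Debug.Log("Double-tap!");
                timeSinceLastTap = Mathf.Infinity; // so a third tap isn't a second double
            }
            else
            {
                timeSinceLastTap = 0f; // first tap: start timing the window
            }
        }
    }
}
```

The key point for your part two: the per-frame check never starts "fresh", because the state it reads lives outside Update() and survives from frame to frame.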
While the Frames Per Second script is a helpful link, I'd like to learn more about Time.deltaTime as well.