What could possibly be wrong with this simplest-possible FPS counter?
I've seen a few scripts that do all kinds of "delta time" trickery, averaging and whatnot. I thought I'd just write the simplest possible FPS counter instead: something I can keep in the corner of my .exe window in addition to Unity's built-in Stats panel:
using UnityEngine;

public class FpsCounter : MonoBehaviour
{
    void Update ()
    {
        // Invert last frame's duration to get an instantaneous FPS reading.
        // guiText assumes a GUIText component on the same GameObject (legacy API).
        guiText.text = (1f / Time.deltaTime) + " FPS";
    }
}
Time.deltaTime is clearly documented as "the time in seconds it took to complete the last frame." If that's correct, what's wrong with either the above code or Unity's built-in Stats panel? The two counters consistently report completely different FPS values, as demonstrated below:
Answer by Griffo · Oct 05, 2012 at 10:55 AM
"The frame counter in the stats only tells you how long it takes to draw the graphics."
http://forum.unity3d.com/threads/42608-Is-my-frames-per-second-counter-correct
Thanks... so just to clarify: for the "effective end-user FPS" (as opposed to the Stats panel's graphics-only FPS), the above script is essentially correct?
Yes it is. deltaTime is the frame period, and its inverse is the frequency (FPS). Here http://docs.unity3d.com/Documentation/ScriptReference/Time-realtimeSinceStartup.html you can find a better (longer) version that gets rid of the fluctuation.
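For reference, here is a minimal sketch of that kind of smoothed counter, not the exact code from the linked page: it counts frames over a fixed interval using Time.realtimeSinceStartup and refreshes the display once per interval, so the value doesn't jump around every frame. It reuses the legacy guiText property from the original script; the class name SmoothedFpsCounter and the updateInterval field are just illustrative choices.

using UnityEngine;

public class SmoothedFpsCounter : MonoBehaviour
{
    public float updateInterval = 0.5f;   // how often (in seconds) to refresh the display

    private int frames = 0;               // frames rendered since the last refresh
    private float lastInterval;           // realtimeSinceStartup at the last refresh

    void Start()
    {
        lastInterval = Time.realtimeSinceStartup;
        frames = 0;
    }

    void Update()
    {
        ++frames;
        float timeNow = Time.realtimeSinceStartup;

        // Once the interval has elapsed, average the frame count over the
        // elapsed wall-clock time, update the text, and reset the counters.
        if (timeNow > lastInterval + updateInterval)
        {
            float fps = frames / (timeNow - lastInterval);
            guiText.text = fps.ToString("F1") + " FPS";
            frames = 0;
            lastInterval = timeNow;
        }
    }
}

Attach it to the same GameObject with the GUIText component and it shows one averaged reading every updateInterval seconds instead of a per-frame value.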