Calculate time per frame
Hello, I'm looking for a way to calculate time per frame. It's not FPS, which is just the number of Updates per second; it's the time it took to render a single frame. Time per frame should be measured in ms, not fps, so 30 fps ≈ 33 ms per frame.
Put simply, I want to track when the main loop started rendering a particular frame and when it finished.
EDIT: I should clarify that I want to calculate time per frame independent of targetFps; otherwise deltaTime would suffice. But when targetFps is set to, for example, 30 fps, deltaTime will always be around 0.033 s, when in reality the frame could have been rendered much faster. I want this stat to give me insight into just how much lower my targetFps is than it could be.
EDIT2: As of right now I see I can't get it. What I want in a nutshell (meta-C):
time_t last_start = 0, last_done = 0;
time_t last_frame_ms;
loop
{
    last_start = gettime();
    UNITY_LOOP(last_frame_ms: last_frame_ms);
    last_done = gettime();
    last_frame_ms = last_done - last_start;
}
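In Unity C#, one way to approximate this pseudocode is a sketch like the following (UNITY_LOOP is not a real callable, so this instead times from the start of Update to the end of rendering via WaitForEndOfFrame, which on most platforms fires before the targetFrameRate sleep; class and field names are illustrative):

```csharp
using System.Collections;
using System.Diagnostics;
using UnityEngine;

// Sketch: measures the work time of each frame, from the start of Update
// to the end of rendering, excluding most of the idle wait Unity inserts
// to hit Application.targetFrameRate. Set this script early in the Script
// Execution Order so its Update marks the start of the frame.
public class FrameWorkTimer : MonoBehaviour
{
    public double lastFrameMs; // work time of the previous frame, in ms

    readonly Stopwatch watch = new Stopwatch();

    void Update()
    {
        watch.Reset();
        watch.Start();
        StartCoroutine(EndOfFrame());
    }

    IEnumerator EndOfFrame()
    {
        // WaitForEndOfFrame resumes after cameras and GUI have rendered,
        // but before the frame-rate-cap wait.
        yield return new WaitForEndOfFrame();
        watch.Stop();
        lastFrameMs = watch.Elapsed.TotalMilliseconds;
    }
}
```

Note this only captures CPU-side time up to the end of the frame; GPU work that overlaps into the next frame is not included.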
Not exactly. Time.deltaTime is calculated with per-frame time padding when targetFps is set. I want to calculate exactly how long this frame took to render, without any delays due to targetFps.
This sounds like something the profiler does for you. Do you need it in game, or just for testing purposes?
I want it in game. I would like to send stats to my server and then decide on the targetFps value. It will also give me a stat on just how fast my game is performing after each update.
I may have 30 fps for 90% of users when in fact frames are being rendered in 0.016 s, and I can raise my targetFps to 60 without worry.
Answer by moemartin2 · Jan 21, 2014 at 07:46 AM
You can use fps = 1 / Time.deltaTime
deltaTime is targetFps dependent. For example: if you set targetFps to 30, then you will NEVER get a deltaTime less than 0.033; you may get more if the frame rate drops below 30, but you'll never get less.
My goal is to calculate how many milliseconds the current/last frame took to render, not the time between the main loop's invocations of the Update function.
Answer by brycedaawg · Jan 20, 2014 at 03:31 PM
Time.realtimeSinceStartup may be what you're after. Time.realtimeSinceStartup gives you the exact time (in seconds) since your game was started. It ignores timescale and the fixed framerate and comes directly from your system's clock. This means that if you were to log Time.realtimeSinceStartup at the start of your Update method and then log it again at the end, you'd receive two different values, because Time.realtimeSinceStartup is (as the name suggests) real time.
I'm unsure whether Update is called more than once per frame if you limit your framerate, but if it is, all you'd need to do is store Time.realtimeSinceStartup in a field every Update cycle, like so:
public float timeLastFrame;
void Start()
{
    // Initialize our timeLastFrame variable
    timeLastFrame = Time.realtimeSinceStartup;
}
void Update()
{
    float realDeltaTime = Time.realtimeSinceStartup - timeLastFrame;
    timeLastFrame = Time.realtimeSinceStartup;
    // Do whatever you want with realDeltaTime here...
}
Regrettably, Update is invoked only once per frame, so no faster than the target framerate. It is the right place to check how many frames were actually rendered per second (how many Updates were invoked and when), but you can't check how long the previous frame took to render.
I tried to experiment with OnPreRender/OnPostRender hooks but they give some very strange results.
Ah darn it. Well, I looked over the MonoBehaviour documentation and I don't believe there's any method that is invoked independently of the engine's fixed frame rate. I could be wrong about this, so good luck if you keep looking.
Answer by hoihoi87 · Aug 15, 2018 at 09:03 PM
This doesn't really solve the original problem, but it is a workaround. If you determine the min-spec device of the majority of your users (I'm guessing you are collecting device info), you could run the game with uninhibited FPS as a local test, then use those results to set the FPS for the release version.
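For such a local test, the frame-rate cap can be lifted like this (a small sketch; note that vSync has to be disabled too, since it overrides targetFrameRate):

```csharp
// Uncap the frame rate for a local measurement build.
QualitySettings.vSyncCount = 0;   // vSync would override targetFrameRate
Application.targetFrameRate = -1; // -1 = platform default / uncapped
// Now 1 / Time.deltaTime reflects how fast frames can actually be produced.
```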
Answer by RShields · Aug 15, 2018 at 09:54 PM
With newer versions of Unity, you can go to Window > Analysis > Profiler
and check all the times in ms.
Is there any way to get this data from inside the game? The original poster and I are using the data for user telemetry.
You can definitely save the data, if that's what you're trying to do: https://stackoverflow.com/questions/32809888/how-can-i-save-unity-statistics-or-unity-profiler-statistics-stats-on-cpu-rend
I'd be very surprised if you couldn't extract the data from inside the game in a similar way.
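In recent Unity versions there is also a runtime API for exactly this: the FrameTimingManager. It has to be enabled in the Player settings ("Frame Timing Stats") and only returns data on supported platforms, so treat this as a sketch under those assumptions:

```csharp
using UnityEngine;

// Sketch: reads CPU/GPU frame times (in ms) from the FrameTimingManager.
// Requires "Frame Timing Stats" enabled in Player settings; on unsupported
// platforms GetLatestTimings simply returns 0.
public class FrameTimingProbe : MonoBehaviour
{
    readonly FrameTiming[] timings = new FrameTiming[1];

    void Update()
    {
        // Ask the engine to capture timing data for recent frames...
        FrameTimingManager.CaptureFrameTimings();
        // ...then read back the latest one.
        if (FrameTimingManager.GetLatestTimings(1, timings) > 0)
        {
            double cpuMs = timings[0].cpuFrameTime; // CPU frame time in ms
            double gpuMs = timings[0].gpuFrameTime; // GPU frame time in ms
            // e.g. queue cpuMs/gpuMs for upload to your telemetry server
        }
    }
}
```

Unlike Time.deltaTime, these values report the actual work time of the frame, which is what the original question is asking for.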
Answer by SupriyaRaul · Feb 18, 2020 at 02:11 AM
Did you find a solution for this problem @genius-fx ?? I also want to find the render time for every frame. I was thinking about using OnPreRender and OnPostRender, but you said you got weird results with it!?