How do I get the GPU Rendering Time of a Frame?
All I want to know is how long it took the GPU to render a given frame. That's it. I know that Unity has an API for this; I've seen the property in the class definition: UnityEditor.Profiling.HierarchyFrameDataView.frameGpuTimeMs
What I don't know is: how do I get my hands on an instance of HierarchyFrameDataView that contains this one piece of information? The only thing I need is a float (or double) that tells me, within my own code, how long the GPU took to render the previous frame. Not the UI. Not the full-blown profiler tool. Just within my own code.
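Edit: the closest thing I've found so far is UnityEditorInternal.ProfilerDriver, which can hand out a HierarchyFrameDataView for a recorded frame. A minimal, editor-only sketch, assuming Unity 2019.3+ and that the Profiler is actively recording; frameGpuTimeMs reads as 0 when GPU profiling isn't available on the platform/graphics API:

```csharp
// Editor-only: pulls frameGpuTimeMs out of the Profiler's recorded data.
// Assumes the Profiler is recording (ProfilerDriver.enabled = true).
using UnityEditor.Profiling;
using UnityEditorInternal;

public static class GpuFrameTime
{
    // Returns the GPU time (ms) of the most recently profiled frame,
    // or -1 if no valid frame data is available yet.
    public static float GetLastFrameGpuTimeMs()
    {
        int frame = ProfilerDriver.lastFrameIndex;
        if (frame < 0)
            return -1f;

        using (HierarchyFrameDataView view = ProfilerDriver.GetHierarchyFrameDataView(
                   frame,
                   0, // main thread
                   HierarchyFrameDataView.ViewModes.Default,
                   HierarchyFrameDataView.columnDontSort,
                   false))
        {
            return view.valid ? view.frameGpuTimeMs : -1f;
        }
    }
}
```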
I would rather use FrameTimingManager, but it flat-out doesn't work on Windows platforms.
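For completeness, this is roughly what the FrameTimingManager path looks like where it does work (it needs "Frame Timing Stats" enabled in Player Settings, and on unsupported platforms the timings simply come back empty or zeroed):

```csharp
// Sketch of the FrameTimingManager API shape; gpuFrameTime comes back 0
// on platforms where the feature isn't implemented.
using UnityEngine;

public class FrameTimingProbe : MonoBehaviour
{
    readonly FrameTiming[] _timings = new FrameTiming[1];

    void Update()
    {
        FrameTimingManager.CaptureFrameTimings();
        if (FrameTimingManager.GetLatestTimings(1, _timings) > 0)
        {
            Debug.Log($"GPU frame time: {_timings[0].gpuFrameTime} ms");
        }
    }
}
```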
Time.deltaTime is the time taken for each frame to render. Is that what you wanted? @fct509
Time.deltaTime is the total time between two consecutive frames. I'm not looking for how long it's been since the last frame; I'm looking for how much time the GPU, and ONLY the GPU, spent on the last frame.
Answer by sacredgeometry · Dec 05, 2020 at 12:06 AM
Just saw you didn't want to use the Profiler, so I guess you could time the difference between the LateUpdate call and the OnDrawGizmos one (rough sketch below).
Here are the lifecycle events: https://docs.unity3d.com/Manual/ExecutionOrder.html
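A rough sketch of that idea with a Stopwatch. Keep in mind OnDrawGizmos only fires in the Editor with gizmos enabled, and this span includes CPU-side render work, not pure GPU time:

```csharp
// Rough sketch: wall-clock time from LateUpdate to OnDrawGizmos.
// OnDrawGizmos only fires in the Editor with gizmos enabled, and this
// span also includes CPU-side render work (culling etc.).
using System.Diagnostics;
using UnityEngine;

public class RenderSpanTimer : MonoBehaviour
{
    readonly Stopwatch _watch = new Stopwatch();

    void LateUpdate()
    {
        _watch.Restart();
    }

    void OnDrawGizmos()
    {
        if (_watch.IsRunning)
        {
            _watch.Stop();
            UnityEngine.Debug.Log(
                $"LateUpdate -> OnDrawGizmos: {_watch.Elapsed.TotalMilliseconds:F2} ms");
        }
    }
}
```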
I'd be OK with using the Profiler's API if it were better documented. I'm not 100% sure, but I think the stats pop-up in the Game View window gets the GPU Rendering Time from the Profiler's API. The client just wants the GPU Rendering Time written to a log file. Also, when I say GPU Rendering Time, I mean GPU Rendering Time: using the time between LateUpdate and OnDrawGizmos doesn't work, because object culling (along with some other work) is done on the CPU. Giving one component a very late execution order to grab the current time in LateUpdate, and another component a very early execution order to grab the current time in OnDrawGizmos, would get me closer than deltaTime (which may or may not be scaled).
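If the goal is just getting GPU time per frame into a log file from my own code, UnityEngine.Profiling.Recorder might be worth a try: on graphics APIs that support GPU recorders, it can report GPU time for a given sampler. A hedged sketch; the "Camera.Render" sampler name and the log path are assumptions to verify on your setup:

```csharp
// Sketch: log per-frame GPU time via UnityEngine.Profiling.Recorder.
// Only works where GPU recorders are supported; check gpuSampleBlockCount.
using System.IO;
using UnityEngine;
using UnityEngine.Profiling;

public class GpuTimeLogger : MonoBehaviour
{
    Recorder _gpuRecorder;
    StreamWriter _log;

    void Start()
    {
        _gpuRecorder = Recorder.Get("Camera.Render"); // assumed sampler name
        _gpuRecorder.enabled = true;
        _log = new StreamWriter(
            Path.Combine(Application.persistentDataPath, "gpu_times.log"));
    }

    void Update()
    {
        // gpuElapsedNanoseconds holds the previous frame's GPU time for this
        // sampler, but only when gpuSampleBlockCount > 0 (GPU profiling supported).
        if (_gpuRecorder.isValid && _gpuRecorder.gpuSampleBlockCount > 0)
        {
            double gpuMs = _gpuRecorder.gpuElapsedNanoseconds / 1e6;
            _log.WriteLine($"{Time.frameCount},{gpuMs:F3}");
        }
    }

    void OnDestroy()
    {
        _log?.Dispose();
    }
}
```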