How do I get the render thread (GPU) time in ms from a C# script?
I am trying to create an adaptive resolution system that renders at the highest resolution it can get away with while keeping the target frame rate. At the moment I am using Time.deltaTime to get the overall frame rate, but if the program is CPU-bound rather than GPU-bound, it still downgrades the resolution to try to increase the frame rate, even though the GPU could easily support a higher resolution without affecting it.
Is it possible to get the render thread time in ms via a C# script, i.e. the value shown in the editor when viewing the Stats window?
Any help would be greatly appreciated.
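A sketch of one possible approach, assuming a Unity version that ships FrameTimingManager (2017.3 or later) and a platform whose driver supports GPU timing queries — on some platforms "Frame Timing Stats" must be enabled in Player Settings, and gpuFrameTime may report 0 where timing queries are unsupported:

```csharp
using UnityEngine;

// Hedged sketch: read CPU and GPU frame times from FrameTimingManager
// so the adaptive-resolution logic can tell whether the frame is
// GPU-bound (lowering resolution helps) or CPU-bound (it won't).
public class GpuFrameTimeProbe : MonoBehaviour
{
    readonly FrameTiming[] timings = new FrameTiming[1];

    void Update()
    {
        FrameTimingManager.CaptureFrameTimings();
        uint count = FrameTimingManager.GetLatestTimings(1, timings);
        if (count > 0)
        {
            double gpuMs = timings[0].gpuFrameTime; // GPU frame time in ms
            double cpuMs = timings[0].cpuFrameTime; // CPU frame time in ms
            // Only drop resolution when gpuMs dominates cpuMs;
            // otherwise the GPU is not the bottleneck.
        }
    }
}
```

Note the timings arrive with a few frames of latency, so it may be worth smoothing them over several frames before driving resolution changes.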