Using Time.time to test the performance of scripts
I'd like to do some heavy optimization of the critical scripts we run in our game. One thing I did to test performance was to take a timestamp at the beginning and end of an Update() function, accumulate the difference, and print it every 100 calls. Here's the code:
float debugTimeStart;
float debugTimeTaken;
int debugTimeCount;

void Update()
{
    debugTimeStart = Time.time;
    // here's the stuff going on
    debugTimeTaken += Time.time - debugTimeStart;
    debugTimeCount++;
    if (debugTimeCount == 100)
    {
        Debug.Log("Time: " + debugTimeTaken);
        debugTimeCount = 0;
        debugTimeTaken = 0;
    }
}
The problem is that I always get 0 as the result. Is there something more precise than Time.time?
Thanks a lot!
Answer by duck · Dec 12, 2009 at 08:15 PM
Yes. Time.time and the other Time properties (.deltaTime, .fixedDeltaTime) hold values that are only updated once per frame, so the difference within a single Update() call is always zero.
There are a few other methods of timing which should give you what you need, such as:
- Environment.TickCount (An integer, in milliseconds)
- DateTime.Now (A DateTime object, which has many useful properties)
- System.Diagnostics.Stopwatch (Provides a class with methods for measuring time)
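Of the three, Stopwatch is usually the most convenient for this kind of measurement, since it is designed for timing and offers sub-millisecond resolution. A minimal sketch in plain .NET (the loop is a hypothetical stand-in for the work being profiled):

```csharp
using System;
using System.Diagnostics;

class StopwatchDemo
{
    static void Main()
    {
        // Start timing; StartNew() creates and starts the watch in one call.
        var sw = Stopwatch.StartNew();

        // Hypothetical workload standing in for the Update() body being profiled.
        long sum = 0;
        for (int i = 0; i < 1000000; i++) sum += i;

        sw.Stop();

        // Elapsed.TotalMilliseconds gives fractional milliseconds,
        // unlike Time.time, which only changes between frames.
        Console.WriteLine("Elapsed ms: " + sw.Elapsed.TotalMilliseconds);
        Console.WriteLine("Sum: " + sum);
    }
}
```

In an Update() loop you would keep the Stopwatch as a field, call Restart() at the top of Update(), and read Elapsed at the bottom.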
Answer by TowerOfBricks · Nov 15, 2010 at 08:40 PM
Time.realtimeSinceStartup also works great, I use that for all my profiling.
If this is also accurate, then this should be the best answer, since it doesn't require using System etc.
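Applied to the original question, the asker's Update() could be rewritten with Time.realtimeSinceStartup instead of Time.time. This is a sketch of that change (it assumes the same field names as the question and must run inside a Unity MonoBehaviour, so it is not runnable standalone):

    float debugTimeTaken;
    int debugTimeCount;

    void Update()
    {
        // Unlike Time.time, realtimeSinceStartup advances within a frame.
        float debugTimeStart = Time.realtimeSinceStartup;

        // here's the stuff going on

        debugTimeTaken += Time.realtimeSinceStartup - debugTimeStart;
        debugTimeCount++;
        if (debugTimeCount == 100)
        {
            Debug.Log("Time: " + debugTimeTaken);
            debugTimeCount = 0;
            debugTimeTaken = 0;
        }
    }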
Answer by Lance Sun · Dec 12, 2009 at 04:58 PM
Try System.Diagnostics.Stopwatch
Is System.Diagnostics.Stopwatch part of .NET 2.0? I'm using Unity iPhone, and it doesn't seem to work. Thanks!