Delta time not accurate enough.. What should I do?
I'm having problems with syncing time with millisecond accuracy.
Here's what I did:
I started a game on my first computer with the time read out displaying.
I started another game on my other computer, which is sitting right next to it, with its time also displaying.
I took a picture of both screens with my digital camera to see if they were close to the same time. They were very close.
I waited about two minutes and took a picture again. This time they had drifted off from each other by about 3 tenths of a second.
Something tells me I need a more accurate delta than Time.deltaTime.
Anyone have any suggestions?
I suppose I should be using doubles and not floats... However, Time.deltaTime is a float.
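For example, something like this is what I have in mind, just a rough idea I haven't tested (the DoubleClock name is made up):

using UnityEngine;

public class DoubleClock : MonoBehaviour
{
    double elapsed; // seconds since this component was enabled

    void Update()
    {
        // Time.deltaTime is a float, but summing it into a double keeps
        // the rounding error from building up as quickly over long sessions.
        elapsed += Time.deltaTime;
    }
}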
Answer by Berenger · Feb 08, 2012 at 11:28 PM
Try System.DateTime.UtcNow, TimeSpan.TotalSeconds and those kinds of things.
No idea what you are talking about.. I looked those up in the script reference and found nothing.
Thanks for trying anyways...
To be honest I've never used it myself. I found it in Aron Granberg's AStar code, where these lines profile how long his functions take (the commented lines are the Time.realtimeSinceStartup alternative):
//float startTime = Time.realtimeSinceStartup;                     // alternative
System.DateTime startTime = System.DateTime.UtcNow;

// ... code and stuff to profile ...

//lastScanTime = Time.realtimeSinceStartup - startTime;            // alternative
lastScanTime = (float)(System.DateTime.UtcNow - startTime).TotalSeconds;
Debug.Log("Scanning - Process took " + (lastScanTime * 1000).ToString("0") + " ms to complete");
Answer by by0log1c · Feb 09, 2012 at 04:15 AM
You have misunderstood what Time.deltaTime is. It represents the time elapsed since the last frame. It depends on countless factors (software, hardware, other programs running, etc.) and is in no way something two Unity instances will keep in sync.
You want to use Time.time, which represents the time elapsed since the start of the application.
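Roughly, the difference looks like this (untested sketch; the script name is made up):

using UnityEngine;

public class TimeReadout : MonoBehaviour
{
    float summedDeltas; // what the question is doing: adding up frame deltas

    void Update()
    {
        summedDeltas += Time.deltaTime;

        // Time.time is the engine's own clock (the time at the start of this frame);
        // comparing it to the manual sum shows any drift from float rounding.
        Debug.Log("Sum of deltas: " + summedDeltas + "  Time.time: " + Time.time);
    }
}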
What Berenger proposes is part of the base .NET/Mono framework on which Unity runs. You can find more info about it on MSDN. Note that it can also be used from JS (UnityScript), with the correct syntax.
Just thinking out loud: I can't think of a way to start a process on both computers with millisecond accuracy without 'manually' syncing them once they're running. Either you're pressing buttons or sending calls, both of which will differ no matter what, won't they?
EDIT: oh yeah, you could have something monitor time closely and launch the process for you, duh.
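For the 'manual' sync itself, the usual trick is an NTP-style exchange. Very rough sketch; all the networking is left out and GetServerClock is purely hypothetical:

using UnityEngine;

public class ClockSync : MonoBehaviour
{
    public double offset; // add this to the local clock to approximate the other machine's clock

    // Hypothetical: ask the other machine for its current clock value, in seconds.
    double GetServerClock() { return 0d; }

    public void Sync()
    {
        double sent = Time.realtimeSinceStartup;      // local time when the request goes out
        double server = GetServerClock();             // the other clock when it answered
        double received = Time.realtimeSinceStartup;  // local time when the reply arrives

        // Assume the network delay is symmetric: the reply describes the moment
        // halfway through the round trip.
        double roundTrip = received - sent;
        offset = server + roundTrip / 2d - received;
    }

    public double SyncedTime()
    {
        return Time.realtimeSinceStartup + offset;
    }
}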
Well, actually, it is synchronization via the network. I finally figured out that Time.time is what I need, as stated in my previous answer. I should have marked it as correct, but who am I to say my answer IS correct? I have no idea yet. I'll only find out in a couple of days when I finish a bunch of timing logic in my game. That MSDN link you gave me looks interesting though; I will have to check it out.
Thanks!
Answer by demented_hedgehog · Feb 22, 2015 at 05:34 AM
Time.time is "the time at the beginning of this frame (Read Only)", so it's no good if you want to time something within a frame.
(Note: Environment.GetTickCount doesn't appear to exist in Unity; the closest .NET member is System.Environment.TickCount.)
Possibly try Time.realtimeSinceStartup?
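Something like this sketch (untested, names made up) is what I mean by timing within a frame:

using UnityEngine;

public class IntraFrameTimer : MonoBehaviour
{
    void Update()
    {
        // Time.realtimeSinceStartup keeps advancing within a frame,
        // unlike Time.time, which is frozen at the frame's start.
        float before = Time.realtimeSinceStartup;

        for (int i = 0; i < 100000; i++) { } // dummy work to measure

        float elapsed = Time.realtimeSinceStartup - before;
        Debug.Log("Work took " + (elapsed * 1000f).ToString("0.00") + " ms");
    }
}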
Answer by Rush3fan · Feb 09, 2012 at 04:04 AM
I think Time.time is my closest bet. I still need to work on some other problems with my synchronization before I start worrying about this again.
Thanks anyway Berenger.