Change the timer resolution of Windows
The timer resolution of Windows is 15.6 ms by default. You can check the current value with a tool such as Sysinternals ClockRes. When I open the Unity3D editor, the timer resolution changes to 10 ms, and when I run the game (exe) built with Unity3D, it changes to 1 ms!
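For reference, here is a minimal sketch of reading the current resolution programmatically, via the undocumented NtQueryTimerResolution export from ntdll (roughly what tools like ClockRes report). All values are in 100 ns units:

```cpp
// Sketch: query the current system timer resolution via ntdll.
// NtQueryTimerResolution is undocumented but stable; values are 100 ns units.
#include <windows.h>
#include <cstdio>

typedef LONG(NTAPI* NtQueryTimerResolution_t)(PULONG Min, PULONG Max, PULONG Cur);

int main() {
    HMODULE ntdll = GetModuleHandleW(L"ntdll.dll");
    auto query = (NtQueryTimerResolution_t)
        GetProcAddress(ntdll, "NtQueryTimerResolution");
    ULONG min = 0, max = 0, cur = 0;
    if (query && query(&min, &max, &cur) == 0) {
        // The default 15.6 ms shows up here as ~156250.
        printf("current timer resolution: %.2f ms\n", cur / 10000.0);
    }
    return 0;
}
```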
I'm making an online game where the server and client are supposed to run at the same FPS. The server is not built with Unity3D, and its timer resolution is 15.6 ms. The server uses GetSystemTimeAsFileTime and Sleep to lock the frame rate; the client uses InvokeRepeating. Because the client runs at a finer timer resolution than the server, the server's time measurements and sleeps have larger error than the client's.
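To make the problem concrete, here is a hypothetical sketch of a frame lock in that style (FRAME_MS, NowMs, and RunFrames are my own names, not from my actual code). The key point is in the comment on Sleep:

```cpp
// Sketch of a server frame lock: measure elapsed time with
// GetSystemTimeAsFileTime, then Sleep off the rest of the frame budget.
#include <windows.h>

const DWORD FRAME_MS = 33;  // assumed frame budget (~30 FPS)

ULONGLONG NowMs() {
    FILETIME ft;
    GetSystemTimeAsFileTime(&ft);       // 100 ns units since 1601
    ULARGE_INTEGER u;
    u.LowPart  = ft.dwLowDateTime;
    u.HighPart = ft.dwHighDateTime;
    return u.QuadPart / 10000;          // convert to milliseconds
}

void RunFrames() {
    for (;;) {
        ULONGLONG start = NowMs();
        // Update();  // one frame of game logic
        ULONGLONG elapsed = NowMs() - start;
        if (elapsed < FRAME_MS)
            Sleep((DWORD)(FRAME_MS - elapsed));
        // Sleep wakes on a timer tick, so it rounds UP to the timer
        // resolution: at 15.6 ms resolution, Sleep(5) can take ~15 ms,
        // while at 1 ms resolution it takes ~5 ms.
    }
}
```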
I find that the server runs slower than the client, probably because Sleep on the server oversleeps. When I change the server's timer resolution to match the client's, they run at the same speed. But I'm hesitant to actually change the server's timer resolution, since I'm not sure what side effects that would have.
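For completeness, the documented way to request a finer resolution is timeBeginPeriod from winmm, paired with timeEndPeriod. A minimal sketch of what I tried (assuming a 1 ms request, which I believe is what Unity-built players effectively do):

```cpp
// Sketch: request 1 ms timer resolution for the server process.
#include <windows.h>
#include <timeapi.h>
#pragma comment(lib, "winmm.lib")

int main() {
    timeBeginPeriod(1);  // request 1 ms resolution (system-wide effect)

    // ... run the server frame loop here ...

    timeEndPeriod(1);    // restore; calls must be balanced
    return 0;
}
```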
Question 1: How can I change the timer resolution myself, given that Unity3D already changes it?
Question 2: Why does the server run slower than the client?
Sorry for my poor English, but I think you understand :)