Why is Time.captureFramerate an integer value?
It would be really nice to have a floating-point equivalent of Time.captureFramerate, so I could force external scripts and Unity code to run at 23.976, 29.97 or 59.94 Hz. As it stands, I have to set it to 24, 30 or 60, so there is a timing difference between the prepared values and the output when the result is synchronized to NTSC formats. For example, an animation that is authored to be exactly one hour long ends up 3.6 seconds longer. That is only a 0.1% difference, but professional customers may question the precision of the system or lose confidence in general. Using Time.timeScale is not an option if Unity's flexibility is to be retained, because scripts from the Asset Store or written by customers may use that value as well and create conflicts.
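To illustrate, here is a minimal sketch of what the integer-only property forces today and the drift it produces; the class name CaptureRateDemo and the log message are only illustrative:

```csharp
using UnityEngine;

// Sketch of the problem: Time.captureFramerate only accepts integers,
// so the NTSC rate 29.97 Hz (30000/1001) has to be rounded up to 30.
public class CaptureRateDemo : MonoBehaviour
{
    void Start()
    {
        // Desired: Time.captureFramerate = 29.97f;  // not possible, property is int
        Time.captureFramerate = 30;                  // forced rounding

        // Drift when the captured frames are played back at 29.97 Hz:
        // one hour at 30 fps = 108000 frames;
        // 108000 / (30000/1001) ≈ 3603.6 s, i.e. about 3.6 s (0.1%) too long.
        float frames = 3600f * 30f;
        float playbackSeconds = frames / (30000f / 1001f);
        Debug.Log($"Playback length: {playbackSeconds:F1} s instead of 3600 s");
    }
}
```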
Why was Time.captureFramerate designed as an integer in the first place? Would it be possible to add a floating-point alternative?