How can I make Time.maximumDeltaTime unaffected by Time.timeScale?
I have a game in which I would like to be able to increase the playback rate up to 100x (Time.timeScale = 100). As expected, this results in much more simulated time elapsing per frame and thus larger values of Time.deltaTime.
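For context, here is roughly how I drive the speed-up (a minimal sketch; the component and field names are just illustrative):

    using UnityEngine;

    public class PlaybackSpeed : MonoBehaviour
    {
        // Illustrative field: the target playback rate, up to 100x.
        [Range(1f, 100f)]
        public float playbackRate = 100f;

        void Update()
        {
            // Speed up the whole simulation; Time.deltaTime scales with this.
            Time.timeScale = playbackRate;
        }
    }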
However, on machines that cannot handle the workload, I would like the simulation to slow down rather than take huge steps. This works fine when Time.timeScale is 1: frame times greater than Time.maximumDeltaTime get clamped, so the simulation simply runs slower. Unfortunately, Time.timeScale appears to be applied to Time.maximumDeltaTime as well, which limits its usefulness.
For example, with Time.maximumDeltaTime set to 0.1, a frame that takes 200 ms (0.2 s) has its delta clamped to 0.1 as expected. But with Time.maximumDeltaTime at 0.1 and Time.timeScale at 10, the same 200 ms frame is not clamped at all: the effective threshold becomes 0.1 × 10 = 1 s, which is greater than the 0.2 s that actually elapsed, so Time.deltaTime comes out as 2 s.
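Here is a minimal repro sketch of what I am describing (the class name is arbitrary, and the Thread.Sleep just simulates a frame that takes ~200 ms):

    using UnityEngine;

    public class DeltaClampRepro : MonoBehaviour
    {
        float lastReal;

        void Start()
        {
            Time.maximumDeltaTime = 0.1f;
            Time.timeScale = 10f;
            lastReal = Time.realtimeSinceStartup;
        }

        void Update()
        {
            float now = Time.realtimeSinceStartup;
            // With timeScale = 1 this logs deltaTime around 0.1 (clamped).
            // With timeScale = 10 it logs deltaTime around 2.0 (unclamped),
            // because the 0.1 cap is effectively scaled to 1 s before being
            // compared against the ~0.2 s that really elapsed.
            Debug.Log("real: " + (now - lastReal) + "  deltaTime: " + Time.deltaTime);
            lastReal = now;
            System.Threading.Thread.Sleep(200); // simulate a heavy frame
        }
    }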
How can I get Time.deltaTime to be clamped properly when using Time.timeScale?