How to zoom camera in x amount of seconds
I'm trying to zoom in a camera by decreasing its field of view, and to make the zoom smooth I'm decreasing it in many small steps. Here is my current code:
public static IEnumerator ZoomCamera(Camera cam, float goal, float time)
{
    var watch = System.Diagnostics.Stopwatch.StartNew();
    float spliter = time / 0.1f; // camera zooms in a bit every 0.1 seconds
    float difference = goal - cam.fieldOfView;
    for (int i = 0; i < spliter; i++)
    {
        yield return new WaitForSeconds(time / spliter);
        cam.fieldOfView += difference / spliter;
    }
    watch.Stop();
    float elapsedMs = watch.ElapsedMilliseconds;
    Debug.Log("Seconds Wanted: " + time);
    Debug.Log("Seconds Took: " + elapsedMs / 1000);
}
The Debug.Log statements show that the time taken is longer than the time I wanted (probably because the code itself takes time to execute). I saw a few code examples that used Mathf.Lerp, but I didn't understand how to implement it.
Is there a way to make this process take exactly the amount of time I want?
Thanks in advance,
Answer by JavierRuidoRosa · Jul 11, 2018 at 10:27 AM
There is no way you'll have an EXACT timer (it even depends on the machine the code is running on), but you can have something that is very precise:
public IEnumerator ZoomCamera(Camera cam, float goal, float time)
{
    var watch = System.Diagnostics.Stopwatch.StartNew();
    float start = cam.fieldOfView;
    float timer = 0f;
    while (timer < time)
    {
        timer += Time.deltaTime;
        // Mathf.Lerp clamps timer / time to [0, 1], so the last frame
        // cannot overshoot the goal even if the timer runs past `time`.
        cam.fieldOfView = Mathf.Lerp(start, goal, timer / time);
        yield return null; // resume on the next frame
    }
    cam.fieldOfView = goal; // guarantee we end exactly on the target value
    watch.Stop();
    float elapsedMs = watch.ElapsedMilliseconds;
    Debug.Log("Seconds Wanted: " + time);
    Debug.Log("Seconds Took: " + elapsedMs / 1000);
}
Here you can see how to do it using Mathf.Lerp. I've compared it to your method and measured the time of each one on various occasions (let's call this one mine and the one you posted yours):
FieldOfView from 60 to 30 in 2 seconds: mine got 2.006 average and yours got 2.03 average.
FieldOfView from 60 to 30 in 10 seconds: mine got 10.005 average and yours got 10.15 average.
FieldOfView from 90 to 30 in 60 seconds: mine got 60.002 average and yours got 60.67 average.
I assume the problem in your code is the WaitForSeconds instruction, but it's only a suspicion.
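WaitForSeconds only guarantees that at least the requested time has passed before the coroutine resumes, and it can only resume on a frame boundary, so every wait overshoots by up to one frame and the error adds up across iterations. Here's a minimal sketch that makes the accumulated overshoot visible (WaitOvershootDemo is just a throwaway test component of mine, not part of your code):
using System.Collections;
using UnityEngine;

public class WaitOvershootDemo : MonoBehaviour
{
    IEnumerator Start()
    {
        var watch = System.Diagnostics.Stopwatch.StartNew();
        // 20 waits of 0.1 s should nominally take 2 s in total...
        for (int i = 0; i < 20; i++)
            yield return new WaitForSeconds(0.1f);
        watch.Stop();
        // ...but each wait resumes on the first frame *after* 0.1 s has
        // elapsed, so this typically prints a bit more than 2 s.
        Debug.Log("Elapsed: " + watch.ElapsedMilliseconds / 1000f + " s");
    }
}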
I hope this helps!
Thanks a lot for taking the time to write the code, it works just how I wanted it to. It's still a bit inaccurate for very short transitions:
from 60 to 50 in 0.1 seconds: 0.115
from 60 to 50 in 0.05 seconds: 0.066
There's a point where the frame update time is just too slow to give us an ultra-precise timer (even 120 FPS can be too slow if we want a 0.01-second timer), but at least we can have a great approximation, imperceptible to any human ;)
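To put rough numbers on that (an illustration of mine, not a measurement from this thread): the coroutine advances once per rendered frame, so its timing resolution is one frame.
using UnityEngine;

public static class FrameMath
{
    // Hypothetical helper: the duration a per-frame coroutine timer can
    // actually achieve for a requested duration at a given frame rate.
    public static float ActualDuration(float fps, float requested)
    {
        float frameTime = 1f / fps;                          // 1/120 ≈ 0.0083 s
        int frames = Mathf.CeilToInt(requested / frameTime); // 0.01 s needs 2 frames
        return frames * frameTime;                           // 2 * 0.0083 ≈ 0.0167 s
    }
}
So a requested 0.01 s zoom at 120 FPS actually takes about 0.0167 s, a 67% overshoot, which is the same kind of error you measured for very short transitions.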