float error while incrementing
Hi! I have two floats in my script; let's say they're named like this:
private float fl1; private float fl2;
In my Start function I assign them the following values:
fl1 = 1f; fl2 = 0.01f;
Every time I get some sort of input (let's say I click my mouse button), fl2 is added to fl1. I also display the value of fl1 in a text field in my UI.
if (Input.GetMouseButtonDown(0)) {
    fl1 += fl2;
    mytextfield.text = fl1.ToString();
}
And this works fine, the way I want it to: it shows me values like 1.01, 1.32, 1.45... you know, increments of 0.01f.
However, the moment the value of fl1 exceeds the 1.53 mark, I get results like 1.54999 and 1.73997. It does that every single time, and it's always after 1.53.
How does that happen? I don't understand why I get such random-looking numbers if I'm only incrementing my float by 0.01f.
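If it helps, the drift can be reproduced outside Unity with nothing but a loop (a minimal sketch of what I'm describing, assuming plain C# rather than my actual script):

```csharp
using System;

class Repro
{
    static void Main()
    {
        // Repeatedly add 0.01f, just like my mouse-click handler does.
        float f = 1f;
        for (int i = 0; i < 60; i++)
        {
            f += 0.01f;
        }
        // The "R" (round-trip) format shows the accumulated error
        // that a plain ToString() can hide.
        Console.WriteLine(f.ToString("R"));
    }
}
```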
Any thoughts?
Thanks in advance Gilles.
Answer by Larry-Dietz · Nov 26, 2019 at 10:03 PM
This is due to rounding error in floating-point arithmetic: 0.01 has no exact binary representation, so a tiny error is added on every increment and eventually becomes visible. I had a similar issue a while back.
One workaround, if this small error is causing you problems, is to use doubles instead of floats. You can then cast the resulting double back to a float, and it should give you the expected answer.
Here is the code from a quick test I just did for this...
// Accumulate in doubles; they carry far more precision than floats.
double f1 = 1, f2 = 0.01;

public void Test()
{
    f1 += f2;
    Debug.Log(f1);        // the double value
    Debug.Log((float)f1); // cast back to float for display
}
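A further note (my own addition, not part of Larry's original answer): since the value only ever changes in exact steps of 0.01, you can sidestep floating-point drift entirely by counting the steps in an int and deriving the float only for display. A minimal sketch, with hypothetical names:

```csharp
public class StepCounter
{
    // Count whole hundredths in an integer; integer addition has no rounding error.
    private int hundredths = 100; // starting value 1.00

    public void Increment()
    {
        hundredths += 1; // one click = one step of 0.01
    }

    // Derive the display value on demand; a single division rounds once,
    // instead of accumulating a little error on every click.
    public float Value
    {
        get { return hundredths / 100f; }
    }
}
```

In the click handler this becomes `counter.Increment(); mytextfield.text = counter.Value.ToString("F2");`, where "F2" formats to two decimal places.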
Hope this helps, -Larry