Floating-point errors while assigning transforms
In my hierarchy, I have:
Main camera
Empty game object named "Level"
I've also created and attached a simple script to Level which is meant to load dungeon tile prefabs, generate a tilemap, and lay out the tiles in 3D space to make a seamless dungeon. The problem is, whenever I programmatically assign a position, rotation, or scale to any given tile, I get massive floating-point errors between 0.00001 and 1.3, which leaves the tiles visibly misaligned.
I have even tried explicitly casting these values to integers and checking in the debugger that they are indeed the expected values, and yet, after assigning them to a transform, they still show up slightly off in the inspector. However, when I run the script and then manually set these same values in the inspector, all the tiles line up perfectly and everything looks as expected. So the issue only occurs when assigning values to transforms from a script...
What is the cause of this madness, and what can be done to fix it?
What sort of values do you have for the positions to get such huge floating point errors?
To get floating point errors as large as 1.3, you've got to be generating huge maps. Perhaps scale everything down an order of magnitude?
Do I understand correctly that ints behave better? I wonder if rounding off the float values could help.
Can you show an example of the errors you mean where you state "massive floating-point errors between 0.00001 and 1.3"?
It's odd that it behaves better when the values are entered in the editor. But as long as Transform uses floats for position, feeding it an int won't make a difference, because it will be implicitly converted to a float, which again leads to floating-point errors.
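One caveat to the point above: small integers like 180 or 6 convert to 32-bit floats exactly, so the int-to-float conversion by itself cannot produce errors of this size. A quick sketch of why (in Python, using `struct` to emulate the single-precision floats Unity's Transform stores; the specific values are only illustrative):

```python
import struct

def to_float32(x):
    """Round-trip a value through a 32-bit float, as a Transform component stores it."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Small integers survive the conversion exactly...
assert to_float32(180) == 180.0
assert to_float32(6) == 6.0

# ...but integer precision runs out above 2**24, where consecutive
# integers can no longer be distinguished in a 32-bit float.
assert to_float32(2**24) == 16777216.0
assert to_float32(2**24 + 1) == 16777216.0  # rounds to nearest representable value
```

So at tile-map scales, an error as large as 1.3 is unlikely to come from how whole-number values are represented; it more plausibly comes from intermediate math, such as Euler-to-quaternion conversion of rotations.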
My tile map is a simple 5x5 set of wall and floor prefabs, each of which is no larger than 6 Unity-standard units on the X and Z axes. The largest value entered into the inspector was a local Y-rotation of 180.0, which, when explicitly set in code, shows up as 178.7 in the inspector. Oddly enough, when attempting to recreate the issue in a minimalistic new project, I couldn't reproduce such a large margin of error. If I can find a way to reproduce this behavior, I will post a new comment about it.
Answer by SpectreCular · Jan 23, 2015 at 07:11 AM
Since manually setting the transform in the inspector always seems to fix this issue, I have found a temporary solution for anyone else who might be looking for one...
In the project explorer, select any prefab (such as a wall)
Hold the Alt key, and drag the prefab into the same folder. This will create a copy.
Rename your copy using whatever naming convention works for you (I just did "wall_0", "wall_1", etc)
For "wall_0", set the Y-rotation to 0 in the inspector. Increment this value by 90 for "wall_1", then repeat this process until you have 4 prefabs, each with a Y-rotation 90 degrees higher than the previous copy.
Now you can instantiate your prefabs in any script, simply using whatever naming convention you chose, without having to worry about floating-point errors messing up your transforms. Unfortunately, this solution only accounts for rotations. If floating-point errors are visually messing up your translations, you will need to find some other solution.
I'd like to add that assigning integer literals or constants (NOT floats) to a transform also seems to correct the issue... But of course, doing this could limit how dynamic your code is. Sometimes, if it means gaining functionality, the trade-off isn't so bad.
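Another option, instead of duplicating prefabs per rotation, is to snap the computed values onto a grid just before assigning them to the transform. A minimal sketch of the idea (shown in Python for clarity; in Unity you would do the equivalent with `Mathf.Round` and assign the result via `transform.position` / `transform.localEulerAngles` — the `snap` helper name here is my own):

```python
def snap(value, step):
    """Round value to the nearest multiple of step
    (e.g. 90-degree rotations, 6-unit tile positions)."""
    return round(value / step) * step

# Rotations drifted by floating-point error snap back to clean right angles...
assert snap(178.7, 90) == 180
assert snap(89.99998, 90) == 90

# ...and tile positions snap back onto a 6-unit grid.
assert snap(11.99999, 6) == 12
assert snap(-6.00001, 6) == -6
```

This handles translations as well as rotations, so it covers the case the prefab-duplication workaround doesn't.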