ScreenToWorldPoint and Vector3.Distance give different values
In my FPS, I have a GameObject which I place in the game world using ScreenToWorldPoint. The z parameter I pass to this function is 20. The docs state this z position is "in world units from the camera".
Next (without moving the camera/player), I calculate the distance between the camera and this object using Vector3.Distance. This distance turns out to be greater than 20, and it also depends on the camera settings: at a FOV of 65 I mostly get values between 20 and 25, while at a FOV of 130 I get values between 60 and 70.
What am I missing here? Is there a way to actually place the object 20 units from the camera, independent of the FOV?
Thanks in advance!
Answer by Bunny83 · Nov 26, 2017 at 08:09 PM
It is the distance from the camera in "parallel space": the resulting point lies in a frustum plane parallel to the screen, not on a spherical surface around the camera. So the Euclidean distance between the camera origin and the resulting point grows the further the point is from the screen center.
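To see the effect numerically, here is a small standalone sketch (plain Python, not Unity code) that reproduces the geometry. The viewport coordinates (u, v), the aspect ratio, and the function name are my own assumptions for illustration; the point is just that an off-center point at forward depth z is farther than z from the camera:

```python
import math

def world_point_distance(fov_deg, aspect, u, v, z):
    """Euclidean camera-to-point distance for an unprojected screen
    position (u, v), each in [-1, 1], at forward depth z.

    z is measured along the camera's forward axis ("parallel space"),
    so any lateral offset makes the Euclidean distance exceed z.
    """
    half_v = math.radians(fov_deg) / 2       # vertical half-FOV
    y = z * math.tan(half_v) * v             # vertical world-space offset
    x = z * math.tan(half_v) * aspect * u    # horizontal world-space offset
    return math.sqrt(x * x + y * y + z * z)

# Screen center: depth and distance agree.
print(world_point_distance(65, 16 / 9, 0, 0, 20))   # → 20.0

# Screen corner at FOV 65: noticeably more than 20.
print(world_point_distance(65, 16 / 9, 1, 1, 20))

# Screen corner at FOV 130: far more than 20.
print(world_point_distance(130, 16 / 9, 1, 1, 20))
```

The exact numbers depend on the aspect ratio and where on the screen the point sits, but the pattern matches the question: the wider the FOV and the further off-center the point, the larger the measured distance.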
If you want to place an object on the spherical surface of radius 20 around the camera origin, you just need to renormalize the relative vector after unprojection. However, an easier solution is to use ScreenPointToRay and call GetPoint with the desired distance.
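The renormalization step above can be sketched in plain Python (the vectors and the helper name are illustrative; in Unity this is the equivalent of `cam.transform.position + (worldPoint - cam.transform.position).normalized * 20f`):

```python
import math

def place_at_distance(cam_pos, world_point, distance):
    """Move the unprojected point onto the sphere of the given
    radius around the camera, keeping its direction."""
    rel = [w - c for w, c in zip(world_point, cam_pos)]
    length = math.sqrt(sum(d * d for d in rel))
    return [c + d / length * distance for c, d in zip(cam_pos, rel)]

cam = [0.0, 1.0, 0.0]
# A made-up point as ScreenToWorldPoint might return it with z = 20,
# whose actual distance from the camera exceeds 20 (it is off-center):
p = [10.0, 6.0, 20.0]

placed = place_at_distance(cam, p, 20.0)
dist = math.sqrt(sum((a - b) ** 2 for a, b in zip(placed, cam)))
print(dist)  # ≈ 20.0
```

After renormalizing, Vector3.Distance-style measurement gives the requested 20 regardless of where on the screen the point was.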
This is right. Another way to look at the unexpected results: (I'm guessing) ScreenToWorldPoint internally just multiplies the screen point by a screen-to-world matrix, and both the screen-space and the world-space coordinates carry a depth component.
Thanks! I tried ScreenPointToRay in combination with GetPoint, and it works now! Note: there's still a difference from the distance calculated using Vector3.Distance, but it's a minor difference I can live with (values at 130 FOV are around 20.1, at 30 FOV around 20.03). I wonder why that is... any ideas?
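One possible contributor to that small remaining offset (an assumption, not confirmed in this thread): the ray returned by ScreenPointToRay starts on the camera's near clip plane rather than at the camera's position, so GetPoint(20) lands 20 units from the near plane. The camera-to-ray-origin gap grows toward the screen edges and with the FOV. A rough plain-Python sketch of that gap, with a made-up near plane value:

```python
import math

def near_plane_offset(fov_deg, near, off_axis_frac):
    """Distance from the camera position to a ray origin on the
    near clip plane, for a ray tilted by the given fraction of
    the half-FOV off the camera's forward axis."""
    theta = math.radians(fov_deg) / 2 * off_axis_frac
    return near / math.cos(theta)

# At the horizontal edge of the view, the offset grows with the FOV:
print(near_plane_offset(130, 0.3, 1.0))  # largest at wide FOV
print(near_plane_offset(30, 0.3, 1.0))   # close to the near value
```

The actual numbers depend on the camera's near plane setting and where on the screen the point is, so this may only explain part of the 20.1 / 20.03 readings.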