Using WorldToViewportPoint to find an object off screen.
Hello!
First of all thank you to everyone in this community. Although I have not yet posted, I have gathered a lot of information from here. I unfortunately have hit my first big wall in the development of my game, and hope the community can help me overcome it.
I am making a 3D space shooter game in the vein of FreeSpace. I have a targeting system implemented, and when the target is off screen, I would like there to be an arrow on the side, top, or bottom of the screen pointing in the direction the player must turn in order to face his target.
The mathematics of this is quite mind-boggling to me, unfortunately. I have only a limited understanding of WorldToViewportPoint to begin with, and there is not much documentation or discussion on it. I do know how to tell if an object is on screen, but how the actual values of x, y, and z are calculated remains a mystery to me. X and Y can take some pretty bizarre values, and I know Z has something to do with the distance to the target, but I can't put it all together.
If someone can help me get started, I would very much appreciate it.
Thanks!
Did you get anywhere with this? I've managed to get exactly this working, but once the object is too far offscreen or behind you I start getting strange results. Thanks.
I unfortunately did not solve this problem. After further research, it seems WorldToViewportPoint is not optimal for this type of situation. The best alternative I found was to spawn several invisible planes and raycast from the player to the target, then somehow map the raycast hit to the edge of the screen. It's above my head, unfortunately, but might be worth making a new topic about.
Answer by syclamoth · Mar 21, 2012 at 12:22 AM
For WorldToViewportPoint, the resultant x and y values are the position on the screen that the 'worldPoint' would be rendered at from the chosen camera. If the point is outside the viewed area, those values will be less than 0 or greater than 1, and as the angle from the camera's forward axis approaches 90 degrees, the values approach positive or negative infinity.
When the 'worldPoint' goes behind the camera, the x and y values begin to come back down again, and that is where the z coordinate comes in: it gives the depth of the object along the camera's forward vector, so if the object is 10 units away, z will be 10. When an object is behind the camera, the z coordinate will be negative. This lets you know that the viewport position of an object should still be treated as offscreen even when the x and y coordinates appear to be within the bounds of the viewport.
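A simplified model may make those numbers less mysterious. The sketch below is a toy reconstruction in Python, not Unity's actual implementation: it assumes a camera fixed at the origin looking down +z, with an assumed 60-degree vertical fov and 16:9 aspect (the real function also applies the camera's rotation and position). It reproduces the perspective divide that gives viewport x, y, and z the behavior described above.

```python
import math

def world_to_viewport(point, fov_y_deg=60.0, aspect=16.0/9.0):
    """Toy WorldToViewportPoint: camera at the origin looking down +z,
    no rotation or translation. x and y land in [0, 1] for on-screen
    points; z is the depth along the camera's forward axis."""
    x, y, z = point
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)  # view-plane half-height at depth 1
    vx = 0.5 + x / (2.0 * aspect * half_h * z)        # perspective divide by depth
    vy = 0.5 + y / (2.0 * half_h * z)
    return (vx, vy, z)

# Straight ahead maps to the centre of the viewport.
print(world_to_viewport((0, 0, 10)))   # (0.5, 0.5, 10)

# As the point nears the camera's own plane (z -> 0), x blows up toward infinity.
print(world_to_viewport((5, 0, 0.001))[0])

# Behind the camera, dividing by a negative z mirrors x and y back toward
# the viewport, so only the negative z reveals the point is behind you.
print(world_to_viewport((5, 0, -10)))
```

Dividing by z is what drives the values toward infinity near the camera plane and folds them back for negative z, which is exactly the "come back down again" effect.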
Thank you for this! I'm going to get started right now and see if I can figure this out :)
Check my solution to this problem at the following link: http://answers.unity3d.com/questions/463689/how-to-check-if-an-object-is-rendered.html#answer-830721
But why exactly does it go to infinity? The viewport itself doesn't shrink or scale, and x can also be zero when the camera is looking straight at the object while y still has a normal value.
Answer by tyjkenn · Jan 15, 2013 at 03:08 AM
I have something just like this on the Asset Store: http://u3d.as/content/gdcore/radar-arrows. It will not only point to off-screen objects around the perimeter of the screen, it also gives the option of hovering above on-screen objects.
Answer by jt78 · Jan 15, 2013 at 10:29 AM
Just in case anyone is stuck on this, I figured out a working solution (I'm sure it could be optimised a lot):
//Two utility functions to find the intersection points between 2D lines and rectangles
function Intersect(a1 : Vector2, a2 : Vector2, b1 : Vector2, b2 : Vector2) {
    var b : Vector2 = a2 - a1;
    var d : Vector2 = b2 - b1;
    var bDotDPerp : float = b.x * d.y - b.y * d.x;
    //If the 2D cross product is 0, the segments are parallel (or collinear), so there is no unique intersection point
    if (bDotDPerp == 0) {
        return null;
    }
    var c : Vector2 = b1 - a1;
    var t : float = (c.x * d.y - c.y * d.x) / bDotDPerp; //Parameter along a1-a2
    if (t < 0 || t > 1) {
        return null;
    }
    var u : float = (c.x * b.y - c.y * b.x) / bDotDPerp; //Parameter along b1-b2
    if (u < 0 || u > 1) {
        return null;
    }
    return a1 + t * b;
}
function Intersect(a : Vector2, b : Vector2, r : Rect) {
    var tl : Vector2 = Vector2(r.xMin, r.yMin);
    var bl : Vector2 = Vector2(r.xMin, r.yMax);
    var br : Vector2 = Vector2(r.xMax, r.yMax);
    var tr : Vector2 = Vector2(r.xMax, r.yMin);
    var i;
    i = Intersect(a, b, tl, bl); //Check left segment
    if (i != null) {
        return i;
    }
    i = Intersect(a, b, bl, br); //Check bottom segment
    if (i != null) {
        return i;
    }
    i = Intersect(a, b, br, tr); //Check right segment
    if (i != null) {
        return i;
    }
    i = Intersect(a, b, tr, tl); //Check top segment
    return i; //null if no edge was hit
}
//Now, this function is added to each of the objects that you want to show on screen
function OnGUI() {
    var vp = Camera.main.WorldToViewportPoint(transform.position);
    if (vp.z > 0) { //Only draw when the target is in front of the camera
        var ap : Vector2; //Anchor point, in viewport space
        if (vp.x >= 0 && vp.x <= 1 && vp.y >= 0 && vp.y <= 1) {
            ap = vp; //On screen: draw the marker directly on the target
        } else {
            //Off screen: clamp the marker to the screen edge by intersecting
            //the line from the screen centre to the target with the viewport rect
            ap = Intersect(Vector2.one / 2, vp, Rect(0, 0, 1, 1));
        }
        ap = Camera.main.ViewportToScreenPoint(ap);
        ap.y = Screen.height - ap.y; //GUI space puts y = 0 at the top of the screen
        GUI.Box(Rect(ap.x - 10, ap.y - 10, 20, 20), "x");
    }
}
Hopefully that'll help a bit for those who are stuck :]
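For anyone who wants to sanity-check the geometry outside Unity, here is a hypothetical Python port of the segment-segment Intersect above (the tuple-based signature is my own; the logic is the same 2D cross-product test):

```python
def intersect(a1, a2, b1, b2):
    """Intersection point of segments a1-a2 and b1-b2, or None.
    Python port of the UnityScript Intersect, for testing the math."""
    bx, by = a2[0] - a1[0], a2[1] - a1[1]
    dx, dy = b2[0] - b1[0], b2[1] - b1[1]
    cross = bx * dy - by * dx          # 2D cross product of the directions
    if cross == 0:                     # parallel (or collinear): no unique point
        return None
    cx, cy = b1[0] - a1[0], b1[1] - a1[1]
    t = (cx * dy - cy * dx) / cross    # parameter along a1-a2
    if t < 0 or t > 1:
        return None
    u = (cx * by - cy * bx) / cross    # parameter along b1-b2
    if u < 0 or u > 1:
        return None
    return (a1[0] + t * bx, a1[1] + t * by)

# A ray from the viewport centre heading right crosses the x = 1 edge:
print(intersect((0.5, 0.5), (2, 0.5), (1, 0), (1, 1)))  # (1.0, 0.5)

# Parallel segments never intersect:
print(intersect((0, 0), (1, 0), (0, 1), (1, 1)))  # None
```

The same two rejection tests (t and u outside [0, 1]) are what restrict the hit to the actual segments rather than the infinite lines through them.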
Answer by joao_pm · Apr 15, 2014 at 01:22 PM
I stumbled on this old post after running into the same problem myself.
I got this to work by simply setting the ScreenPoint to (infinity, infinity, infinity), thus throwing it offscreen. Perhaps not the prettiest solution, but it is simple and it works!
Vector3 ScreenPoint = Camera.main.WorldToViewportPoint(TargetPoint);
if (ScreenPoint.z < 0) ScreenPoint = new Vector3(Mathf.Infinity, Mathf.Infinity, Mathf.Infinity);
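One reason the z < 0 guard matters: a point mirrored through the camera projects to exactly the same x and y. A quick check in Python, using an assumed perspective divide (tan of half a 60-degree fov, 16:9 aspect; not Unity's full projection matrix):

```python
def viewport_xy(x, y, z, half_h=0.5774, aspect=16.0/9.0):
    """x and y of a toy viewport point: perspective divide by the depth z."""
    return (0.5 + x / (2.0 * aspect * half_h * z),
            0.5 + y / (2.0 * half_h * z))

in_front = viewport_xy(1, 1, 5)     # target ahead of the camera
behind   = viewport_xy(-1, -1, -5)  # same target mirrored through the camera
print(in_front == behind)           # True: x and y alone cannot tell them apart
```

Since x and y are identical for the mirrored point, only the sign of z distinguishes in front from behind, which is exactly what the snippet above tests before throwing the point to infinity.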