Scaling Planets based on Distance from Camera
So, I have a solar system simulation in which both the sizes of the planets and the distances between them are to scale, but not to the same scale. I.e. the Sun is 150 times the size of the Earth (1 m), and the Earth is 152 m from the Sun. So the real distances are roughly 1 billion times larger than in my simulation, while the real sizes are 12.742 million times larger.
So what I'm trying to do is scale every object in the scene by how far it is from the camera, to give an accurate visual representation of what the other bodies would actually look like. I'm having trouble getting it right, though. I've been working from an intercept-theorem proportion, size1/distance1 = size2/distance2, solving for one of the sizes, but this gives me values much larger than I need. That's because I assumed one side should be 1 meter / object scale = object distance / X. Does anyone have any insight into how I can scale these to appear realistic given the size constraints of a single Unity scene?
Thanks.
The first basic insight for this kind of thing (specifically the scale of solar systems) is to avoid layering one scaling system on top of another.
Unity, as you know, isn't going to "understand" the scale of a solar system, but C# has no such troubles. So pick a suitably large real-world unit (say, kilometers instead of meters) and store the real-world measurements as data in your objects (data that Unity will not use or read).
Now you can realistically calculate relative scales based on actual distances, then apply THAT to Unity sized representations.
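The "two bookkeeping systems" idea can be sketched in a few lines. This is a minimal illustration of the approach, not Unity API, and the names (`RealBody`, `SIM_UNITS_PER_KM`, `sim_distance`) are invented for the example; in a real project the same idea would live in a plain C# class alongside the `MonoBehaviour`:

```python
from dataclasses import dataclass

# Assumed scene convention for this sketch: 1 scene unit = 1 million km.
SIM_UNITS_PER_KM = 1.0 / 1_000_000.0

@dataclass
class RealBody:
    """Real-world measurements the engine never reads directly."""
    name: str
    diameter_km: float
    distance_from_sun_km: float

    def sim_distance(self) -> float:
        # Derived scene-space distance, the only number handed to the engine.
        return self.distance_from_sun_km * SIM_UNITS_PER_KM

earth = RealBody("Earth", 12_742.0, 149_600_000.0)
sun = RealBody("Sun", 1_391_900.0, 0.0)

# Relative scales come from the honest physical values, not from scene sizes:
size_ratio = sun.diameter_km / earth.diameter_km  # ~109.2, the true Sun:Earth ratio
```

The point is that `size_ratio` and `sim_distance()` are computed from real measurements, and only the final derived results ever touch Unity transforms.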
I understand that, but I still have the issue of calculating an accurate apparent scale of objects based on the camera's perspective. I.e. I'm trying to write a function that will change the scale of the Sun so that it appears realistically sized from Earth, realistically sized from Jupiter, and at every point in between. I can't use the same scale for both size and distance because the values become unusably small or unusably big. I've figured out how large the Sun should appear from Earth by working out how large an arcminute is in my simulation, but I can't figure out how to make that a function of distance and apply it to other bodies dynamically.
Answer by JVene · Jul 12, 2018 at 09:33 PM
Hey @tjmarshall. Accuracy of the apparent size of a planet or the sun will be relative to the display, so ultimately all of this will have to work on that relationship. That is, if the apparent size of the sun as seen by the human eye on Earth is 32 arcminutes, and if that is shown on a 24" 1080 monitor in actual size, it would be smaller on a phone or tablet without further adjustment (and I assume that would be ok).
Now consider a couple of references. We know the apparent sizes of the Moon and Sun look about equal to the human eye on Earth. These numbers show why, and while I assume you're fully aware of them, I detail them for any other readers following, and so I can follow my own thoughts:
3.4/1391.9 = .002443, which is the ratio of the diameter of the moon divided by the diameter of the sun in thousands of km.
384/149604 = .002567, which is the ratio of the distances to the moon and sun respectively in thousands of km.
The two are nearly the same, which is why the sun appears nearly the same size despite being 1391.9/3.4 or 409.4 times larger than the moon.
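The two ratios are quick to verify; a small check using the same values (in thousands of km, as above):

```python
moon_diam, sun_diam = 3.4, 1391.9      # diameters, thousands of km
moon_dist, sun_dist = 384.0, 149604.0  # distances from Earth, thousands of km

size_ratio = moon_diam / sun_diam  # ~ .002443
dist_ratio = moon_dist / sun_dist  # ~ .002567

# Near-equality of the two ratios is why the Moon and Sun look about the same
# size from Earth, even though the Sun's diameter is ~409.4 times larger.
diameter_factor = sun_diam / moon_diam
```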
If we normalized the apparent size of the moon on a display, we'd be able to use these kinds of ratios to calculate the apparent size of other stellar bodies from Earth, or from other perspectives.
Let's say we normalized the size of the moon to be about 100 pixels on the display, a close approximation to the apparent size of the sun or moon from Earth on some 24" 1080 monitor. This is a coefficient adjusted for the user's monitor: on a 48 inch 4K monitor that might be 350 pixels, or 500 pixels on a smaller 4K. It wouldn't matter to you, as long as this approximate reference point were pinned down.
Since this is a reference size, we declare it to be a 1.0 apparent-sized object, such that any 1.0 sized object, sun or moon or whatever, would be drawn at apparentMoonSize (100 for my monitor, I'm guessing).
Some pertinent values for an Earth observer:
The moon size/distance ratio is .009033, or moonDiameter/distanceToMoon.
Not by coincidence, this ratio is .009304 for sunDiameter/distanceToSun, making the apparent size of the sun barely noticeably larger.
The reciprocal of the moon ratio is 110.7044.
Here's a few example "apparent sun" calculations using this information.
Sun from Saturn (diameter and distance in miles; the units cancel in the ratio): 864,938 / 890,700,000 = .000971, and .000971 x 110.7044 = .107502, so .107502 x 100 pixels = 10.75.
The Sun would be about 10.75 pixels on my display from Saturn. On a large 4K (350 pixels is the setting), the Sun would be about 37.6 pixels.
From Venus:
864,938 / 67,240,000 = .012863, .012863 x 110.7044 = 1.42404, 1.42404 x 100 = 142.404
On my display, the Sun from Venus would be about 142.404 pixels. On a small 4K (assuming a setting of 500), the Sun from Venus would be 1.42404 x 500 = 712.02 pixels.
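Both worked examples fold into one small function. A sketch using the numbers above (the function name and the `moon_pixels` parameter are mine; diameters and distances are in miles, which cancels out):

```python
MOON_RATIO = 0.009033          # moonDiameter / distanceToMoon, seen from Earth
NORMALIZE = 1.0 / MOON_RATIO   # ~ 110.7044, reciprocal of the reference ratio

def apparent_pixels(diameter, distance, moon_pixels=100.0):
    """On-screen size of a body, normalized so the Moon from Earth
    spans moon_pixels on the calibrated display."""
    return (diameter / distance) * NORMALIZE * moon_pixels

sun_from_saturn = apparent_pixels(864_938, 890_700_000)  # ~ 10.75 px
sun_from_venus = apparent_pixels(864_938, 67_240_000)    # ~ 142.4 px

# Same observer on the hypothetical small 4K display (setting of 500):
sun_from_venus_4k = apparent_pixels(864_938, 67_240_000, moon_pixels=500)  # ~ 712 px
```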
Keep in mind what your display mechanics are. If you're using a 3D camera, it has its own perspective (FOV) that will need accounting for in this setup, which I've not included.
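For the perspective-camera case, one common trick (my suggestion, not part of the answer above) is to skip pixels entirely: compute the body's true angular size from the real measurements, then solve for the scene-space diameter that subtends the same angle at the object's scene distance. A sketch, with an invented function name, using the asker's layout of the Sun drawn 152 scene units away:

```python
import math

def scene_diameter_for_true_angle(real_diameter_km, real_distance_km, scene_distance):
    """Scene-space diameter that makes a perspective camera see the body
    at its true angular size, whatever the scene's distance scale is."""
    # True angular size of the body as seen by the real observer.
    angular_size = 2.0 * math.atan(real_diameter_km / (2.0 * real_distance_km))
    # Diameter that subtends the same angle at the scene distance.
    return 2.0 * scene_distance * math.tan(angular_size / 2.0)

# Sun as seen from Earth, drawn 152 scene units away: ~1.414 scene units wide.
sun_scale = scene_diameter_for_true_angle(1_391_900, 149_600_000, 152.0)
```

For small angles this collapses to scene_distance x real_diameter / real_distance, which is exactly the intercept-theorem ratio from the question; the atan/tan form just stays exact for close approaches. Because the angle rather than a pixel count is preserved, this works under any camera FOV.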