Render 3D object to a 2D texture at runtime
I am looking to create a space environment where all the stars are actual stars that the player can travel to. My current approach is to scale the objects based on their distance from the camera and, once they are beyond 10000 units, pin them to a "sphere" around the camera based on their direction. (I'm taking this idea from a Unite talk given by the people who made Universe Sandbox.)
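For reference, here's roughly what I mean by that, as a simplified sketch (names like `DistantStar`, `pinDistance`, and `baseScale` are just my placeholders, not anything from the talk):

```csharp
using UnityEngine;

// Shrink distant stars, and once they pass a threshold distance,
// pin them onto a fixed-radius "sphere" around the camera along
// their original direction so they still appear in the right spot.
public class DistantStar : MonoBehaviour
{
    public Transform cam;              // main camera transform
    public Vector3 truePosition;       // the star's real world-space position
    public float pinDistance = 10000f; // beyond this, pin to the sky sphere
    public float baseScale = 1f;

    void LateUpdate()
    {
        Vector3 offset = truePosition - cam.position;
        float distance = offset.magnitude;

        if (distance > pinDistance)
        {
            // Keep only the direction to the real position.
            transform.position = cam.position + offset.normalized * pinDistance;
            // Scale down so the pinned star keeps its apparent (angular) size.
            transform.localScale = Vector3.one * baseScale * (pinDistance / distance);
        }
        else
        {
            transform.position = truePosition;
            transform.localScale = Vector3.one * baseScale;
        }
    }
}
```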
My question is: how would I go about rendering each of these stars to a texture so I can place it on a single quad (2 triangles)? At that distance, who can tell the difference? In doing this, I hope to save on the number of triangles in my scene.
OR is there a better way to do what I'm talking about? Thoughts?
Answer by rutter · May 17, 2013 at 12:51 AM
Interesting!
If you're using Unity Pro, you could use a Render Texture, render the object out at runtime, and see if you can capture the raster data for use in your other texture.
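A minimal sketch of that idea, assuming a dedicated camera (`starCamera` here is a placeholder) that only sees the star, and a small texture size chosen arbitrarily:

```csharp
using UnityEngine;

// Render a star into a RenderTexture at runtime, then copy the raster
// data into a Texture2D you can put on a quad's material.
public class StarToTexture : MonoBehaviour
{
    public Camera starCamera; // camera pointed at the star, culling everything else
    public int size = 64;

    public Texture2D Capture()
    {
        RenderTexture rt = new RenderTexture(size, size, 16);
        starCamera.targetTexture = rt;
        starCamera.Render();

        // ReadPixels copies from the currently active RenderTexture.
        RenderTexture.active = rt;
        Texture2D tex = new Texture2D(size, size, TextureFormat.ARGB32, false);
        tex.ReadPixels(new Rect(0, 0, size, size), 0, 0);
        tex.Apply();

        // Clean up.
        RenderTexture.active = null;
        starCamera.targetTexture = null;
        Destroy(rt);

        return tex;
    }
}
```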
At the distance you're talking about, though, you can probably get away with a pre-baked sprite. If so, you could save yourself a lot of trouble by using a Texture2D that you've rendered ahead of time.
You can copy data between Texture2D instances using SetPixel() and GetPixel(), allowing you to produce a dynamic background texture while the game is running. I'm not sure if this will necessarily perform better than rendering out a quad for each object, but it might be interesting to look into.
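Something along these lines, as a rough sketch (the texture names are placeholders, and both textures need to be readable, i.e. "Read/Write Enabled" in the import settings):

```csharp
using UnityEngine;

// Copy pixel data between Texture2D instances with GetPixel()/SetPixel()
// to stamp a small star sprite into a larger background texture at runtime.
public static class StarStamper
{
    public static void Stamp(Texture2D skyTexture, Texture2D starSprite, int x, int y)
    {
        for (int i = 0; i < starSprite.width; i++)
        {
            for (int j = 0; j < starSprite.height; j++)
            {
                Color c = starSprite.GetPixel(i, j);
                skyTexture.SetPixel(x + i, y + j, c);
            }
        }
        skyTexture.Apply(); // upload the modified pixels to the GPU
    }
}
```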
I imagine some of these operations could be dramatically optimized by a native plugin, if you're able to write one.