How to Render a Background on Top
Hi,
For fillrate reasons I want to render the background AFTER all other objects. My idea is to render the background texture only where no other pixel has been drawn yet. The idea is fine, but the shader is not working for me. I have a simple shader with ZWrite Off, but how do I make a ZTest that only passes where the depth buffer is still at its cleared (far plane) value?
Also, how can I force the background to be rendered after all other objects?
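For reference, here is a minimal sketch (untested; the shader name is made up) of the kind of shader described above: an unlit textured plane with ZWrite Off, with the ZTest left as the open question:

```
Shader "Custom/BackgroundPlane"
{
    Properties
    {
        _MainTex ("Background Texture", 2D) = "white" {}
    }
    SubShader
    {
        Pass
        {
            ZWrite Off   // the background must not write depth
            // ZTest ???  -- wanted: pass only where the depth buffer
            //              still holds its cleared far-plane value

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.texcoord.xy;
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```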
Greetings
@JarJarM Please give me a good reason why you need this before I answer your question, because most of the time you can easily get around these kinds of issues, and rendering the background first is usually still possible.
I have two reasons for you.
First, fillrate. Drawing the background is relatively costly on the iPhone because of its limited fillrate. As a test I drew only half of the screen as background and instantly got 3 fps more. Since more than half of my screen is overdrawn, it looks to me like a gain of 3-4 fps should be possible.
Second, using two cameras leads to a problem with image effects on the iPhone. I do not know why, but the image effect output was only a square and the image was rotated by 90 degrees (iOS only).
My current approach is to draw the background plane with the same camera, but there it hides some of my objects.
So, drawing a simple quad with a texture is costly on iOS? How would you ever draw your 3D models then?
Have you thought of using the skybox holder as your background drawer as well?
Fillrate is always important; drawing the background on top is used in e.g. Mass Effect as well. On mobile it is just more important, and a racing game at 60 fps is simply better than one at 30.
I can see that. The pixel shader has much more work to do when the complete background has to be drawn. Still, 3 FPS doesn't mean much if your framerate is already 600 or so. Better to look at the ms/frame.
How do 2D games fill the complete screen then? Does that really demand so much of your device?
Answer by Marnix · Jun 26, 2011 at 08:10 PM
Nice question,
I will present an alternative option before answering your question directly. The first thing you need for both is layers: you can put different objects into different layers, and for your background you could create a dedicated background layer.
Option 1 (Pro only)
Use a post-effect with a RenderTexture and check in that texture whether the alpha is still 0 at a pixel. If so, it is certain that no object was drawn there, and you can output a pixel from your background texture instead.
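A minimal sketch of the shader half of this idea (untested; it assumes the camera renders into a RenderTexture with a clear color whose alpha is 0, and the `_BackgroundTex` property name is made up for illustration):

```
Shader "Hidden/BackgroundWhereAlphaIsZero"
{
    Properties
    {
        _MainTex ("Scene (RenderTexture)", 2D) = "white" {}
        _BackgroundTex ("Background", 2D) = "black" {}
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;        // the scene, rendered into a RenderTexture
            sampler2D _BackgroundTex;  // the background image

            fixed4 frag (v2f_img i) : SV_Target
            {
                fixed4 scene = tex2D(_MainTex, i.uv);
                fixed4 bg    = tex2D(_BackgroundTex, i.uv);
                // Alpha is still 0 wherever no object wrote to this pixel,
                // so show the background there and keep the scene elsewhere.
                return (scene.a < 0.001) ? bg : scene;
            }
            ENDCG
        }
    }
}
```

A Graphics.Blit with a material using this shader (for example in OnRenderImage) could then composite the background into the final image.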
Option 2
Create two cameras: one main camera, and a second camera that only renders your background object (by using the layers). Make sure that the second camera doesn't clear ANYTHING, so the depth buffer stays filled. The background camera also needs a different (higher) `depth`, which controls the rendering order. Now we can adjust the ZTest in the shader of the background object.
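Here is a minimal sketch of what such a background shader could look like (untested; shader and property names are made up). The vertex shader pushes the plane onto the far clip plane, so the default `ZTest LEqual` only passes where the depth buffer still holds its cleared far value, i.e. where no object was drawn:

```
Shader "Custom/BackgroundDrawnLast"
{
    Properties
    {
        _MainTex ("Background Texture", 2D) = "white" {}
    }
    SubShader
    {
        // Rendered after other geometry within its camera; with the
        // two-camera setup the camera depth already controls the order.
        Tags { "Queue" = "Overlay" }

        Pass
        {
            ZWrite Off      // the background must never occlude anything
            ZTest LEqual    // only pixels still at the far plane will pass

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_ST;

            struct v2f
            {
                float4 pos : SV_POSITION;
                float2 uv  : TEXCOORD0;
            };

            v2f vert (appdata_base v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                // Push the plane onto the far clip plane. Everything that was
                // already drawn is closer, so the depth test rejects those
                // pixels and the background only fills the untouched ones.
                #if defined(UNITY_REVERSED_Z)
                    o.pos.z = 0.0;          // far plane when Z is reversed
                #else
                    o.pos.z = o.pos.w;      // far plane otherwise
                #endif
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                return tex2D(_MainTex, i.uv);
            }
            ENDCG
        }
    }
}
```

If forcing the depth in the vertex shader feels too tricky, simply placing the plane just in front of the far clip plane gives the same effect with a plain vertex transform.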
I haven't tested this, but in theory it looks fine to me.
I was thinking of more options, but I lost them while writing. Option 2 is probably the easiest to implement, because you don't have to write a post-effect. Camera depth is very important for controlling your render order. If I can think of more, I'll update this answer.
See the ShaderLab documentation on culling and depth testing: http://unity3d.com/support/documentation/Components/SL-CullAndDepth.html
Just a thought: you don't even have to change the ZTest to Greater.
Answer by Waz · Jun 26, 2011 at 10:17 PM
Are you sure Unity doesn't already draw Opaque geometry objects front-to-back (the transparent ones in the reverse order)? Seems a pretty obvious optimisation.
Good point, but we cannot be certain that this is true unless you can give us a reference.
I mention it because, if using two cameras makes no difference, then this may be the reason, or an equivalent reason may be that the GPU is already reordering (GPUs normally do this at least within a single draw call).
@Warwick Allison: Opaque objects don't have to be ordered, because sorting costs CPU time, while z-buffering solves everything on the GPU. It is less efficient to sort everything before drawing. For transparent objects, sorting is indeed needed.