texture maps render on one side only
Hi, having solved a problem about passing UV-mapped textures from Blender to Unity3D, I'm now facing another issue. When I map a UV texture to my object, Unity renders the texture on one side only, and the other side disappears. It's simply invisible. So from one side (say, the upper face of a vault) you see the texture; from the other side the texture (and the mapped object) simply disappears. This also happens when mapping a simple texture onto a native Unity mesh (like a plane). What did I forget? While simple primitives (like planes) can be reversed, that's obviously not possible with complex meshes. I am working on the cross vaults of a cloister, and of course I need the internal side textured, while Unity applies the textures only to the external one. Any help? Thank you
Answer by syclamoth · Dec 14, 2011 at 12:34 PM
This is completely normal behaviour. It's called backface culling, and it's the most common graphics optimisation used by video cards and the like.
Every face has a 'normal', a vector pointing directly out from that face. As well as being used in several shader calculations (lighting etc.), it is used by the renderer to determine whether the face in question is on the side of the object that should be visible. If not (i.e. the normal faces away from the camera), the face is simply not rendered.
There are two solutions to this problem. The first is to find or modify a shader that doesn't cull backfaces. This is usually not the best plan, since it can roughly double the amount of rendering work for no real advantage.
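For completeness, here is a minimal sketch of what that first option looks like. This is a hypothetical double-sided diffuse shader (the name "Custom/DoubleSidedDiffuse" is illustrative, not a built-in); the key line is `Cull Off`, which tells the GPU to draw both sides of every face:

```shaderlab
// Sketch of a double-sided diffuse shader (illustrative name).
Shader "Custom/DoubleSidedDiffuse" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderType"="Opaque" }
        Cull Off   // disable backface culling: both sides of each face are drawn

        CGPROGRAM
        #pragma surface surf Lambert
        sampler2D _MainTex;
        struct Input { float2 uv_MainTex; };
        void surf (Input IN, inout SurfaceOutput o) {
            o.Albedo = tex2D(_MainTex, IN.uv_MainTex).rgb;
        }
        ENDCG
    }
    Fallback "Diffuse"
}
```

Note that lighting on the back side will still use the front-facing normal, which is one reason modelling real interior faces is usually the better option.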
The second option, which I would recommend, is that you model all your meshes to have two sides- one set of faces for the outside shell, and another set of faces pointing the other way for the inside. Another thing- for UV mapping, you probably want a different texture on the inside. If you only have one set of faces, you'll have the same texture on both sides- which would look a little odd, for most buildings.
Thanks syclamoth, but since most of the buildings aren't seen from the inside, the solution of single faces with the same texture on both sides is acceptable. The camera will never see their other side. As you can imagine, we'll never look inside a column, nor at the hidden upper side of a cloister vault... To follow your second suggestion, I'm poking around Blender to find out how to tell it to double the faces without doubling the mesh itself. From what I'm seeing now, it seems that in the normal modelling + mapping + exporting + placing + mapping + rendering process from Blender to Unity, all the external (convex) sides are rendered, while all the internal (concave) sides are hidden. Is there any way to tell Unity to flip the normals of a mesh's faces? That is, to render the internal rather than the external ones?
Yes, you can manually flip the normals of every triangle in the mesh. However, if (as you say) you never see the inside of the buildings, why is this so important? Also, "double the faces without doubling the mesh itself" is what I was talking about with the shader solution. Anything else just adds the extra faces without you having to model them (something totally different).
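Flipping the normals in Unity can be sketched roughly like this (a hypothetical helper script, not an official Unity utility; it assumes the object has a MeshFilter and must run inside Unity). Note that inverting the normals alone is not enough: the triangle winding order must also be reversed, because backface culling is decided by winding, not by the stored normal:

```csharp
// Hypothetical sketch: attach to the object whose imported mesh
// should show its inside faces instead of its outside ones.
using UnityEngine;

public class FlipNormals : MonoBehaviour {
    void Start () {
        Mesh mesh = GetComponent<MeshFilter>().mesh;

        // Invert every vertex normal so lighting points the other way.
        Vector3[] normals = mesh.normals;
        for (int i = 0; i < normals.Length; i++)
            normals[i] = -normals[i];
        mesh.normals = normals;

        // Reverse each triangle's winding order; otherwise backface
        // culling would now hide the faces we want to see.
        for (int s = 0; s < mesh.subMeshCount; s++) {
            int[] tris = mesh.GetTriangles(s);
            for (int t = 0; t < tris.Length; t += 3) {
                int tmp = tris[t];
                tris[t] = tris[t + 2];
                tris[t + 2] = tmp;
            }
            mesh.SetTriangles(tris, s);
        }
    }
}
```

In practice it is usually cleaner to flip the normals in Blender before export, but a script like this avoids a round trip through the modeller.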
Imagine standing on the floor of a cloister, facing up. You surely see the cross vaults. These are typically the inside faces of a complex mesh, built starting from cylinders, then applying many boolean operations, then making an array to reach the repeated units of a cloister arcade. This is my situation now. In Blender, I've flipped the normals (inward). But for some strange reason the exported (.3ds) mesh loses some faces, as if there were a geometric conflict. I'll investigate a bit.
@Syclamoth I don't think Santelia is trying to make dual-sided surfaces, I think his problem is that he wants to reverse the normals, because he's modeling the inside of a room, and not the outside of the building.
I think it is a simple(?) matter of getting the normals right in the modelling package, and then getting it to import correctly into Unity.
Thank you all for the many contributions. Yes, my goal at the moment is to map the inside of a building, assuming the lower faces of the cross-vaults all around a cloister can be defined as "inside" ;-) . Yes, of course my walls have a thickness, but for the purpose we were discussing, that's irrelevant. Of course, for columns, bases etc. I am using normals pointing outward, as the modeller assigns by default. In the meantime, I have reversed (flipped) all the normals of my complex vault meshes, and everything is now OK, both in Blender and in Unity. Still, when I apply normal maps for bumping in Unity (bumped diffuse shader), I always need to fix my normal textures as Unity itself requests. Would you suggest using "Create from Grayscale" (if I remember the option correctly), or assigning typical bluish normal textures and "fixing" them? Is there any relevant difference between the two methods?