Texture UV bug in Unity 5
So, when I was using my game project in a previous version of Unity, I set up textures for a standard instanced Unity 3D cube from a texture atlas and then used a script to set the UV coordinates manually. The image below shows how it looked before and after updating to Unity 5.3. The leftmost is what it used to be, the center is what it looks like now, and the right is the same cube with a wireframe to show the details of the UVs.
As you can see, the texture on the side is now somehow switched/stretched. I thought my UVs might have been out of order, but as far as I'm aware nothing changed between then and now. Did Unity 5 change something about UVs that I'm not aware of?
This is the code I used. The Left and Right sections seem to be the culprits, though I believe the same thing is happening on the Bottom.
var mf = GetComponent(MeshFilter);
var mesh : Mesh;
if (mf != null)
    mesh = mf.mesh;
if (mesh == null || mesh.uv.Length != 24) {
    Debug.Log("Script needs to be attached to built-in cube");
    return;
}
var uvs = mesh.uv;
// Front
uvs[0] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y);
uvs[1] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y);
uvs[2] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y + tUnit);
uvs[3] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y + tUnit);
// Top
uvs[8] = Vector2(tUnit * texturePosTop.x, tUnit * texturePosTop.y);
uvs[9] = Vector2(tUnit * texturePosTop.x + tUnit, tUnit * texturePosTop.y);
uvs[4] = Vector2(tUnit * texturePosTop.x, tUnit * texturePosTop.y + tUnit);
uvs[5] = Vector2(tUnit * texturePosTop.x + tUnit, tUnit * texturePosTop.y + tUnit);
// Back
uvs[6] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y);
uvs[7] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y);
uvs[10] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y + tUnit);
uvs[11] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y + tUnit);
// Bottom
uvs[12] = Vector2(tUnit * texturePosBot.x, tUnit * texturePosBot.y);
uvs[14] = Vector2(tUnit * texturePosBot.x + tUnit, tUnit * texturePosBot.y);
uvs[15] = Vector2(tUnit * texturePosBot.x, tUnit * texturePosBot.y + tUnit);
uvs[13] = Vector2(tUnit * texturePosBot.x + tUnit, tUnit * texturePosBot.y + tUnit);
// Left
uvs[16] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y);
uvs[18] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y);
uvs[19] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y + tUnit);
uvs[17] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y + tUnit);
// Right
uvs[20] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y);
uvs[22] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y);
uvs[23] = Vector2(tUnit * texturePosSides.x, tUnit * texturePosSides.y + tUnit);
uvs[21] = Vector2(tUnit * texturePosSides.x + tUnit, tUnit * texturePosSides.y + tUnit);
mesh.uv = uvs;
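For reference, the per-tile rect math repeated in every block above can be checked outside Unity. Here is a minimal sketch in plain JavaScript; `tileUVs`, `tileX`, and `tileY` are hypothetical names standing in for `texturePos*.x`/`.y` and the atlas tile size `tUnit`:

```javascript
// Compute the four UV corners of one atlas tile.
// tUnit is the normalized size of a tile (e.g. 0.25 for a 4x4 atlas);
// (tileX, tileY) is the tile's position in the atlas grid.
function tileUVs(tileX, tileY, tUnit) {
  const u = tUnit * tileX;
  const v = tUnit * tileY;
  return {
    bottomLeft:  [u,         v        ],
    bottomRight: [u + tUnit, v        ],
    topLeft:     [u,         v + tUnit],
    topRight:    [u + tUnit, v + tUnit],
  };
}

// Example: tile (1, 2) in a 4x4 atlas (tUnit = 0.25)
const uv = tileUVs(1, 2, 0.25);
// uv.bottomLeft is [0.25, 0.5], uv.topRight is [0.5, 0.75]
```

The corner math itself can't be the source of the bug; the fragile part is which mesh vertex index each corner is written to (e.g. the `[16, 18, 19, 17]` order on the Left face), and that mapping depends entirely on the built-in cube's vertex layout.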
Answer by Bunny83 · Oct 11, 2016 at 01:44 AM
Your approach is simply a bad one ^^. You can't assume that the vertices of the built-in meshes stay the same between different Unity versions. Unity actually replaced the sphere mesh as well. Some meshes had to be changed in order to produce better lightmap UV coordinates.
Why do you use Unity's cube mesh at all? It would be better to either import your own mesh or create one from scratch. I've posted some code to create a cuboid mesh over here. The method there is meant to create the view frustum of a camera. All you have to change is lines 30 and 31, to set your z positions instead of the near and far planes, and remove lines 33 and 34, as they are responsible for applying the perspective. The pivot of the cube would then be at the lower corner rather than the center; if you need it in the center, offset all vertices by 0.5 on each axis.
The way the vertices are laid out makes it rather easy to assign UV coordinates. Since the mesh is created manually, it will always look the same.
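The 8-to-24 vertex split described above can be sketched without Unity. In this plain JavaScript sketch, `vertOrder` is a hypothetical per-face ordering table, not the one from the linked answer:

```javascript
// Split 8 corner positions into 24 face vertices (4 per face),
// so each face can carry its own normals and UVs.
// corners: 8 [x, y, z] positions of a cuboid.
// vertOrder: for each of the 6 faces, the 4 corner indices to use
// (a hypothetical ordering; the linked answer uses its own table).
function splitFaces(corners, vertOrder) {
  const vertices = [];
  for (let i = 0; i < 24; i++) {
    vertices.push(corners[vertOrder[i]].slice());
  }
  return vertices;
}

// Unit cube corners: index bit 0 = +x, bit 1 = +y, bit 2 = +z
const corners = [];
for (let i = 0; i < 8; i++) {
  corners.push([i & 1, (i >> 1) & 1, (i >> 2) & 1]);
}

// One possible per-face order (front, back, top, bottom, left, right)
const vertOrder = [
  0, 1, 3, 2,   4, 5, 7, 6,
  2, 3, 6, 7,   0, 1, 4, 5,
  0, 2, 4, 6,   1, 3, 5, 7,
];
const vertices = splitFaces(corners, vertOrder);
// vertices.length is 24; every face owns its own copies
```

Because the ordering table lives in your own code, the mapping from face corner to vertex index is fixed forever, which is the whole advantage over the built-in cube.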
Unfortunately, the same thing happened when I imported my own cube. It was actually worse, with textures now stretching everywhere and some facing the wrong way. I also can't afford to spend many resources building a cube like that every time; that's why it's instanced, since it's a common "pick-up" item.
But if you are right about the mesh change, then I wonder why it wasn't documented.
I also tried your example to create a cube, but nothing gets built from the returned mesh.
Mesh mesh = new Mesh();
Vector3[] v = new Vector3[8];
v[0] = v[4] = new Vector3(0.5f, 0.5f, 0.5f);
v[1] = v[5] = new Vector3(0.5f, 1.5f, 0.5f);
v[2] = v[6] = new Vector3(1.5f, 1.5f, 0.5f);
v[3] = v[7] = new Vector3(1.5f, 0.5f, 0.5f);
v[0].z = v[1].z = v[2].z = v[3].z = 0.5f;
v[4].z = v[5].z = v[6].z = v[7].z = 1.5f;
Vector3[] vertices = new Vector3[24];
Vector3[] normals = new Vector3[24];
// Split vertices for each face (8 vertices --> 24 vertices)
// m_VertOrder and m_Indices are the lookup tables from the linked answer
for (int i = 0; i < 24; i++)
    vertices[i] = v[m_VertOrder[i]];
// Calculate face normals
for (int i = 0; i < 6; i++) {
    Vector3 faceNormal = Vector3.Cross(vertices[i * 4 + 2] - vertices[i * 4 + 1], vertices[i * 4 + 0] - vertices[i * 4 + 1]);
    normals[i * 4 + 0] = normals[i * 4 + 1] = normals[i * 4 + 2] = normals[i * 4 + 3] = faceNormal;
}
mesh.vertices = vertices;
mesh.normals = normals;
mesh.triangles = m_Indices;
return mesh;
You haven't created any UV coordinates for the mesh ^^. The mesh I created doesn't have UVs yet; you have to create them first. I just said that when you create the mesh yourself, you can be sure that each vertex will stay at the same position, so your code doesn't break when you update Unity.
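Generating those missing UVs is mechanical once the mesh is your own. A minimal sketch in plain JavaScript, assuming each face's four vertices were built in a fixed corner order (`fullQuadUVs` is a hypothetical helper, not from the linked answer):

```javascript
// Give every face of a 24-vertex cube the full [0,1] UV quad,
// in a fixed corner order (bottom-left, bottom-right,
// top-left, top-right) matching how the face vertices were built.
// Because the mesh is built by our own code, this order never
// changes between Unity versions.
function fullQuadUVs(faceCount) {
  const uvs = [];
  for (let f = 0; f < faceCount; f++) {
    uvs.push([0, 0], [1, 0], [0, 1], [1, 1]);
  }
  return uvs;
}

const uvs = fullQuadUVs(6);
// uvs.length is 24 -- one UV per split vertex
```

To use an atlas instead, replace the `[0,1]` quad per face with the corners of the desired tile (offset by the tile position, scaled by the tile size), exactly as the original script does.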
I just added a new feature to my UVViewer so you can visualize your UV coordinates inside Unity. Just copy the file into the Editor folder of your project. I've added a simple triangle list that lets you see where each vertex actually is. Open the viewer via Tools --> B83 --> UVViewer, then select an object that has a MeshRenderer. You can show the triangle list with the button at the top right.
The viewer should display the texture of the used material in the background.
Apparently, when I get the mesh back and create the UVs for it, the generated mesh still ends up with 0 triangles and 0 vertices.
Is it maybe because I'm returning a Mesh object from a static class into a JavaScript class?
Edit: No, that can't be it. I checked the vertex count of the returned mesh and it's still 24. Yet, at run time, it shows up as zero with no triangle list.