Assigning UVs to multiple faces
Hi, I'm having trouble assigning a texture to every face of a mesh. In my game, the first face is perfectly fine, but the second face is inverted, as you can see here: http://i.imgur.com/Q950Xlr.png
I want all faces to have the same UV coordinates, but I'm confused about how one vertex can have different values (since each vertex has only one UV coordinate).
Here is how I'm assigning the UVs:
for (int i = 0; i < topUvs.Length; i++) {
    if (select == 4) { select = 0; }
    switch (select) {
        case 0: topUvs[i] = new Vector2(0, 50); break;
        case 1: topUvs[i] = new Vector2(0, 0); break;
        case 2: topUvs[i] = new Vector2(1, 50); break;
        case 3: topUvs[i] = new Vector2(1, 0); break;
    }
    select += 1;
}
This is what I want to happen:
http://i.imgur.com/JFxT6Tz.jpg
How would I do this?
Thanks!
Answer by troien · Mar 16, 2016 at 12:41 PM
Simply put, one vertex in Unity cannot (I'm not sure) / does not contain multiple sets of UV data. An example of this is that a primitive cube uses 24 vertices (4 for each face) instead of just 8.
Have you tried setting (in your second image) vertex #5's UV to (2, 1) and #6's to (2, 0)? I haven't tested this yet, but depending on the shader I think it might work (I suppose the texture's wrap mode has to be 'Repeat').
A thing that I know would work for sure is simply letting the UVs range from 0 to 1 (with 0.5 being halfway, obviously) and then setting the material's tiling to x = 2, for instance.
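A minimal sketch of the first suggestion, assuming a horizontal strip of quads whose vertices come in bottom/top pairs (the component setup and vertex ordering here are my assumptions, not the asker's actual mesh):

```csharp
using UnityEngine;

// Sketch of the wrap-mode approach: let "u" keep counting up past 1
// and rely on the texture's Repeat wrap mode to tile it per quad.
public class UvStrip : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector2[] uvs = new Vector2[mesh.vertexCount];

        // Assumption: vertices come in columns of 2 (bottom, top), one
        // column per quad edge, so quad n spans u = n .. n + 1.
        for (int i = 0; i < uvs.Length; i++)
        {
            int column = i / 2;
            float v = (i % 2 == 0) ? 0f : 1f;
            uvs[i] = new Vector2(column, v);
        }
        mesh.uv = uvs;

        // Wrap mode must be Repeat so u > 1 tiles instead of clamping.
        GetComponent<Renderer>().material.mainTexture.wrapMode = TextureWrapMode.Repeat;
    }
}
```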
Right, but that's nothing Unity came up with; that's how the graphics hardware works. A single vertex can in fact contain multiple independent UV coordinates, but those are all used in the same face. For example, a face may have a regular texture as well as a lightmap texture. Since a lightmap is usually an atlas texture for multiple faces / objects, they need separate UV coordinates.
However those "extra" UV coordinates are per vertex. So every vertex does have them. A shader treats every vertex and every triangle in the same way. So it's not possible that triangle one would use the first set of UVs and ignore the second and the second triangle uses the second set and ignore the first.
In general, when you have a mesh with shared vertices (a mesh where two neighboring triangles share their vertices), you can't have different vertex attributes for either triangle. As soon as you want any vertex attribute (position, normal, vertex color, uv, uv1, uv2) to differ between two neighbors, you have to duplicate the vertex in order to specify different attributes.
Like @troien said, that's why a cube mesh doesn't have 8 vertices but 24: each corner exists three times, so each of the 3 faces that meet at that corner can have its own normal vector.
A sphere, for example, is a continuous mesh with almost all vertices shared. However, at the seam where the end meets the start, the vertices have to be duplicated since the UV coordinates aren't continuous there: the start usually has u = 0 while the end has u = 1. So a fully shared sphere isn't possible if you need UV coordinates.
If you want single quads with their own unique UV coordinates, each quad needs its own set of 4 vertices. The two triangles that make up the quad can still share the two vertices on the diagonal, since everything is continuous there.
So you usually have to do something like this:
1---2 5---6
|\  | |\  |
| \ | | \ |
0---3 4---7
Here vertex 2 and 5, as well as 3 and 4, are at the very same position. So your triangles would be:
0,1,3, 1,2,3 // first quad
4,5,7, 5,6,7 // second quad
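Putting this together, here is a hedged sketch of building those two quads as one Unity mesh with per-quad UVs (positions, component setup, and the class name are my assumptions):

```csharp
using UnityEngine;

// Sketch: two quads, 8 vertices (4 per quad), each quad getting its
// own full 0..1 UV range. Attach to a GameObject that has a
// MeshFilter and MeshRenderer.
public class TwoQuads : MonoBehaviour
{
    void Start()
    {
        var mesh = new Mesh();

        // Vertices 2/5 and 3/4 sit at the same position but are
        // separate entries, so each quad can carry its own UVs.
        mesh.vertices = new Vector3[]
        {
            new Vector3(0, 0, 0), new Vector3(0, 1, 0), // 0, 1
            new Vector3(1, 1, 0), new Vector3(1, 0, 0), // 2, 3
            new Vector3(1, 0, 0), new Vector3(1, 1, 0), // 4, 5 (duplicates of 3, 2)
            new Vector3(2, 1, 0), new Vector3(2, 0, 0)  // 6, 7
        };
        mesh.uv = new Vector2[]
        {
            new Vector2(0, 0), new Vector2(0, 1),
            new Vector2(1, 1), new Vector2(1, 0),
            new Vector2(0, 0), new Vector2(0, 1), // second quad restarts at u = 0
            new Vector2(1, 1), new Vector2(1, 0)
        };
        mesh.triangles = new int[]
        {
            0, 1, 3,  1, 2, 3, // first quad
            4, 5, 7,  5, 6, 7  // second quad
        };
        mesh.RecalculateNormals();
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```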
The way most modelling tools work can be misleading. They use logical "vertices" and don't associate the actual vertex attributes with each vertex. They use concepts like "smoothing groups", "UV seams" and "face normals", which don't actually exist on the hardware.
Smoothing groups are just an automatic way to calculate shared vertex normals. If something is "not smooth", those vertices have to be duplicated during export (depending on the export format; some support smoothing groups, but at the hardware level they don't exist).
UV seams can be "defined" in the editor. All that does is split the vertices in the unwrap editor, so a single logical vertex breaks up into two independent ones that can be mapped to arbitrary points on the texture.
Face normals don't exist at all at the hardware level; the information is encoded in the winding order of the vertices.
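That last point can be illustrated with a small cross-product sketch (the helper class here is hypothetical, not part of any Unity API):

```csharp
using UnityEngine;

// A "face normal" is derived from the winding order: crossing two edge
// vectors of the triangle gives the normal, and swapping two vertices
// (i.e. reversing the winding) flips its direction.
public static class FaceNormal
{
    public static Vector3 Of(Vector3 a, Vector3 b, Vector3 c)
    {
        return Vector3.Cross(b - a, c - a).normalized;
    }
}

// FaceNormal.Of(a, b, c) and FaceNormal.Of(a, c, b) point in opposite
// directions: same triangle, opposite winding.
```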
Thanks so much! Setting the UVs to (2, 1) and (2, 0) created exactly the visual effect I was after.