Mesh visible in editor but not build
Hello, I'm currently trying to generate a mesh at runtime, and it mostly works. However, I cannot figure out why mesh generation works in the editor (including play mode) but not in the built project. To be more accurate, the problem seems to be with displaying the mesh.
Below is the function for generating the mesh:
Mesh GenerateMapMesh()
{
    List<Vector3> mapMeshVertices = new List<Vector3>();
    List<int> mapMeshTriangles = new List<int>();
    List<Color> mapMeshVertexColors = new List<Color>();

    #region Calculating Mesh Data
    for (int x = 0; x < mapSize.x; x++)
        for (int z = 0; z < mapSize.y; z++)
        {
            Vector3 hexagonPosition = hexagonPositions[x, z];

            #region Triangulating
            Vector3 height = CalculateHexHeight(noiseMap[x, z]);
            for (int t = 0; t < 6; t++)
            {
                GenerateTriangle
                (
                    hexagonPosition + height,
                    hexagonPosition + hexagonCorners[t] + height,
                    hexagonPosition + hexagonCorners[t + 1] + height,
                    ref mapMeshVertices, ref mapMeshTriangles
                );

                // 3 is the vertex count
                for (int cc = 0; cc < 3; cc++)
                    mapMeshVertexColors.Add(GenerateColor(noiseMap[x, z]));

                // If this hex should be elevated, then a quad will be
                // generated from the elevated and non-elevated vertices
                if (height.y - 0.075f > 0)
                {
                    GenerateQuad
                    (
                        hexagonPosition + hexagonCorners[t],
                        hexagonPosition + hexagonCorners[t + 1],
                        hexagonPosition + hexagonCorners[t] + height,
                        hexagonPosition + hexagonCorners[t + 1] + height,
                        ref mapMeshVertices, ref mapMeshTriangles
                    );

                    // The quad has 6 vertices (2 triangles)
                    for (int cc = 0; cc < 6; cc++)
                        mapMeshVertexColors.Add(GenerateColor(noiseMap[x, z]));
                }
            }
            #endregion
        }
    #endregion

    #region Setting Mesh Variables
    Mesh mapMesh = new Mesh();
    mapMesh.name = "HexMapMesh";
    mapMesh.vertices = mapMeshVertices.ToArray();
    mapMesh.triangles = mapMeshTriangles.ToArray();
    mapMesh.colors = mapMeshVertexColors.ToArray();
    mapMesh.RecalculateNormals();
    #endregion

    return mapMesh;
}
Most of the above function is irrelevant, but I've left it intact in case someone needs to view the entire thing.
Back to the point: my calculations are accurate and the lists are populated properly; I've verified this by instantiating objects at the computed positions. The problem I mentioned only appears when I build the mesh from lists; if I use an array and populate it manually, it works both in the editor and in the build. The following is an example:
Mesh GenerateMapMesh()
{
    #region Calculating Mesh Data
    #region Calculating Vertices
    Vector3[] vertices = new Vector3[3];
    vertices[0] = Vector3.zero;
    vertices[1] = Vector3.up;
    vertices[2] = Vector3.right;
    #endregion

    #region Calculating Triangles
    int[] triangles = new int[3];
    for (int t = 0; t < triangles.Length; t++)
        triangles[t] = t;
    #endregion
    #endregion

    #region Setting Mesh Variables
    Mesh mapMesh = new Mesh();
    mapMesh.name = "HexMapMesh";
    mapMesh.vertices = vertices;
    mapMesh.triangles = triangles;
    mapMesh.RecalculateNormals();
    #endregion

    return mapMesh;
}
Of course, I've also tried using arrays instead of lists and painstakingly redid the exact same calculations with them. I tested that thoroughly but got the same result: it worked in the editor, but not in the build.
I've tried this in different Unity versions (2019 and 2018) but still got the same results; currently I'm using Unity 2020.1. I also tried manually copying the lists' values into arrays, but that didn't work either.
After doing further testing, I determined that the vertices and the triangles are assigned to the mesh, but like I said, they aren't displayed.
Have you tried mesh.RecalculateBounds?
https://docs.unity3d.com/ScriptReference/Mesh.RecalculateBounds.html
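For reference, a minimal sketch of that suggestion (a mesh's bounds are used for frustum culling, so stale or zero bounds can make an otherwise valid mesh invisible):

```csharp
Mesh mapMesh = GenerateMapMesh();
// Recompute the bounding volume from the current vertices;
// the renderer is culled whenever the camera doesn't see these bounds.
mapMesh.RecalculateBounds();
```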
You set it to the MeshFilter, right? And you have a MeshRenderer? Try changing the material, and also double-check that the camera is pointing at the mesh.
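A hypothetical hookup for that check (the component lookup and logging are assumptions for illustration, not from the original post):

```csharp
// Assumes this script sits on the same GameObject as the map's
// MeshFilter and MeshRenderer components.
MeshFilter filter = GetComponent<MeshFilter>();
MeshRenderer meshRenderer = GetComponent<MeshRenderer>();

filter.mesh = GenerateMapMesh();
// The material assigned here must be one that actually ships in the build.
Debug.Log(meshRenderer.sharedMaterial);
```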
Out of curiosity, have you tried using Mesh.SetVertices(List<Vector3>) (and similar) to assign the values, rather than mesh.vertices = v3Array?
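A sketch of what that would look like with the lists from the question's code (the names are taken from the original function):

```csharp
Mesh mapMesh = new Mesh();
mapMesh.name = "HexMapMesh";
mapMesh.SetVertices(mapMeshVertices);       // List<Vector3> overload
mapMesh.SetTriangles(mapMeshTriangles, 0);  // submesh index 0
mapMesh.SetColors(mapMeshVertexColors);     // List<Color> overload
mapMesh.RecalculateNormals();
```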
Unfortunately, it still doesn't work in the build.
Hmm... it's kinda hard to think of suggestions to offer here, especially because I'm currently working on a project involving a ton of procedural mesh and texture generation (all of which *has* been working in both editor and build).
The only other thing that comes to mind at the moment in terms of editor vs. build discrepancy is whether you're using any objects ONLY through instantiation (or similar). For example, if you have a prefab object or material/shader/etc. that is exclusively loaded using Instantiate(), then it might be culled during the building process unless it's in a Resources folder, because no scene distinctly references it.
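A hypothetical illustration of that scenario (the asset name here is made up):

```csharp
// An asset only ever referenced from code is stripped from the
// build unless it lives under a folder named "Resources".
Material mapMaterial = Resources.Load<Material>("HexMapMaterial");
GetComponent<MeshRenderer>().material = mapMaterial;
```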
That said, it doesn't necessarily seem likely that would be the case, though, since you've mentioned using (effectively) direct replacements on scripts (between array and List) so that shouldn't have anything to do with the mesh application process itself.
Out of curiosity, have you been able to determine whether the attributes of the mesh seem correctly defined in both cases? (Like, after creating the mesh, display the counts and/or details of mesh vertices in some manner)
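For example, something along these lines in a development build would show whether the data survives into the player (this logging snippet is a sketch, not from the original code):

```csharp
Mesh mapMesh = GenerateMapMesh();
// Visible in the editor Console, or in Player.log for a development build.
Debug.Log($"verts: {mapMesh.vertexCount}, " +
          $"tris: {mapMesh.triangles.Length / 3}, " +
          $"bounds: {mapMesh.bounds}");
```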
Answer by razzraziel · Nov 10, 2020 at 05:46 PM
Is it possible that editor scripts are involved in the process at some point? Because they're not included in builds.
There is one, but it's not vital to the generation process; it just calls a function.