Quad Tessellation Support for OpenGL Core?
Hi All, I have a project where I need to use quad tessellation. I have a solution which works in Direct3D11 but not in OpenGL Core. The mesh I have generated to be tessellated uses MeshTopology.Quads.
When I found out my shader wasn't working with OpenGL Core, the first thing I tried was a triangle tessellation version. One thing I noticed is that although triangle tessellation works on both APIs, with Direct3D11 there is only one triangle per quad, whereas with OpenGL Core there are two triangles per quad. So maybe a red flag there.
Also, for the quad tessellation shader with OpenGL Core, it looks like maybe it's taking the first 4 verts it finds regardless of whether they're part of a designated quad?
I have read that the built-in Surface Shader tessellation does not support quad tessellation so perhaps that's another clue as to why this isn't working.
However I would like to know explicitly: Is quad tessellation supported in Unity 5 and above for OpenGL Core?
Thanks in advance :D
Answer by David-Lycan · Nov 08, 2017 at 06:27 PM
Hi All, I found a workaround that suits my needs. The problem seems to boil down to this: with OpenGL it looks like the quads are triangulated before they reach the tessellation stage, completely messing up the vertex order as well as doubling up the incoming verts.

The solution I've found is to generate a unique mesh for OpenGL using MeshTopology.Points, so the input verts always arrive in the correct order. I use SystemInfo.graphicsDeviceVersion to detect whether I need to switch mesh generation methods.

It's possible there may be some very thin gaps/holes in the mesh, about a pixel in size, caused by floating point error, because each quad's verts are now unique. I can't see this problem on my machine, but that doesn't mean it won't happen on other platforms. Anyway, I hope this hack is helpful to anyone else facing this problem :)
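In case it helps, here's a minimal sketch of how the switch could look in a MonoBehaviour. `BuildVertices` and `BuildIndices` are hypothetical placeholders for your own mesh-generation routines, and the exact contents of `SystemInfo.graphicsDeviceVersion` vary by platform and driver, so treat the `StartsWith("OpenGL")` check as an assumption to adapt, not a guarantee:

```csharp
using System.Collections.Generic;
using UnityEngine;

[RequireComponent(typeof(MeshFilter))]
public class TessellationMeshSelector : MonoBehaviour
{
    void Start()
    {
        // e.g. "Direct3D 11.0 [...]" on D3D11, "OpenGL 4.5 [...]" on GL Core.
        bool isOpenGL = SystemInfo.graphicsDeviceVersion.StartsWith("OpenGL");

        var mesh = new Mesh();
        // Hypothetical generators: on the OpenGL path every quad gets its own
        // four unique verts so the driver can't reorder or share them.
        mesh.SetVertices(BuildVertices(uniquePerQuad: isOpenGL));
        mesh.SetIndices(BuildIndices(),
            isOpenGL ? MeshTopology.Points : MeshTopology.Quads, 0);

        GetComponent<MeshFilter>().mesh = mesh;
    }

    // Placeholder stubs -- replace with your actual mesh generation.
    List<Vector3> BuildVertices(bool uniquePerQuad) { /* ... */ return new List<Vector3>(); }
    int[] BuildIndices() { /* ... */ return new int[0]; }
}
```

With MeshTopology.Points the tessellation shader would then need to gather each group of four consecutive verts as one patch, since there are no quad primitives left for the pipeline to provide.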