Get Mesh Data From Texture Positions
I am working on generating a texture that updates in real time based on a few different parameters. To do this I need information about the mesh the texture is applied to, which I can then use in my calculations. The issue I've run into is that there doesn't appear to be a way to get it.
I want to find the normal and position of the point on a model that corresponds to a given texture coordinate, but all I have managed so far is to pull the raw data from the mesh (using the uv, normals and other properties of the Mesh object). That isn't enough, because my algorithm needs complete coverage to work; that is, I can't skip over any point on the texture.
Is there some way to efficiently get these values or am I going to need another solution?
Edit: Currently I am using a precalculation to work this out. It takes approximately a minute to run on my computer for a 16x16 texture, which is hardly optimal. While I could thread it, I think that would just hide a less-than-optimal solution. My code is as follows (WrappingGrid is a custom class for finding nearby points on a grid that wraps, like UV coords do):
void Start()
{
    grid = new WrappingGrid(1, 1);
    nodes = new MeshTextureNode[heatmap.width, heatmap.height];
    // For every texel, brute-force search the whole mesh for the vertex
    // whose uv is closest, then record that vertex's normal and position.
    for (int x = 0; x < heatmap.width; ++x)
    {
        for (int y = 0; y < heatmap.height; ++y)
        {
            float u = (float)x / heatmap.width;
            float v = (float)y / heatmap.height;
            int closest = 0;
            // Note: mesh.uv allocates a fresh copy of the array on every
            // access, so this inner loop is far slower than it needs to be;
            // caching the array once before the loops would help a lot.
            for (int i = 0; i < mesh.uv.Length; ++i)
            {
                closest = CalculateClosest(u, v, closest, i);
            }
            nodes[x, y] = new MeshTextureNode()
            {
                normal = mesh.normals[closest],
                localPosition = mesh.vertices[closest]
            };
        }
    }
}

private int CalculateClosest(float u, float v, int closest, int i)
{
    Vector2 candidate = mesh.uv[i];
    Vector2 current = mesh.uv[closest];
    // NearerThanPoint compares wrapped uv distances to (u, v).
    if (grid.NearerThanPoint(new Vector2(u, v), candidate, current)) closest = i;
    return closest;
}
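WrappingGrid isn't shown in the question, but the wrapped-distance comparison it performs can be sketched like this. This is a hypothetical reconstruction, not the asker's actual class; it assumes UV space wraps at 1 on both axes:

```csharp
// Hypothetical sketch of WrappingGrid's comparison (assumes UnityEngine).
// UV space is toroidal: the distance along each axis is the shorter of the
// direct gap and the gap going "around the seam".
static float WrappedDelta(float a, float b)
{
    float d = Mathf.Abs(a - b);
    return Mathf.Min(d, 1f - d);
}

static bool NearerThanPoint(Vector2 p, Vector2 candidate, Vector2 current)
{
    // Compare squared wrapped distances; no need for a square root.
    float candDist = new Vector2(WrappedDelta(p.x, candidate.x),
                                 WrappedDelta(p.y, candidate.y)).sqrMagnitude;
    float currDist = new Vector2(WrappedDelta(p.x, current.x),
                                 WrappedDelta(p.y, current.y)).sqrMagnitude;
    return candDist < currDist;
}
```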
If this is going to be a heatmap, why map it onto a mesh? Do you have caves and different levels? Heatmaps are usually simple overview maps where you just map the worldspace coordinate to the map. Or do you have a "round" world like Spore's planets?
I am mapping it to a planetoid. Specifically, this. Obviously it still has issues though.
Answer by Bunny83 · Dec 18, 2014 at 04:02 PM
So if I got you right, you want a method that takes a mesh, the texture for that mesh, and a uv position in that texture, and returns a world / local space position on the mesh along with the point's normal vector, right?
First of all, you have to understand that the uv-to-world-point projection is not 1-to-1; it's a "1 to n" mapping where "n" can even be 0 (this is the case when that portion of the texture isn't mapped to the mesh at all). "n" can also be greater than 1, which happens when you have multiple triangles mapped to the same portion of the texture.
So naturally a method like this would return an array of position / normal pairs, which can also contain zero elements.
What you have to do is iterate through all triangles and get the uv coordinates of their 3 corners. Calculate the barycentric coordinates of your given uv point with respect to each triangle, and use them to test whether the point is inside that triangle. If it is, just use the barycentric coordinates to interpolate the position and normal values of the 3 corners.
Finally, if you need those positions / normals in worldspace, you have to use transform.TransformPoint / transform.TransformDirection on the transform that holds your mesh.
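The procedure above can be sketched in code. This is an untested outline, not Bunny83's linked implementation; MeshTextureNode is the struct from the question, assumed to hold a localPosition and a normal:

```csharp
// Sketch of the triangle scan described above (assumes UnityEngine types).
// Returns every surface point whose uv maps to the given texture coordinate
// (possibly none, possibly several, matching the "1 to n" mapping).
List<MeshTextureNode> PointsAtUV(Mesh mesh, Vector2 uv)
{
    var results = new List<MeshTextureNode>();
    Vector2[] uvs = mesh.uv;          // cache: these properties copy on access
    Vector3[] verts = mesh.vertices;
    Vector3[] norms = mesh.normals;
    int[] tris = mesh.triangles;

    for (int t = 0; t < tris.Length; t += 3)
    {
        int i0 = tris[t], i1 = tris[t + 1], i2 = tris[t + 2];
        Vector2 a = uvs[i0], b = uvs[i1], c = uvs[i2];

        // Barycentric coordinates of uv with respect to triangle (a, b, c).
        float denom = (b.y - c.y) * (a.x - c.x) + (c.x - b.x) * (a.y - c.y);
        if (Mathf.Approximately(denom, 0f)) continue; // degenerate in uv space
        float w0 = ((b.y - c.y) * (uv.x - c.x) + (c.x - b.x) * (uv.y - c.y)) / denom;
        float w1 = ((c.y - a.y) * (uv.x - c.x) + (a.x - c.x) * (uv.y - c.y)) / denom;
        float w2 = 1f - w0 - w1;

        // The point is inside the triangle only if all three weights are >= 0.
        if (w0 < 0f || w1 < 0f || w2 < 0f) continue;

        // The same weights interpolate position and normal across the corners.
        results.Add(new MeshTextureNode
        {
            localPosition = verts[i0] * w0 + verts[i1] * w1 + verts[i2] * w2,
            normal = (norms[i0] * w0 + norms[i1] * w1 + norms[i2] * w2).normalized
        });
    }
    return results;
}
```

Since denom is computed in uv space, zero-area uv triangles are skipped. A small epsilon on the inside test (e.g. allowing weights down to -0.0001f) helps avoid missed texels that fall exactly on a shared triangle edge.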
We had a similar question over here. There I only gather the position on the mesh. If you want a method that returns both, you should use a struct with two Vector3s (position and normal).
Also related (it's crosslinked from the other answer):
ps: I just fixed the code formatting on those answers. It seems that (once more) due to some migration of UA the code formatting got messed up and there were tons of `&lt;`, `&gt;` and `&amp;` instead of <, > and &. Maybe I've missed some; if you find something, please leave a comment.
Well, the sample code I have in my question is close to this; however, it stopped at the barycentric coordinates bit. It even had the struct you describe. I'll work on implementing this, but your description solves the problem. Thanks for the related questions!
Answer by Kiwasi · Dec 18, 2014 at 03:23 AM
Is it just me or are you simply describing a shader?
Indeed, but I need to update the texture and have it keep a persistent state; there will be fading between the current frame and the previous one. If I can do this with a shader I'd love that solution! If you have a link or suggestion that would help here, that could be handy. I had discarded shaders as not allowing me to persist state; perhaps I was wrong?
No idea, I'm not fully up to speed on shaders and how they work.
Alas, nor am I. At the moment I am exploring a preprocessing route, since an actual solution isn't apparent. Once I get it close I'll update my question with the code; maybe someone will be able to help me refine it further.
I have edited in my current workaround. It misses points and has a series of other issues (mostly, it's hugely inefficient).