RaycastHit.textureCoord --- does the reverse exist?
I have a texture that's mapped to an object. I'm currently using RaycastHit.textureCoord to collect the UV coordinate based on a point on the in-world object, which is great.
However, I'm stuck finding the reverse.
That is to say... I have an image (a world map) that's textured onto a sphere. I'd like a way to click a point on the original 2D texture and see where that UV is mapped on the sphere (x, y, z).
I'm going to guess this is difficult, because a single UV coordinate/pixel could cover an area on the 3D object, but I still hoped there was a way to return a list of points, or even a bounding area on the surface where the points lie.
For what it's worth, I wanted to throw in my mathematical solution (specifically for spheres). It works with the built-in primitive, and could be adapted to other shapes if you revise how the X,Y -> U,V conversion is done.
Note that when converting X,Y -> U,V you should flip the coordinates, i.e. use (1.0f - U) and (1.0f - V).
Also note that you'll need to rotate the theta and phi angles depending on the sphere's orientation (I had to rotate 18 degrees on the Y axis to line up x, y, z properly).
// C#
float t = 2 * Mathf.PI * uv.x - Mathf.PI / 2;
float p = Mathf.PI * uv.y;
float _x = Mathf.Sin(t) * Mathf.Sin(p) * radius + offset_x;
float _z = Mathf.Cos(t) * Mathf.Sin(p) * radius + offset_z;
float _y = -Mathf.Cos(p) * radius + offset_y;
Point_on_sphere_surface = new Vector3(_x, _y, _z);
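Outside Unity, the same forward mapping can be sketched in plain Python (an illustrative translation of the snippet above, not Unity API; `radius` and `offset` are assumed parameters):

```python
import math

def uv_to_sphere_point(u, v, radius=1.0, offset=(0.0, 0.0, 0.0)):
    """Map a UV coordinate on an equirectangular texture to a point on
    a sphere, mirroring the C# snippet above."""
    t = 2.0 * math.pi * u - math.pi / 2.0  # longitude angle (theta)
    p = math.pi * v                        # latitude angle (phi), 0..pi
    x = math.sin(t) * math.sin(p) * radius + offset[0]
    z = math.cos(t) * math.sin(p) * radius + offset[2]
    y = -math.cos(p) * radius + offset[1]
    return (x, y, z)
```

Any returned point is exactly `radius` away from the offset point, which is a quick sanity check that the mapping stays on the sphere's surface.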
Answer by Bunny83 · Feb 09, 2012 at 05:18 AM
It's a bit of work, of course, but possible ;)
First you have to find the triangle in which the point is located. For that you only need the UV coordinates of each triangle, and you do an in-triangle test with the 2D coordinates. Once you have the triangle, you can calculate the barycentric coordinates of the point inside it, and then simply apply those barycentric coordinates to the triangle's vertex positions to get the position on the mesh.
I don't have time for a full example at the moment, but most of what's needed can be found on this site ;)
edit
Just searched for this old answer where I put up a barycentric calculation routine:
All you need to do is go through all triangles, calculate the barycentric coordinates of your point, check whether the point is inside the current triangle, and use the barycentric coordinates to get the position.
Keep in mind that none, one, or several triangles could be mapped to a certain area of the texture. The function below therefore returns an array of mapped positions: it can contain zero, one, or multiple points, all in the mesh's local space, that map to the given UV coordinate.
// C#
using UnityEngine;
using System.Collections.Generic;

public static class MeshExtension
{
    // Barycentric coordinates of point p relative to triangle (v1, v2, v3) in UV space.
    public static Vector3 GetBarycentric(Vector2 v1, Vector2 v2, Vector2 v3, Vector2 p)
    {
        Vector3 B = new Vector3();
        B.x = ((v2.y - v3.y) * (p.x - v3.x) + (v3.x - v2.x) * (p.y - v3.y)) /
              ((v2.y - v3.y) * (v1.x - v3.x) + (v3.x - v2.x) * (v1.y - v3.y));
        B.y = ((v3.y - v1.y) * (p.x - v3.x) + (v1.x - v3.x) * (p.y - v3.y)) /
              ((v3.y - v1.y) * (v2.x - v3.x) + (v1.x - v3.x) * (v2.y - v3.y));
        B.z = 1 - B.x - B.y;
        return B;
    }

    // A point is inside the triangle when all three barycentric coordinates are in [0, 1].
    public static bool InTriangle(Vector3 barycentric)
    {
        return (barycentric.x >= 0.0f) && (barycentric.x <= 1.0f)
            && (barycentric.y >= 0.0f) && (barycentric.y <= 1.0f)
            && (barycentric.z >= 0.0f); // z <= 1.0f follows since x + y + z == 1
    }

    // Returns every local-space point on the mesh that is mapped to the given UV coordinate.
    public static Vector3[] GetMappedPoints(this Mesh aMesh, Vector2 aUVPos)
    {
        List<Vector3> result = new List<Vector3>();
        Vector3[] verts = aMesh.vertices;
        Vector2[] uvs = aMesh.uv;
        int[] indices = aMesh.triangles;
        for (int i = 0; i < indices.Length; i += 3)
        {
            int i1 = indices[i];
            int i2 = indices[i + 1];
            int i3 = indices[i + 2];
            Vector3 bary = GetBarycentric(uvs[i1], uvs[i2], uvs[i3], aUVPos);
            if (InTriangle(bary))
            {
                // Interpolate the triangle's vertex positions with the same weights.
                Vector3 localP = bary.x * verts[i1] + bary.y * verts[i2] + bary.z * verts[i3];
                result.Add(localP);
            }
        }
        return result.ToArray();
    }
}
This is an extension method I've just written for Unity's Mesh class. Just put this static class into your "Standard Assets" folder and you can use it like this:
// C#
Vector3[] mappedPoints = myMesh.GetMappedPoints(uvPos);
// UnityScript (Javascript)
var mappedPoints = myMesh.GetMappedPoints(uvPos);
I've tested it with Unity's default sphere and cube meshes. As expected, on the sphere I get one point on the surface, and on the cube six, since every side of the cube is mapped to the whole texture.
If you need more information about a position, e.g. the normal vector or triangle index, you can modify the function to return an array of structs (maybe just use the RaycastHit structure) so it can return more data per point.
ps. Keep in mind that the returned vectors are in local space. Use transform.TransformPoint() to bring the positions into world space if you need to.
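The barycentric lookup itself is plain 2D math and can be sanity-checked outside Unity. Here is a minimal Python sketch of the same idea for a single triangle (names are illustrative, not Unity API; tuples stand in for Vector2/Vector3):

```python
def barycentric(v1, v2, v3, p):
    # Barycentric coordinates of 2D point p in triangle (v1, v2, v3).
    # Both coordinates share the same denominator (the signed doubled area).
    d = (v2[1] - v3[1]) * (v1[0] - v3[0]) + (v3[0] - v2[0]) * (v1[1] - v3[1])
    bx = ((v2[1] - v3[1]) * (p[0] - v3[0]) + (v3[0] - v2[0]) * (p[1] - v3[1])) / d
    by = ((v3[1] - v1[1]) * (p[0] - v3[0]) + (v1[0] - v3[0]) * (p[1] - v3[1])) / d
    return (bx, by, 1.0 - bx - by)

def mapped_point(uvs, verts, uv):
    # One triangle: if uv falls inside the triangle's UV footprint,
    # interpolate the three 3D vertices with the same weights.
    b = barycentric(uvs[0], uvs[1], uvs[2], uv)
    if min(b) < 0.0:
        return None  # uv is outside this triangle
    return tuple(b[0] * verts[0][i] + b[1] * verts[1][i] + b[2] * verts[2][i]
                 for i in range(3))
```

Running this over every triangle of a mesh, collecting the non-None results, is exactly what the extension method above does.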
Good luck
Thanks for doing all that work! I'll give this a try and see.
Currently I've been using math for it, with something like:
u = r sin(theta) cos(phi)
v = r sin(theta) sin(phi)
and solving for x, y, z.
But I think I messed up, because the picked point always lies correctly on the surface of the sphere, yet one coordinate tends to be off, presumably depending on the quadrant the point lies in, and I haven't had a chance to really pinpoint what's going on.
As far as multiple points go, it should be OK as long as I'm using a stretched texture, not a repeating/tiled one. Then I think all UVs will map to only one pixel.
Thanks again ... I'll add this in and see if it does the trick today, then I can work on my math if/when I feel masochistic.
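For what it's worth, the usual cause of a "correct surface point but one coordinate off in some quadrants" bug is inverting the longitude with asin/acos, which can't distinguish quadrants; atan2 can. A hedged Python sketch of the inverse of the sphere mapping given earlier in this thread (illustrative, assumes the same angle conventions):

```python
import math

def sphere_point_to_uv(x, y, z, radius=1.0):
    # Inverse of the earlier UV -> point-on-sphere mapping.
    # atan2 resolves the correct quadrant for the longitude,
    # which plain acos/asin cannot.
    p = math.acos(max(-1.0, min(1.0, -y / radius)))  # latitude, 0..pi
    t = math.atan2(x, z)                             # longitude, -pi..pi
    u = ((t + math.pi / 2.0) / (2.0 * math.pi)) % 1.0
    v = p / math.pi
    return (u, v)
```

A round trip (UV to point to UV) should reproduce the original coordinates in every quadrant, which is a quick way to test for exactly this class of bug.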
Nice post, and nice answer. I have tried to implement the solution above, but my barycentric coordinate values are extremely high. Have you ever had this problem?
@Bunny83 Do you know if this works with Unity's new UI? I was hoping to draw something on a 3D wall, and then have that drawing show up on a 2D UI element.
This is so useful! I've added the normal calculation in the code below - it'll return a struct PositionAndNormal containing the local position and normal. (EDIT: Edited to work with @Bunny83's comments. Make sure to include this struct in your project. Thanks Bunny83!)
// C#
using UnityEngine;
using System.Collections.Generic;

// http://answers.unity.com/answers/215552/view.html
public static class MeshPositionAndNormal
{
    public struct PositionAndNormal
    {
        public Vector3 position;
        public Vector3 normal;
    }

    public static PositionAndNormal[] GetMappedPoints(this Mesh aMesh, Vector2 aUVPos)
    {
        List<PositionAndNormal> result = new List<PositionAndNormal>();
        Vector3[] verts = aMesh.vertices;
        Vector2[] uvs = aMesh.uv;
        Vector3[] normals = aMesh.normals;
        int[] indices = aMesh.triangles;
        for (int i = 0; i < indices.Length; i += 3)
        {
            int i1 = indices[i];
            int i2 = indices[i + 1];
            int i3 = indices[i + 2];
            Barycentric bary = new Barycentric(uvs[i1], uvs[i2], uvs[i3], aUVPos);
            if (bary.IsInside)
            {
                // get local position
                Vector3 localP = bary.Interpolate(verts[i1], verts[i2], verts[i3]);
                // get and normalize the interpolated vertex normal
                Vector3 localN = bary.Interpolate(normals[i1], normals[i2], normals[i3]).normalized;
                result.Add(new PositionAndNormal
                {
                    position = localP,
                    normal = localN
                });
            }
        }
        return result.ToArray();
    }
}
Example usage - attach to a GameObject, assign the GameObject's mesh and another GameObject to use as a reference point, then move the Vector2 pos in the editor to see the reference point move and align with the normal. Note that this has only been tested with very simple meshes and UVs. (EDIT: edited to work with @Bunny83's comments. Thanks Bunny83!)
using System.Collections;
using System.Collections.Generic;
using UnityEngine;

namespace Sandbox
{
    [ExecuteInEditMode]
    public class MeshSandbox : MonoBehaviour
    {
        public GameObject box;
        public MeshFilter mesh;
        public Vector2 pos = new Vector2(0, 0);

        public void PlaceBox()
        {
            if (mesh == null || box == null) return;
            MeshPositionAndNormal.PositionAndNormal[] points = mesh.sharedMesh.GetMappedPoints(pos);
            if (points.Length > 0)
            {
                // Both position and normal come back in the mesh's local space,
                // so transform them into world space before placing the box.
                box.transform.position = transform.TransformPoint(points[0].position);
                box.transform.LookAt(box.transform.position + transform.TransformDirection(points[0].normal));
            }
        }

        private void Update()
        {
            PlaceBox();
        }
    }
}
There are some things that are a bit strange. First, you calculate the geometric center / centroid of the triangle, but it isn't used anywhere. Next, there's no need to convert all three vertices to world space: just calculate the normal in local space and use TransformDirection on the normal at the end. You return the local-space point but the world-space normal, which seems more than strange. Either return the local-space point and normal (which would make the most sense) or return both in world space.
Your sample code in your other answer is dangerous: you don't check how many points are actually returned. It's possible that a part of the texture isn't mapped at all, in which case you get an empty array back.
Instead of calculating the triangle's surface normal, you may want to use the interpolated vertex normal, which gives a much smoother transition between triangles on "curved" surfaces like a sphere.
The vertex normal can be interpolated just the same way as the position (or any other vertex attribute):
// at the top
Vector3[] normals = aMesh.normals;
// [ ... ]
Vector3 localN = bary.x * normals[i1] + bary.y * normals[i2] + bary.z * normals[i3];
localN = localN.normalized;
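That interpolation is pure arithmetic and can be sanity-checked outside Unity; a minimal Python sketch (illustrative names, tuples standing in for Vector3):

```python
import math

def interpolate_normal(b, n1, n2, n3):
    # Weight the three vertex normals by the barycentric coordinates,
    # then renormalize: the weighted sum of unit vectors is generally
    # not unit length itself.
    n = [b[0] * n1[i] + b[1] * n2[i] + b[2] * n3[i] for i in range(3)]
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```

Halfway along an edge between two perpendicular vertex normals, for example, this yields the unit vector pointing exactly between them, which is what gives the smooth shading-style transition across a curved surface.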
Finally, the name "UvTransformData" is misleading: it doesn't contain any UV data, just a position and a normal vector, both of which are in local or world space rather than UV space.
Great points! I'll edit those accordingly - thanks! :)
Answer by Berenger · Feb 09, 2012 at 04:47 AM
I never thought about that, so this is total improvisation. You could create an array of UV points sorted in a clever way that lets you find the closest point when you hit the 2D texture; that would give you the 3D position of the corresponding vertex (more likely, a triangle). No idea what that's worth, but I guess you'd rather have more answers than not enough!
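A crude sketch of that idea (Python, illustrative names; a spatial index such as a k-d tree over the UVs would be the "clever sorting" for larger meshes):

```python
def nearest_vertex_by_uv(uvs, verts, uv):
    # Linear scan: find the vertex whose UV is closest to the clicked
    # texture coordinate and return its 3D position. Approximate, since
    # the true hit usually falls inside a triangle, not on a vertex.
    best = min(range(len(uvs)),
               key=lambda i: (uvs[i][0] - uv[0]) ** 2 + (uvs[i][1] - uv[1]) ** 2)
    return verts[best]
```

Unlike the barycentric approach in the accepted answer, this only snaps to vertices, but it is simple and never returns an empty result.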