Why does my Mesh Collider RayCastHit only return a texturecoord on Unity Remote?
I have searched the answers here for someone with this problem, but I have been unable to find anyone facing the same issue. I have no idea why what I am doing works with Unity Remote but stops working when I compile to iOS.
Basically I am trying to do a simple "touch the screen to make a mark on the object" by editing the texture at the touched UV coordinates. My plan was to do a raycast, then use the textureCoord of the RaycastHit to modify the texture at the right place. That plan works fine when I test with Unity Remote.
Here is my code, first on the camera and then on the object:
// Camera code, to capture screen touches
function Update () {
    for (var i = 0; i < Input.touchCount; ++i) {
        var hit : RaycastHit;
        // Cast a ray from the camera through the touch position
        var ray = camera.ScreenPointToRay (Input.GetTouch(i).position);
        if (Physics.Raycast (ray, hit, 100)) {
            // Pass the hit info on to the object that was touched
            hit.transform.gameObject.SendMessage("objectTouched", hit);
        }
    }
}
And on the object:
public void objectTouched(RaycastHit hit) {
    Vector2 texturepos = hit.textureCoord;
}
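(For context, the marking step that would come after getting texturepos is roughly the following. This is just a sketch: it assumes the material's main texture is a readable Texture2D, and Color.red stands in for whatever mark colour I end up using.)

// Sketch of the planned marking step, not the final code
Texture2D tex = (Texture2D)GetComponent<Renderer>().material.mainTexture;
int x = (int)(texturepos.x * tex.width);
int y = (int)(texturepos.y * tex.height);
tex.SetPixel(x, y, Color.red);  // draw the mark at the touched UV
tex.Apply();                    // push the modified pixels back to the GPU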
Now this code works fine and dandy and gets the correct texture UV coordinates when I am using Unity Remote. I can also do a Debug.Log to confirm that hit.collider is indeed a Mesh Collider. However, when I compile and run it on the device, all I get for hit.textureCoord is Vector2(0,0), and hit.collider is null.
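(The Debug.Log check is just something along these lines inside objectTouched, shown here as a sketch:)

Debug.Log(hit.collider);      // shows the Mesh Collider in Unity Remote, null on the device
Debug.Log(hit.textureCoord);  // correct UV in Unity Remote, (0.0, 0.0) on the device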
Why would these work fine using Unity Remote, but return null and zero values when run on the actual device?
Thank you in advance.