iPhone Droid Input.GetTouch(i).position
Either Unity3D is full of bugs or I am missing something simple. The texture position and the touch position do not remotely line up, so my events never fire. I am using a simple pair of 32x32 images to invoke separate events, but the input position seems to be miles away even when I do a Contains check.
NOTE: I am not using Event.current.mousePosition because it handles a single input event, and I am using two fingers on a mobile device, not one. Also, GUITexture is not an option here since I'm using the Ultimate FPS Camera and it hates GUITextures.
using UnityEngine;
using System.Collections;

public class touch : MonoBehaviour
{
    public Texture2D jumpicon;
    public Texture2D forwardicon;
    public GUISkin White;
    public int jumpxpos;
    public int jumpypos;
    public int forwardxpos;
    public int forwardypos;

    void OnGUI()
    {
        Rect r = new Rect(Screen.width - jumpxpos, Screen.height - jumpypos, 32, 32);
        GUI.DrawTexture(r, jumpicon);
        Rect rforward = new Rect(Screen.width - forwardxpos, Screen.height - forwardypos, 32, 32);
        GUI.DrawTexture(rforward, forwardicon);

        if (Input.touches.Length > 0)
        {
            int i = 0;
            while (i < Input.touchCount)
            {
                Vector2 fingerPos = Input.GetTouch(i).position;
                if (r.Contains(fingerPos))
                {
                    Debug.Log("jump");
                }
                if (rforward.Contains(fingerPos))
                {
                    Debug.Log("forward");
                }
                ++i;
            }
        }
    }
}
It's probably not a good idea to handle touch events in OnGUI, especially as that code is executed multiple times per frame for different events. I'm not sure that's causing your problem, since you are explicitly setting the rectangles, but given that Input.mousePosition doesn't work in OnGUI, it's reasonable to suspect the touch code is unreliable there too.
Answer by urawhat · Jun 02, 2012 at 12:36 AM
Ouch. Been searching the web and just not finding the correct way. Any links that might point me in the correct path?
Hang on let me dig something out - I was searching when you posted this.
Just one thing - and don't worry, everyone does it the first time - but don't post comments as answers :) In Unity Answers, "Answer" means "Solution", not "Reply". There's an "add new comment" link hidden on the right of the screen.
Answers like this get deleted by someone - or possibly turned into a comment depending on how busy the person doing it is :)
Answer by whydoidoit · Jun 02, 2012 at 12:42 AM
So actually I think your detection code would be fine if it weren't in OnGUI. If you store the rectangles as private fields rather than local variables, you can draw your textures in OnGUI and detect your collisions in Update().
Alternatively, you could parent real objects with colliders (or GUITextures) to the main camera or a secondary active camera, then detect touches by raycasting against those colliders.
There is this code that makes OnMouseDown work for multiple fingers on real objects with colliders:
// OnTouchDown.cs
// Allows "OnMouseDown()" events to work on the iPhone.
// Attach to the main camera.
using UnityEngine;
using System.Collections;
using System.Collections.Generic;

public class OnTouchDown : MonoBehaviour
{
    void Update()
    {
        // Simulate OnMouseDown for each touch that began this frame.
        RaycastHit hit = new RaycastHit();
        for (int i = 0; i < Input.touchCount; ++i)
        {
            if (Input.GetTouch(i).phase == TouchPhase.Began)
            {
                // Construct a ray from the current touch coordinates
                Ray ray = Camera.main.ScreenPointToRay(Input.GetTouch(i).position);
                if (Physics.Raycast(ray, out hit))
                {
                    hit.transform.gameObject.SendMessage("OnMouseDown");
                }
            }
        }
    }
}
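For the SendMessage call above to reach anything, the touched object needs a collider and a component that defines OnMouseDown. A minimal receiver might look like this (the JumpReceiver name and the Debug.Log body are just illustrative, not part of the original code):

```csharp
using UnityEngine;

// Attach to any object that has a collider; OnTouchDown's
// SendMessage("OnMouseDown") will invoke this method.
public class JumpReceiver : MonoBehaviour
{
    void OnMouseDown()
    {
        // Runs for real mouse clicks in the editor and for the
        // simulated touch events sent by OnTouchDown on device.
        Debug.Log("jump object touched");
    }
}
```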
Answer by urawhat · Jun 02, 2012 at 10:12 AM
I will have to mess around with this. GUITexture is not an option, but if you used DrawTexture, do you think it would work? The hardest part is ensuring the event only fires when the finger is actually on the texture.
Answer by rhodnius · Mar 02, 2013 at 04:50 AM
Well, I know this is a little old post, but I think it is worth answering, since I searched a lot to find this answer.
Your first code works fine; you just need to move the detection code out of OnGUI, as whydoidoit suggested. But there is one little thing missing: you need to subtract fingerPos.y from Screen.height. (After some searching: the touch position's y axis is flipped relative to GUI coordinates, which measure y from the top of the screen.) The code would look something like this:
using UnityEngine;
using System.Collections;

public class touch : MonoBehaviour
{
    public Texture2D jumpicon;
    public Texture2D forwardicon;
    public GUISkin White;
    public int jumpxpos;
    public int jumpypos;
    public int forwardxpos;
    public int forwardypos;

    Rect r;
    Rect rforward;

    void OnGUI()
    {
        r = new Rect(Screen.width - jumpxpos, Screen.height - jumpypos, 32, 32);
        GUI.DrawTexture(r, jumpicon);
        rforward = new Rect(Screen.width - forwardxpos, Screen.height - forwardypos, 32, 32);
        GUI.DrawTexture(rforward, forwardicon);
    }

    void Update()
    {
        if (Input.touches.Length > 0)
        {
            int i = 0;
            while (i < Input.touchCount)
            {
                Vector2 fingerPos = Input.GetTouch(i).position;
                // Touch y runs bottom-up; Rect/GUI y runs top-down, so flip it.
                fingerPos.y = Screen.height - fingerPos.y;
                if (r.Contains(fingerPos))
                {
                    Debug.Log("jump");
                }
                if (rforward.Contains(fingerPos))
                {
                    Debug.Log("forward");
                }
                ++i;
            }
        }
    }
}
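One caveat with this Update() version: Rect.Contains is true on every frame the finger stays inside the rectangle, so "jump" will log repeatedly while the button is held. If you want the event to fire once per press, you can additionally check the touch phase. A small sketch of that tweak, assuming the same r rectangle as above:

```csharp
// Inside the while loop, fire only on the frame the touch starts:
Touch t = Input.GetTouch(i);
Vector2 fingerPos = t.position;
fingerPos.y = Screen.height - fingerPos.y; // flip y into GUI space

if (t.phase == TouchPhase.Began && r.Contains(fingerPos))
{
    Debug.Log("jump"); // logs once per press instead of every frame
}
```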
Hope this helps someone.
Rhod,