Input.mousePosition is confused by multiple touches
I made a virtual joystick for a space game, plus a button to fire lasers. With one finger everything is fine: the fire button fires lasers, and the joystick drags on mouse down and releases on mouse up to steer the spaceship. The problem is that when I put two fingers on the mobile screen at once, the reported position is the midpoint between them, so the joystick sticks to that middle position instead of following the finger that is actually controlling it. The input I use is Input.mousePosition.
I also experimented with Input.GetTouch(0).deltaPosition and Input.GetTouch(0).position, but neither of them reacted to a finger placed on the screen (maybe I did something wrong):
void Update () {
    print(Input.touchCount);
    if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began) {
        print("click position " + Input.GetTouch(0).deltaPosition);
    }
}
If anyone could help me get this working, I would be very thankful.
I am trying this on a Samsung Galaxy S4.
The thing is that something may not be configured (it doesn't work and I don't know why). When I print touchCount with this sample C# script:
public class Test : MonoBehaviour {
    //...
    void Update () {
        print(Input.touchCount);
    }
}
It prints just "0" on every frame, even when I am touching the screen on the mobile device, and not even a mouse click increases touchCount. How is this possible?
This script is attached to the MainCamera, and other scripts on it work normally (I even tried making a new, clean project to test this, with the same result).
Any tip would help me a lot, because I am stuck :( Thanks for any help.
Please edit your original question or make a comment instead of wrongly posting an answer.
Are you trying this via Unity Remote? That probably won't work; you have to test on the physical device.
Answer by fherbst · Sep 01, 2013 at 05:50 PM
If you want to use multiple fingers, use something like
foreach (Touch t in Input.touches)
{
    // check if touch t fulfills some conditions (e.g. is over the joystick) and process it
}
or
for (int i = 0; i < Input.touchCount; i++)
{
    Touch t = Input.GetTouch(i);
    // check if touch t fulfills some conditions (e.g. is over the joystick) and process it
}
GetTouch(0) will obviously always give you just the first touch. Your code otherwise looks fine, so maybe you made a mistake somewhere else. What platform are you running this on?
EDIT: To check whether the system supports touch, use Input.multiTouchEnabled.
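Building on the loops above, one way to keep the joystick tied to the right finger is to remember its fingerId (which stays constant for the lifetime of a touch) rather than its index in Input.touches. A minimal sketch, where joystickArea is an assumed placeholder for the joystick's screen-space rectangle:

```csharp
using UnityEngine;

public class JoystickTouchTracker : MonoBehaviour {
    // Placeholder: the screen-space rectangle the joystick occupies.
    public Rect joystickArea = new Rect(0, 0, 200, 200);

    // fingerId of the touch currently controlling the joystick, or -1 if none.
    private int joystickFingerId = -1;

    void Update () {
        foreach (Touch t in Input.touches) {
            // A new touch that starts over the joystick claims it.
            if (t.phase == TouchPhase.Began && joystickFingerId == -1
                && joystickArea.Contains(t.position)) {
                joystickFingerId = t.fingerId;
            }
            if (t.fingerId == joystickFingerId) {
                if (t.phase == TouchPhase.Ended || t.phase == TouchPhase.Canceled) {
                    joystickFingerId = -1; // finger lifted: release the joystick
                } else {
                    // Drive the joystick from t.position here; other touches
                    // (e.g. a finger on the fire button) are simply ignored.
                }
            }
        }
    }
}
```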
It was my mistake: I thought that a mouse click increments touchCount.
And by the way, my solution is to check whether I am running on a PC or on a tablet/smartphone-like platform when handling input (I don't know if this is the proper way, but it is the first working approach I found for testing both on PC and on a smartphone):
if (Input.touchCount == 0) {
    if (Input.GetMouseButtonDown(0)) {
        // code for mouse - PC
    }
} else {
    foreach (Touch newTouch in Input.touches) {
        // code for touchscreen
    }
}
Thank you very much for your help and time, all the best.
For platform-dependent code, follow this link:
http://docs.unity3d.com/Documentation/Manual/PlatformDependentCompilation.html
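The platform checks above can also be done at compile time with Unity's platform-dependent compilation directives, so the unused branch is not even compiled into the build. A small sketch using the standard UNITY_EDITOR and UNITY_STANDALONE symbols:

```csharp
void Update () {
#if UNITY_EDITOR || UNITY_STANDALONE
    // Mouse input path: running in the editor or a desktop build.
    if (Input.GetMouseButtonDown(0)) {
        // handle click at Input.mousePosition
    }
#else
    // Touch input path: running on a mobile build.
    foreach (Touch t in Input.touches) {
        // handle touch at t.position
    }
#endif
}
```

Note that a runtime check like Input.touchCount (or Input.multiTouchEnabled, as suggested above) still has its place: a compile-time directive cannot react to a touch-capable Windows device, for example.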
Good, sounds like it's working now. Please accept an answer to help others as well.
meat5000: that's also OK in principle, but since multitouch is not a "platform feature" but a general feature, I'd suggest using Input.multiTouchEnabled (feature-dependent vs. platform-dependent).