Input.mousePosition equivalent to first finger touch?
Is Unity's Input.mousePosition equivalent to the first finger touch on mobile and touchscreen devices?
Answer by Firemaw · May 27, 2015 at 12:29 PM
On a touch device:
The above answers are correct only if you are touching with one finger at a time. When there is more than one touch active at the same time, Input.mousePosition is actually the average of all touch positions in the Input.touches array.
Another difference is that when there are no touches, you might expect Input.mousePosition to return zero, but it doesn't. It remains at the position where the last touch or touches were released (as if you "let go" of the mouse).
All of this said, you should be using platform-specific code that references either Input.mousePosition or the Input.touches array for your logic (use the Platform Dependent Compilation defines or the Application.platform variable).
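A minimal sketch of that platform split, using the Platform Dependent Compilation defines mentioned above (the wrapper method itself is a hypothetical helper, not a Unity API):

```csharp
using UnityEngine;

public class PointerInput : MonoBehaviour
{
    // Hypothetical helper: returns the current pointer position in
    // screen pixels, reading touches on mobile and the mouse elsewhere.
    public static Vector2 GetPointerPosition()
    {
#if UNITY_IOS || UNITY_ANDROID
        // On touch devices, read the first active touch directly
        // instead of relying on mouse simulation.
        if (Input.touchCount > 0)
            return Input.GetTouch(0).position;
        return Vector2.zero; // no finger down
#else
        // Input.mousePosition is a Vector3; Unity implicitly
        // converts it to Vector2, dropping z.
        return Input.mousePosition;
#endif
    }
}
```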
How about if I want to have both on the same app? Some devices have touch screens and a mouse now -- I'd like to have both active, but I do NOT want the first finger touch to activate Input.GetAxis("Mouse X") or Y.
Is that possible?
Answer by GoSuNeem · Oct 30, 2011 at 01:10 AM
If you are talking about Touch inputs, check the following links.
http://unity3d.com/support/documentation/ScriptReference/Touch.html
http://unity3d.com/support/documentation/ScriptReference/Input.html
To pull the info of the first touch, use the fingerId to check whether it's the first touch. To get the positions, use the touch position: try Input.GetTouch(0).position.
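For instance, a quick sketch that logs the first active touch each frame (assuming the legacy Input Manager):

```csharp
using UnityEngine;

public class FirstTouchLogger : MonoBehaviour
{
    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch first = Input.GetTouch(0);
            // position is a Vector2 in screen pixels
            Debug.Log("First touch at " + first.position +
                      " (fingerId " + first.fingerId + ")");
        }
    }
}
```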
Good luck!
Input.mousePosition appears to be equivalent to the first touch point. Try it!
Oh whoops! Sorry. Thought you were asking about Mouse input -> Touch, not the other way around.
Anyway, yes, Input.mousePosition is the same as touch.position.
Button click-downs are Input.GetMouseButton(0):
http://unity3d.com/support/documentation/ScriptReference/Input.GetMouseButton.html
Input.mousePosition is the position of the mouse (x, y):
http://unity3d.com/support/documentation/ScriptReference/Input-mousePosition.html
Sorry about the confusion.
it looks like Input.mousePosition actually returns a Vector3 with z coordinate 0, while Input.GetTouch(0).position returns just a Vector2...
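That difference is harmless in practice, since Unity defines implicit conversions between Vector2 and Vector3 (z is dropped, or filled with 0). A small sketch of both directions:

```csharp
using UnityEngine;

public class PositionTypes : MonoBehaviour
{
    void Update()
    {
        Vector3 mousePos = Input.mousePosition; // (x, y, 0)
        Vector2 mouseAs2D = Input.mousePosition; // implicit Vector3 -> Vector2, z dropped

        if (Input.touchCount > 0)
        {
            Vector2 touchPos = Input.GetTouch(0).position; // already a Vector2
            Vector3 touchAs3D = touchPos;                  // implicit Vector2 -> Vector3, z = 0
        }
    }
}
```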
Answer by gaps · Feb 10, 2014 at 09:12 PM
Is using the mouse input "equivalent" to using the first finger touch input? When considering a touch: yes; when considering the actual mouse: no.
The first touch simulates the mouse input, so if you use Input.mousePosition, it will work both with the actual mouse and the first touch. But the opposite does not happen, i.e. if you are just handling the touches, the mouse won't simulate that, so your code will only work on touch devices.
EDIT: To answer the comments, you can use Input.simulateMouseWithTouches = false; to disable the mouse simulation with touches (it is enabled by default).
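A minimal sketch of handling mouse and touch independently once simulation is off (the property and input calls are from the Unity API; the two handler methods are placeholders):

```csharp
using UnityEngine;

public class DualInput : MonoBehaviour
{
    void Awake()
    {
        // Stop the first touch from driving Input.mousePosition,
        // Input.GetMouseButton and Input.GetAxis("Mouse X"/"Mouse Y").
        Input.simulateMouseWithTouches = false;
    }

    void Update()
    {
        // Touches now come only from a real touchscreen...
        for (int i = 0; i < Input.touchCount; i++)
            HandleTouch(Input.GetTouch(i));

        // ...and mouse values only from a real mouse.
        if (Input.GetMouseButton(0))
            HandleMouse(Input.mousePosition);
    }

    void HandleTouch(Touch t) { /* placeholder */ }
    void HandleMouse(Vector3 pos) { /* placeholder */ }
}
```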
Any idea how to STOP this from happening? Googling for an answer is terrible, most of the results are questions on how to get touch input working :)
But seriously -- this is frustrating me. I want both mouse AND touch inputs to work on the same game at the same time, for devices that have both.
However, the first touch always activates Input.GetAxis("Mouse X") & Y. I can't see how to turn this off for touch.
@sfbaystudios me too, did you ever find a way to tell them apart?
Input.simulateMouseWithTouches = false;
disables the default behaviour.