Touch delta distance to be uniform, independent of the pixel density
I am trying to create a scene with touch input.
I am detecting the touch input for swipes (horizontal/vertical) by remembering the mouse position when the button goes down (Input.GetMouseButtonDown(0)) and then calculating the distance to the position where the button is released (Input.GetMouseButtonUp(0)), roughly as in the sketch below.
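This is a simplified sketch of what my script does; the class and field names are just placeholders:

```csharp
using UnityEngine;

public class SwipeDetector : MonoBehaviour
{
    private Vector3 downPosition;

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Remember where the press started (pixel coordinates).
            downPosition = Input.mousePosition;
        }

        if (Input.GetMouseButtonUp(0))
        {
            // Distance covered between press and release, in pixels.
            float swipeDistance = (Input.mousePosition - downPosition).magnitude;
            Debug.Log("Swipe distance (pixels): " + swipeDistance);
        }
    }
}
```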
My problem is that, for the same physical swipe of the finger, the difference between the two positions is smaller when testing on a low pixel density touch device than on a high pixel density device.
I know that the positions I read from Input.mousePosition are in pixel coordinates, and hence the problem: for each centimetre of swipe on the device, more pixels are traversed on a high pixel density screen. In other words, the swipe distances are pixel density dependent.
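To put some illustrative numbers on it: 1 cm is about 0.39 inches, so a 1 cm swipe covers roughly 0.39 × 160 ≈ 63 pixels on a ~160 dpi screen, but roughly 0.39 × 480 ≈ 189 pixels on a ~480 dpi screen (example densities chosen just for illustration).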
As a side note: the reason I am using GetMouseButton rather than GetTouch is that I also need to test this on my laptop with a mouse. I assume that if I had used GetTouch instead of GetMouseButton, I would have faced the same problem... right?
Can I work around this so that I get a uniform distance (not necessarily in pixel coordinates) for the same physical swipe, regardless of the pixel density of the device?
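What I have in mind is something along these lines, assuming Screen.dpi actually reports a usable value on the target device (I understand it can return 0 when the density is unknown):

```csharp
// Inside the GetMouseButtonUp(0) branch above: convert the pixel delta into
// a physical distance so it no longer depends on the screen's pixel density.
float pixelDistance = (Input.mousePosition - downPosition).magnitude;
float dpi = Screen.dpi > 0f ? Screen.dpi : 160f; // fall back to a guessed density if unknown
float swipeInches = pixelDistance / dpi;         // same finger movement -> roughly the same value
```

But I am not sure how reliable Screen.dpi is across devices, which is why I am asking.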
Thanks :)