Detecting a GameObject interaction in a portable fashion
Hi guys,
I was reading through Porting a Project Between Platforms and decided I wanted to give it a try in a simple project.
In my test scene, I have a cube and I'd like to detect when the user interacts with it by double clicking on it. The catch is that I want to be able to detect and handle this in a portable fashion, i.e. I want both mouse clicks and double taps to work properly without having to dump platform-specific code into the cube script.
Let's say I'm abstracting the input away via an InputManager class, as in the aforementioned article. I want this class to detect a double click/tap and fire the appropriate event that my game object can then listen for.
I know I can use Input.GetTouch() and Physics.Raycast to figure out which object was tapped if I'm on a touch-enabled device, but how would I do this for simple mouse interaction? As far as I can tell, there's no way of getting the raw mouse coordinates out so that I can perform a raycast with them. That means I'm stuck handling mouse events via, e.g., MonoBehaviour.OnMouseDown(), which won't work for touch-enabled devices, and my input handling is now effectively split across multiple components.
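For reference, the touch half of what I have in mind looks roughly like this (just a sketch; InputManager and OnObjectDoubleTapped are my own placeholder names, and I'm aware tapCount may behave differently across platforms):

```csharp
using UnityEngine;

// Sketch of the touch-only detection path described above.
public class InputManager : MonoBehaviour
{
    public delegate void DoubleTapHandler(GameObject target);
    public static event DoubleTapHandler OnObjectDoubleTapped;

    void Update()
    {
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0);
            // tapCount is maintained by the OS (iOS reports 2 on a
            // double tap; other platforms may always report 1).
            if (touch.phase == TouchPhase.Began && touch.tapCount == 2)
            {
                Ray ray = Camera.main.ScreenPointToRay(touch.position);
                RaycastHit hit;
                if (Physics.Raycast(ray, out hit) && OnObjectDoubleTapped != null)
                    OnObjectDoubleTapped(hit.collider.gameObject);
            }
        }
    }
}
```

My cube script would then subscribe to OnObjectDoubleTapped and check whether the target is itself. The missing piece is feeding mouse clicks through the same raycast.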
I'm brand new to Unity, so my apologies if this is a silly question. :)
Answer by tanoshimi · May 04, 2014 at 08:58 AM
You can use Input.mousePosition to get the "raw" mouse coordinates: https://docs.unity3d.com/Documentation/ScriptReference/Input-mousePosition.html
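With that, a mouse double click can go through the same raycast path as a double tap. A minimal sketch (the 0.3 s threshold and the event name are illustrative, not from the docs):

```csharp
using UnityEngine;

// Sketch: detect a mouse double click and raycast at Input.mousePosition.
public class MouseDoubleClickDetector : MonoBehaviour
{
    public delegate void DoubleClickHandler(GameObject target);
    public static event DoubleClickHandler OnObjectDoubleClicked;

    float lastClickTime = -1f;
    const float DoubleClickThreshold = 0.3f; // illustrative value

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // Two clicks within the threshold count as a double click.
            if (Time.time - lastClickTime < DoubleClickThreshold)
            {
                Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
                RaycastHit hit;
                if (Physics.Raycast(ray, out hit) && OnObjectDoubleClicked != null)
                    OnObjectDoubleClicked(hit.collider.gameObject);
            }
            lastClickTime = Time.time;
        }
    }
}
```

Since Input.mousePosition and Touch.position are both screen-space coordinates, the raycast step is identical for both input types, so you can keep a single code path in your InputManager.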
Wow, don't know how I missed that... I shall give that a try! :)