Oculus Touch controller registered as mouse input by Unity - why?
[Please note I have also posted the following message on the Unity Forum and Oculus Forum - if either of those provide answers, I will update this post as well]
Hi everyone,
I've been working on a project in Unity where one person uses the Oculus HMD + Touch controllers to interact with the virtual environment, while another person monitors them on a PC monitor. The latter uses a (self-made) interface to enable / disable certain components for the person in VR; this interface is made with Unity's own UI elements (Buttons, Input Fields, Sliders, etc.).
For some reason, one button on the Oculus Touch controllers - the B button on the right controller - simultaneously functions as mouse input. To be more precise: when the mouse cursor on the PC is hovering over a UI component (e.g. an Input Field) and the user in VR presses the B button on the right Touch controller, the UI component is activated as if it had been clicked with the mouse.
My question: has anyone encountered this before? How can I disable this?
To give some further information on my predicament:
I am using Unity 2017.2.0f3
I am using the OpenVR / SteamVR for Unity plug-in, as the project needs to be compatible with both the HTC Vive and the Oculus Rift. Oculus is supported by this plug-in: it automatically recognizes the Oculus Rift and Touch controllers when they are connected to the computer while the application is running.
Input from the Oculus Touch controllers is read by the aforementioned OpenVR plug-in - in particular, scripts such as "SteamVR_TrackedController" read whatever input they receive and interpret it as input from an HTC Vive controller. This works perfectly fine despite the Touch controllers being treated as Vive controllers - for example, if the Thumbstick is touched, OpenVR will register it as a touch on the Vive controller's Pad; if it is clicked, it will register a Pad press.
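For context, this is roughly how that input is read - a minimal sketch against the SteamVR 1.x plug-in's SteamVR_TrackedController events (the event and argument names below are as I recall them from that plug-in version; verify against your copy of SteamVR_TrackedController.cs):

```csharp
using UnityEngine;

// Attach next to SteamVR_TrackedController on a controller object.
// On an Oculus Touch controller, the Thumbstick is reported as the Vive Pad.
public class PadInputLogger : MonoBehaviour
{
    private SteamVR_TrackedController controller;

    private void OnEnable()
    {
        controller = GetComponent<SteamVR_TrackedController>();
        controller.PadTouched += OnPadTouched;   // Thumbstick touched -> Pad touch
        controller.PadClicked += OnPadClicked;   // Thumbstick clicked -> Pad press
    }

    private void OnDisable()
    {
        controller.PadTouched -= OnPadTouched;
        controller.PadClicked -= OnPadClicked;
    }

    private void OnPadTouched(object sender, ClickedEventArgs e)
    {
        Debug.Log("Pad touched at " + e.padX + ", " + e.padY);
    }

    private void OnPadClicked(object sender, ClickedEventArgs e)
    {
        Debug.Log("Pad clicked");
    }
}
```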
Any help would be very welcome! Thanks in advance!
Answer by Nesse_M · Oct 22, 2018 at 02:22 PM
It took a while, but I discovered what was causing it. To help anyone who has a similar problem:
In relation to UI elements, an EventSystem will allow controller input to interact with Canvases if the "Send Navigation Events" checkbox on the EventSystem component is ticked. I have no idea why the B button of the right Oculus Touch is specifically the only button registered as input in my case, but ignoring that, you can simply set Send Navigation Events to false to prevent any controller input from changing UI elements.
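If you prefer to do this from code rather than ticking the checkbox in the Inspector, Unity exposes the same setting as the EventSystem.sendNavigationEvents property (the component name DisableUINavigation below is just an illustrative name, not part of Unity):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Disables UI navigation events so controller / keyboard navigation
// can no longer activate Buttons, Input Fields, Sliders, etc.
public class DisableUINavigation : MonoBehaviour
{
    private void Start()
    {
        EventSystem.current.sendNavigationEvents = false;
    }
}
```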
Also see https://docs.unity3d.com/ScriptReference/EventSystems.EventSystem.html and https://docs.unity3d.com/ScriptReference/EventSystems.EventSystem-sendNavigationEvents.html