Do I need to re-invent the Unity UI system to implement UI in 3D?
I am building a VR modelling app with Unity that involves a lot of direct manipulation in VR/3D. To support that, I have created an InputManager and a SelectionManager, plus a bunch of classes/interfaces for 3D UI (handles, buttons, sliders, 3D layout, etc.). I suddenly realized that my design resembles Unity's UI event system, with its Selectable class and interfaces: IDeselectHandler, IEventSystemHandler, IMoveHandler, IPointerDownHandler, IPointerEnterHandler, IPointerExitHandler, IPointerUpHandler, ISelectHandler. And I realized how stupid it is to re-invent the wheel.
But Selectable derives from UIBehaviour, and it looks like it's designed for Canvas/UI elements and would not work well if the objects are not Unity UI objects. I have not fully confirmed this, though.
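For what it's worth, the event interfaces themselves don't appear to require Selectable or a Canvas: from what I can tell, they fire on plain 3D objects as long as the scene has an EventSystem, the camera has a PhysicsRaycaster, and the object has a Collider. A minimal sketch of what I've been testing (the class name GrabHandle is mine):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Plain 3D object (no Canvas, no Selectable) reacting to EventSystem pointer
// events. Requires: an EventSystem in the scene, a PhysicsRaycaster on the
// camera, and a Collider on this GameObject.
[RequireComponent(typeof(Collider))]
public class GrabHandle : MonoBehaviour,
    IPointerEnterHandler, IPointerExitHandler,
    IPointerDownHandler, IPointerUpHandler
{
    public void OnPointerEnter(PointerEventData eventData)
    {
        Debug.Log("hover enter: " + name);
    }

    public void OnPointerExit(PointerEventData eventData)
    {
        Debug.Log("hover exit: " + name);
    }

    public void OnPointerDown(PointerEventData eventData)
    {
        Debug.Log("grabbed: " + name);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        Debug.Log("released: " + name);
    }
}
```

This works with the standard input modules for mouse input; for VR controllers I'd presumably still need a custom BaseInputModule that raycasts from the controller pose instead of the screen, which is where my doubt comes in.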
So has anyone built similar stuff and used the Unity UI System? Or is this just the way it is and I need to build my own?
Note that I use the Unity UI system for traditional menus and buttons (world space, of course). And my InputManager needs to check every incoming event to see whether it's a Unity UI event or one of mine, even though the semantics are practically identical. 'Tis ugly.
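Roughly, the ugly part looks like this (a sketch; DispatchToMyHandles is a placeholder for my own routing):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch of the ugly split: let Unity UI consume the event when the pointer
// is over a Canvas element, otherwise route it through my own 3D-UI dispatch.
public class InputManager : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            // True when the pointer is over a Unity UI (EventSystem) object.
            if (EventSystem.current != null &&
                EventSystem.current.IsPointerOverGameObject())
            {
                return; // Unity UI handles it.
            }

            DispatchToMyHandles(); // placeholder for my own selection pipeline
        }
    }

    void DispatchToMyHandles()
    {
        // Raycast into the scene and invoke my own ISelectHandler-style
        // interfaces here (details omitted).
    }
}
```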