Tracking movement / orientation of handheld object relative to static “beacons” in room or relative to smartphone (worn as headset)
I'm working on a fire-safety training app built in AR (Augmented Reality) using Unity and Vuforia*.
The users will be wearing Google Cardboard* headsets (but able to view reality around them through a cutout). They are able to see "virtual machinery" which catches "virtual fires" - and they need to extinguish them using the correct fire extinguisher.
They will pick up a physical (dummy) fire extinguisher, press a button on it, and point its nozzle in the correct direction:
If it is pointed in the direction of the fire and held for 'x' seconds, it should put the fire out (key feature).
I need to show "extinguishing foam / material" come out from the nozzle as viewed by the user through the headset (nice to have feature).
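For what it's worth, once the nozzle's pose is known in *some* frame, the aim-and-hold logic itself is simple geometry. Here is a minimal sketch of the idea in Python (the actual Unity code would be C#, where `Vector3.Angle` does the angle part for you). The 10° cone half-angle is an arbitrary placeholder, not a recommendation:

```python
import math

def is_pointed_at(nozzle_pos, nozzle_dir, fire_pos, cone_half_angle_deg=10.0):
    """True if the nozzle's forward vector falls within a cone aimed at the
    fire. All vectors are (x, y, z) tuples in the same coordinate frame
    (room frame or headset frame - whichever the tracking provides)."""
    to_fire = tuple(f - n for f, n in zip(fire_pos, nozzle_pos))
    mag = math.sqrt(sum(c * c for c in to_fire))
    dmag = math.sqrt(sum(c * c for c in nozzle_dir))
    if mag == 0 or dmag == 0:
        return False
    # Angle between the nozzle direction and the nozzle-to-fire vector.
    cos_angle = sum(a * b for a, b in zip(nozzle_dir, to_fire)) / (mag * dmag)
    angle = math.degrees(math.acos(max(-1.0, min(1.0, cos_angle))))
    return angle <= cone_half_angle_deg

class DwellTimer:
    """Accumulates time while the aim condition holds; any miss resets it.
    Implements the 'held for x seconds' rule."""
    def __init__(self, required_seconds):
        self.required = required_seconds
        self.elapsed = 0.0
    def update(self, aimed, dt):
        # Call once per frame with the aim result and the frame delta time.
        self.elapsed = self.elapsed + dt if aimed else 0.0
        return self.elapsed >= self.required
```

In Unity you would drive `update()` from a `MonoBehaviour.Update()` with `Time.deltaTime`, and trigger the extinguish effect when it returns true.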
My challenge is determining where the "nozzle" of the fire extinguisher is pointing, either with respect to the room or with respect to the headset. I can place beacons at fixed points in the room, but I'm not sure which hardware products would let me determine the position / orientation of the nozzle.
To my understanding, there are "controllers" available for smartphone headsets, but they communicate only button presses, not their own position or orientation with respect to the smartphone.
Any innovative / unconventional ideas on how to solve this? :)
I apologize in advance if some people think this post is off-topic; I saw similar "hardware linkage" discussions here, so I assumed it would be acceptable: https://forum.unity.com/threads/standard-way-to-communicate-with-hardware-arduino.32301/
and https://forum.unity.com/threads/integrating-new-hardware-into-unity-controllers-peripherals.391509/
*(if another low cost hardware/AR-software platform supports this, I would be open to change)