How to disable position and rotation measurement from VR headset gyroscope?
I'm trying to set up a player that combines a VR headset with a mocap system. In my case the camera should be driven by the head segment of the mocap system, not by the headset. The problem is that I can read the headset measurements in real time, but I have no way to modify or disable them. As a result the camera is rotated twice: once by the headset and once by the GameObject representing the head segment of the mocap system. So when I turn my head 45 degrees to the right, the game shows a rotation of 90 degrees. I don't want to disable the measurements from the mocap head segment, because it would look very silly in third person if the character's head didn't move at all.
For now I deal with this by reading the headset measurements, applying the inverse transform to the camera's parent object, and recentering the VR headset every frame. This would be good enough if it weren't for the ugly screen jitter when I move my head.
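A simplified sketch of that counter-rotation idea, using the Unity 5.3 UnityEngine.VR API (mocapHeadRotation is just a placeholder for whatever rotation the mocap SDK reports for the head segment):

```csharp
using UnityEngine;
using UnityEngine.VR; // Unity 5.3-era VR API (renamed to UnityEngine.XR in later versions)

// Attach to the camera's parent object. Each frame the parent is rotated by
// the inverse of the headset rotation, so the net camera rotation ends up
// being the mocap rotation alone instead of mocap * headset.
public class CancelHeadsetRotation : MonoBehaviour
{
    // Placeholder: feed this from the head segment of the mocap system.
    public Quaternion mocapHeadRotation = Quaternion.identity;

    void LateUpdate()
    {
        // Rotation Unity applies to the camera from the headset tracker.
        Quaternion headsetRotation = InputTracking.GetLocalRotation(VRNode.Head);

        // parent * headset == mocap  =>  parent = mocap * inverse(headset)
        transform.localRotation = mocapHeadRotation * Quaternion.Inverse(headsetRotation);
    }
}
```

Position could be handled the same way with InputTracking.GetLocalPosition(VRNode.Head). The jitter presumably remains because the headset pose is re-sampled later in the frame than LateUpdate, so the counter-rotation always lags slightly behind the pose actually used for rendering.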
If I could somehow disable the headset's gyroscope and accelerometer measurements, the jitter would vanish. So the question is: how do I do that?
PS: I'm using Unity 5.3.x and testing on an Oculus DK2, but I'm considering switching to an HTC Vive.
Answer by Darth-Zoddo · Aug 14, 2017 at 12:15 PM
I am doing exactly the same thing, but with the Oculus CV1. Have you found a way to disable the rotation of the VR device?
I'm currently able to do this by adding a script to the OVRCameraRig, but on my CV1 I get a "Cinema"-type effect: one big screen renders the camera, and everything around it is completely black...
Have you got it working?
Answer by tobi_s · Sep 10, 2017 at 07:29 AM
I'm also trying to do this. So far I've been using Unity 5.3, where I've been able to cut perceived latency by combining an IMU with a mocap system on an HTC Vive, estimating the headset pose as late as Camera.onPreCull with some predictive modelling. Given Unity's imaging pipeline, I basically need to bridge a latency of slightly more than two frames, which in my experience is near the upper limit for non-shaky, responsive VR.
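To give a rough idea, a minimal version of such an onPreCull hook could look like this (PredictHeadPose is only a placeholder for the predictive model, not an actual implementation):

```csharp
using UnityEngine;

// Applies an externally tracked (mocap) head pose as late as possible in the
// frame, via Camera.onPreCull, to reduce perceived latency.
public class LateMocapPose : MonoBehaviour
{
    public Transform cameraParent; // parent transform of the VR camera

    void OnEnable()  { Camera.onPreCull += ApplyLatePose; }
    void OnDisable() { Camera.onPreCull -= ApplyLatePose; }

    void ApplyLatePose(Camera cam)
    {
        // Predict where the head will be when this frame reaches the display.
        Vector3 predictedPosition;
        Quaternion predictedRotation;
        PredictHeadPose(out predictedPosition, out predictedRotation);

        cameraParent.position = predictedPosition;
        cameraParent.rotation = predictedRotation;
    }

    // Placeholder predictive model: in practice, extrapolate the latest mocap
    // sample forward by the measured pipeline latency (about two frames).
    void PredictHeadPose(out Vector3 position, out Quaternion rotation)
    {
        position = Vector3.zero;
        rotation = Quaternion.identity;
    }
}
```

Note that Camera.onPreCull fires once per camera, so with separate eye cameras you would want to filter on cam or make the update idempotent.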
Unity 5.4 and later change the VR handling and move the pose prediction out of the user's reach, which adds at least one more frame of latency. Remember that the Vive IMU shuts off if you don't use the Lighthouses, so I cannot rely on Unity's built-in code for that final step.
I would love to learn that I'm mistaken and that there is a way to modify the camera's transform as far down the pipeline as was possible in Unity 5.3. If not, I would be really happy if the Unity developers reconsidered the decision not to make this possible. I don't really fancy writing a SteamVR driver for my positioning system (which would be the last resort).
Just FTR: my code ported just fine to Unity 5.6 and 2017.1, so what I had read before seems to have been mistaken. I'm not completely sure yet what the best latency setting for my code is, but it already works very well.