Two-hand grabbing of objects in Virtual Reality
Hi,
I am currently working on a room-scale VR game where the player uses (Oculus or HTC Vive) controllers in both hands to grab differently shaped blocks. A block can only be grabbed when both hands are at its sides and the buttons on both controllers are pressed. The intention is to then base the orientation of the grabbed block on the orientations of both the left and right controller: the block needs to move and rotate according to how the left and right controller move and rotate.
To achieve this, I have made a (C#) script that during runtime makes sure that a sphere (let’s call it the MiddlePoint Sphere) (1) keeps right in the middle of the two controllers position-wise and (2) bases its rotation (orientation) on both the left and right controller. While the former works well, the latter only works partially: the MiddlePoint Sphere rotates correctly until the hands reach a certain angle, whereupon it completely flips its axes.
Since my knowledge on Quaternions and Slerping is extremely limited, I based a part of my code on recommendations from other coders. This unfortunately leaves me scratching my head when it comes to explaining the incorrect flipping.
My code, in order of execution (for every frame):
public void Calculate_MiddlePoint_GlobalPosition () {
// (OS._Preset is a different script containing the Left and Right Controller Transforms)
// - Take the midpoint of the Left and Right controller positions -
Vector3 CenterPosition = (OS._Preset.Left_Hand_Transform.position + OS._Preset.Right_Hand_Transform.position) / 2f;
// - Change position of MiddlePoint Sphere -
ThisObject_Transform.position = CenterPosition;
}
public void Calculate_MiddlePoint_GlobalRotation () {
// - Create direction of the MiddlePoint Sphere by taking the <Right Controller position> and subtracting the (normalized) <Left Controller position> -
// - Is this what is causing the incorrect flip? Should it be <Left Controller minus Right Controller> when a certain axis is reached? -
Vector3 Direction = (OS._Preset.Right_Hand_Transform.position - OS._Preset.Left_Hand_Transform.position).normalized;
// - My lack of knowledge on Quaternions prevents me from describing what happens here -
Quaternion _LookRotation = Quaternion.LookRotation (Direction);
// - Ditto - not yet knowledgeable enough on Slerping either -
ThisObject_Transform.rotation = Quaternion.Slerp (ThisObject_Transform.rotation, _LookRotation, 0.5f);
}
I’ve recorded three separate videos where you can see (1) how the MiddlePoint Sphere positions and rotates (correctly and incorrectly) based on the Left and Right controller, (2) what happens when you grab a block (including the incorrect flipping) and (3) how the MiddlePoint Sphere rotates when you specifically rotate your hands forward and backwards.
Video 1: https://tinyurl.com/yb454l4g
Video 2: https://tinyurl.com/y9lujulv
Video 3: https://tinyurl.com/y77qt6bb
Does anyone know if my current code is along the right lines and what is necessary to make it work entirely correctly? Or, if I am doing the hand orientation wrong: how should I be doing it? Any help is greatly appreciated - thanks in advance!
TL;DR:
Trying to make a VR application where the user uses both controllers to grab objects. Orientation of the grabbed object is not working properly as it flips its axes when reaching a certain angle. Would appreciate any help on it!
Answer by danielrabascogarcia · Sep 18, 2019 at 08:06 AM
@Nesse_M What about specifying the upward direction in LookRotation with the object transform's up vector? Something like:
Quaternion _LookRotation = Quaternion.LookRotation (Direction, ThisObject_Transform.up);
Furthermore, I don't think the Quaternion.Slerp call is necessary. You can just assign _LookRotation directly.
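Applied to the method from the question, the change is essentially one line. A sketch (untested, assuming the same field names as the original script):

```csharp
public void Calculate_MiddlePoint_GlobalRotation () {
    Vector3 Direction = (OS._Preset.Right_Hand_Transform.position - OS._Preset.Left_Hand_Transform.position).normalized;
    // Passing the sphere's current up vector as the second argument keeps the roll
    // continuous from frame to frame, instead of letting LookRotation snap the up
    // axis back to world up (which is what causes the sudden flip).
    ThisObject_Transform.rotation = Quaternion.LookRotation (Direction, ThisObject_Transform.up);
}
```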
Answer by dyaroslavski · Jan 09, 2020 at 08:21 PM
I got this working correctly, and the trick is that you want to keep applying the most recent rotation delta from each frame to your object's current transformation - NOT to recompute it from your object's initial transformation at the time of "capture".
The fact that most two-handed VR apps do this can be gleaned by testing rotations in a given order. Try it in your favorite VR app: start with two hands at the same height, some distance apart horizontally; rotate them 90 degrees around Z (as if you were turning a wheel), then 90 degrees around X (as if you were wrapping something around your hands). The resulting model orientation will be different from what you would get by simply rotating the model 90 degrees about Y, even though the final positions and relative orientation of your hands are the same. This shows that the order of transformations matters, which implies that you cannot create a single algorithm that maps directly from the initial positions and orientations to the current position and orientation.
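A minimal Unity snippet illustrating the order-dependence point (hypothetical angles; runnable from any MonoBehaviour):

```csharp
// Composing the same two 90-degree rotations in different orders produces
// two different orientations, because quaternion multiplication does not commute.
Quaternion zThenX = Quaternion.Euler(90f, 0f, 0f) * Quaternion.Euler(0f, 0f, 90f); // Z first, then X
Quaternion xThenZ = Quaternion.Euler(0f, 0f, 90f) * Quaternion.Euler(90f, 0f, 0f); // X first, then Z
Debug.Log(zThenX.eulerAngles); // a different orientation than the line below
Debug.Log(xThenZ.eulerAngles);
```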
I've found the solution requires applying three transformations:
First: multiply in the rotation delta between the previous difference vector of the two hands and the new difference vector (you can get this delta using Quaternion.FromToRotation). This will get you most of the way there.
Then, multiply your new orientation by two additional transformations representing the rotation delta of each hand's orientation. I've found that each of these should be scaled by half to get a good result (you can Slerp from the identity rotation by 0.5f). This gives you a sort of averaging of the two hands' influence on the overall rotation.
Lastly, because the two hands' orientation changes can rotate your model in a way that could "de-attach" it from your hands, simply use Quaternion.LookRotation with the resulting up vector of all these transforms as the up parameter to get the final rotation (this essentially forces the rotation component applied from each hand to only influence the roll about the axis between the two hands).
This kind of algorithm gives you a realistic-feeling rotation (with a realistically changing up direction) and won't ever "jump" to a wildly different orientation, because you're constantly applying small rotations over time.
That said, you'll find that most two-handed free-rotation algorithms exhibit behavior you may not find suitable for some movements (even in your favorite VR app). The ideal solution will probably require additional thought about how you finally place your objects (snapping?) and whether you want to limit rotation to a specific axis: it is possible - and easier - to create a consistent, completely order-independent algorithm if you lock rotation to a single axis, say Y.
Good luck!
Would you mind expanding on your solution with some pseudo code? I'm having trouble following your explanation.
Yes, I would also be very interested in some code examples for this solution if you are saying it is working correctly for you! I am also having trouble following the instructions in text form only.
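For those asking: a rough Unity C# sketch of the three steps described in the answer above (untested; the component name, hand fields, and the grab callback are assumptions for illustration, not from the original post):

```csharp
using UnityEngine;

public class TwoHandGrab : MonoBehaviour
{
    public Transform leftHand;
    public Transform rightHand;

    private Vector3 previousDifference;
    private Quaternion previousLeftRotation;
    private Quaternion previousRightRotation;

    // Hypothetical callback: call this once when both grab buttons go down.
    public void OnGrabStart()
    {
        previousDifference = rightHand.position - leftHand.position;
        previousLeftRotation = leftHand.rotation;
        previousRightRotation = rightHand.rotation;
    }

    void Update()
    {
        // Step 1: delta between the previous and current hand-to-hand vectors,
        // applied on top of the object's CURRENT rotation (not its captured one).
        Vector3 difference = rightHand.position - leftHand.position;
        Quaternion swingDelta = Quaternion.FromToRotation(previousDifference, difference);
        Quaternion rotation = swingDelta * transform.rotation;

        // Step 2: each hand's own rotation delta since last frame, scaled by half
        // (Slerp from identity) so the two hands' influences average out.
        Quaternion leftDelta = leftHand.rotation * Quaternion.Inverse(previousLeftRotation);
        Quaternion rightDelta = rightHand.rotation * Quaternion.Inverse(previousRightRotation);
        rotation = Quaternion.Slerp(Quaternion.identity, leftDelta, 0.5f) * rotation;
        rotation = Quaternion.Slerp(Quaternion.identity, rightDelta, 0.5f) * rotation;

        // Step 3: re-align the forward axis with the hand-to-hand direction, keeping
        // the accumulated up vector, so hand twists only roll about that axis.
        rotation = Quaternion.LookRotation(difference.normalized, rotation * Vector3.up);

        transform.rotation = rotation;
        transform.position = (leftHand.position + rightHand.position) / 2f;

        // Store this frame's values for the next frame's deltas.
        previousDifference = difference;
        previousLeftRotation = leftHand.rotation;
        previousRightRotation = rightHand.rotation;
    }
}
```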