[Solved] Quaternion from IMU sensor to GameObject Orientation problem
Hello everyone,
Update: Solved in the meantime.
The rotation that has to be applied is simple:
transform.rotation = Quaternion.Inverse(CalculatedQuaternionFromAlgorithm);
Quote from own comment below:
So basically, all of the available algorithms don't output the rotation of the sensor with respect to the earth, but vice versa: the orientation of the earth frame with respect to the sensor frame. This becomes a bit clearer when you use the magnetic field as well, and not just the acceleration for gravity, and see what the outcome is. Simple as that: the calculated quaternion has to be inverted in order to apply the actual sensor orientation to the GameObject.
Original question:
I've just spent over 4 1/2 hours messing with applying an orientation I get from an inertial measurement unit (IMU) sensor to a simple cube in Unity.
Straight away: I've actually solved it with very simple code. But I have no idea why, because this code should do the opposite...
So here's what I do:
I calculate a quaternion from my sensor data and want to see a cube in Unity move the same way as my sensor. My problem was that when I rotated around a sensor axis, say X, the cube got rotated around the world's X axis, not its local one. I read nearly everything Google and Unity Answers provided as information, and I understand a bit about quaternions. However, the one and only solution that worked is as follows:
void Update ()
{
    // Delta rotation between the latest sensor sample and the previous one
    diffQuat = Quaternion.Inverse(bodyQuat) * lastQuat;
    // Apply only the delta, otherwise the cube keeps spinning
    transform.Rotate(diffQuat.eulerAngles, Space.World);
    lastQuat = bodyQuat;
}
I calculate a difference quaternion (diffQuat) from the last sensor quaternion and the latest sensor quaternion (bodyQuat); bodyQuat gets updated between the frame updates. This difference calculation is necessary in order not to rotate infinitely fast around random axes, but to apply only the delta. Doing it this way, I can use the Rotate() function, where I apply the Euler angles of my difference quaternion. Oddly, I have to set Space.World here. I would have expected Space.Self (!) in order to rotate around the cube's axes, but somehow this leads to the opposite.
Even stranger: initially, I thought that
transform.Rotate(diffQuat.eulerAngles, Space.Self)
would be the same (in my case) as
transform.rotation = transform.rotation * diffQuat;
However, using transform.rotation, I also only get rotation around the world's axes when I move the sensor around.
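For reference, here is the multiplication-order convention as I understand it (a minimal sketch, not my actual sensor code), which is exactly why the behavior above confuses me:
Vector3 euler = diffQuat.eulerAngles;
// Rotating about the object's LOCAL axes: right-multiply.
// Should be equivalent to transform.Rotate(euler, Space.Self).
transform.rotation = transform.rotation * Quaternion.Euler(euler);
// Rotating about the WORLD axes: left-multiply.
// Should be equivalent to transform.Rotate(euler, Space.World).
transform.rotation = Quaternion.Euler(euler) * transform.rotation;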
So, two questions:
Why does the transform.rotation method not work?
Why do I have to use Space.World in order to get rotations around the local axes?
Can anybody tell me if that code is really the way to go, or if there's something I haven't realized yet about how to "attach" an IMU sensor to a simple cube in Unity? By the way, this seems to be one of the most unresolved questions out there...
Answer by emthele · Jun 14, 2016 at 09:10 AM
Update:
I've now spent two more days on this issue and it's solved for now. To be honest, as I'm quite new to Unity, I had to gain a lot more experience with and understanding of rotations.
My question above has not been answered yet (and I couldn't figure it out on my own), but I finally found THE solution that connects the sensor movement directly to the cube:
transform.rotation = Quaternion.Inverse(newSnsQuat);
I don't know why I have to take the inverse, but I took this idea from a different piece of work, where these sensors have been used with an OpenGL framework. That's the point where I admit that things are still more complicated or irritating than I thought. At least, it's so stupidly simple that it takes no computational effort.
I would appreciate it a lot if someone with knowledge could clarify this for me:
Assigning -not- the inverse in the above code would rotate the cube around the world's axes. Using the inverse rotates it around its own axes, perfectly fine, without the need for something like transform.rotation = transform.rotation * newSnsQuat (which leads to continuous rotations around random axes anyway). So, what is the inverse of a quaternion in geometric terms?
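(Note for later readers: the answer I eventually found is in the update at the top. Geometrically, the inverse is simply the opposite rotation: if q is the orientation of the earth frame with respect to the sensor frame, then Quaternion.Inverse(q) is the orientation of the sensor frame with respect to the earth frame. For a unit quaternion the inverse equals the conjugate, i.e. the vector part negated:)
Quaternion q = transform.rotation;
// For unit quaternions, inverse == conjugate (negated vector part).
Quaternion inv  = Quaternion.Inverse(q);                  // opposite rotation
Quaternion conj = new Quaternion(-q.x, -q.y, -q.z, q.w);  // same result for unit q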
The only other thing I had to do was to set the rotation of my object as the initial condition for my AHRS algorithm (the one that translates sensor data into a quaternion). And that's it for now.
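A sketch of what that seeding could look like (SetQuaternion is a hypothetical method; the stock x-io C# classes don't expose a setter as far as I know, so you would have to add one yourself):
// Seed the filter with the cube's starting pose. Note the inverse:
// the filter state is the earth-to-sensor orientation, not sensor-to-earth.
Quaternion q0 = Quaternion.Inverse(transform.rotation);
ahrs.SetQuaternion(q0.w, q0.x, q0.y, q0.z); // hypothetical setter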
I'll keep this post updated for people who might find it important or necessary at some point and who had to struggle with the same issue(s).
I'm trying to do the same thing and really hope to discuss more with you.
Answer by blaineL · Feb 17, 2017 at 04:13 PM
Hi @emthele - I know it's been a while since you posted this, but did you ever figure out the reason for the inverse?
Check this http://answers.unity3d.com/answers/1163849/view.html
I'm also looking at IMUs and hoping to send quaternion data over Bluetooth LE. I'm using two Arduino 101s (one on each wrist) for movement.
Did you use/make your own AHRS algorithm or one from the company that makes your IMU and then convert to quaternion?
I'm using some of the data that comes from FreeIMU or from the Intel Curie chip and followed some info from Phil's blog (http://philstech.blogspot.com)
In my case, calibration of the initial position is important because the IMUs are not positioned flat on a surface and also face in opposite directions.
Would love to hear how your projects are going for mo-cap too.
Cheers...
Blaine
Hi @blaineL,
yeah, in the meantime we've made quite a lot of progress on this matter.
So basically, all of the available algorithms don't output the rotation of the sensor with respect to the earth, but vice versa: the orientation of the earth frame with respect to the sensor frame. This becomes a bit clearer when you use the magnetic field as well, and not just the acceleration for gravity, and see what the outcome is. Simple as that: the calculated quaternion has to be inverted in order to apply the actual sensor orientation to the GameObject.
We started out using the open-source Madgwick and Mahony algorithms (http://x-io.co.uk/open-source-imu-and-ahrs-algorithms/), as they are already written in C#. In the meantime, creator Sebastian Madgwick himself has been using a more advanced algorithm which he also wrote himself, simply called "Fusion" (https://github.com/xioTechnologies/Fusion). That one is written in C++ for use on a microcontroller, but you can simply port it to C#. We've done that and it actually works better than the original Madgwick or Mahony algorithm (at least in first tests; more real-world tests to be done). I can't give you the source as it's our company's development, but it's really straightforward and simple to port from the C++ source code provided.
The initial orientations of our sensors are obtained by giving the algorithms time to settle with large gain values (which gain depends on the filter; these are the variables you can actually set from the outside). If you're working with a magnetic field, the sensors get fully aligned properly. If not, take care to set the virtual sensors' initial orientations to approximately the right start orientation, so you don't get misalignment with respect to yaw.
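A sketch of that settling phase with the x-io C# MadgwickAHRS class (Beta is its gain, as far as I remember; the numbers are illustrative, and ReadSensor() is a placeholder for your own I/O):
// Run the filter with a large gain first so it converges fast, then drop it.
var ahrs = new MadgwickAHRS(1f / 100f, 10f); // sample period, large initial beta
for (int i = 0; i < 1000; i++)               // ~10 s of samples at 100 Hz
{
    float[] s = ReadSensor();                // gx, gy, gz, ax, ay, az, mx, my, mz
    ahrs.Update(s[0], s[1], s[2], s[3], s[4], s[5], s[6], s[7], s[8]);
}
ahrs.Beta = 0.1f;                            // normal operating gain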
That's it. If you have further questions, feel free to ask (when related to the topic).
Hi @emthele,
Sorry for getting back to you so late. Fell sick for a few days 'n took some time off.
I checked the Fusion link, but it looks like it's gone. I checked some other sources too, and no luck either. I guess he pulled it :(
I'm actually making some small IMU sensors that will send data through BLE as well. Not for Mo Cap or gaming, but for research and rehabilitation for stroke victims and the correlation between autism and mobility (x-io has a few devices, but I need to make them much smaller, with no add-on boards, no logger for now, and only with BLE). There's no magnetometer or GPS, and I'm just using the built-in calibration function of the Intel Curie chip found on the Arduino 101 - though I'm still thinking of another way to calibrate them while on the person (usually on the legs).
I'm sending IMU data 'this way', but I was thinking I should add a control/setting feature to allow users to receive the data as quaternions (using the MadgwickAHRS files and the MadgwickQuaternionUpdate function here to do the conversions).
I'm wondering if the format is going to be the same as Unity's w, x, y, z values. Some of the researchers want to use Unity to create a graphical rehab platform.
Appreciate your thoughts.
Hi @blaineL,
I'm sorry that the Fusion link is down, but that might be due to changes in the codebase, as Sebastian Madgwick wanted to make it more publicly available as well, but in a single code base. You might be lucky and find it on the x-io page at some point, or just write him an email; he responds quite quickly and helpfully.
For Arduino: I have never worked with them yet, as I'm also a hardware engineer and develop dedicated circuits for our needs. However, they seem quite useful, as long as you don't get stuck in the given massive framework (which usually just bloats things up, but is necessary for the large number of users).
Anyway, using the C++ versions of the algorithms looks like a good idea. Take care with the quaternion conversions, because you do have to switch values!
Madgwick's implementations use the NWU convention, meaning x = north, y = west, z = up. In Unity, y = up! So at the beginning of the algorithm code (here using Madgwick's notation of q1..q4), you have to switch the y and z values by doing this:
q1 = OldCalculatedQuaternion.w;
q2 = OldCalculatedQuaternion.x;
q3 = OldCalculatedQuaternion.z; // filter's y slot gets Unity's z
q4 = OldCalculatedQuaternion.y; // filter's z slot gets Unity's y
And at the end:
NewQuaternion.w = q1;
NewQuaternion.x = q2;
NewQuaternion.z = q3; // swap back: filter's y becomes Unity's z
NewQuaternion.y = q4; // swap back: filter's z becomes Unity's y
With our sensors, we ALSO had to switch the y and z values of the incoming raw data before passing them to the algorithm (the x, y, z of the raw data mean something totally different from the x, y, z in the quaternion!). Only this way did the algorithm work as expected. It took me a while to figure this out, so I hope it helps.
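As a sketch of that raw-data swap (the variable names are mine; whether you need it depends on your sensor's own axis convention):
// Swap y and z of each raw sample before feeding the filter.
ahrs.Update(gyro.x,  gyro.z,  gyro.y,
            accel.x, accel.z, accel.y,
            mag.x,   mag.z,   mag.y);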
What do you mean by the Intel Curie calibration mechanism; what does it do? And check whether using no magnetometer is sufficient for your use case!
Answer by VedantUnity · Dec 06, 2017 at 08:45 PM
Hi, I am quite a beginner as well. I wanted to ask you: which brand of IMU did you use, and how did you make Unity interface with it?