How to get the correct Stereo Projection matrices
I have tried the method:
Camera.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Left/Right);
However, the matrix it returns is different from the one actually used on the GPU. For comparison, I used a GPU readback to capture the projection matrix on a Vive Pro and compared it with the matrix the camera method gave me:
// Camera method, left eye:
0.77925 0.00000 -0.05615 0.00000
0.00000 0.70090 0.00480 0.00000
0.00000 0.00000 -1.00060 -0.60018
0.00000 0.00000 -1.00000 0.00000
// Camera method, right eye:
0.78300 0.00000 0.05547 0.00000
0.00000 0.70472 0.00192 0.00000
0.00000 0.00000 -1.00060 -0.60018
0.00000 0.00000 -1.00000 0.00000
// GPU Readback, left eye:
0.77925f, 0.00000f, -0.05615f, 0.00000f
0.00000f, -0.70090f, -0.00480f, 0.00000f
0.00000f, 0.00000f, 0.00030f, 0.30009f
0.00000f, 0.00000f, -1.00000f, 0.00000f
// GPU Readback, right eye:
0.78300f, 0.00000f, 0.05547f, 0.00000f
0.00000f, -0.70472f, -0.00192f, 0.00000f
0.00000f, 0.00000f, 0.00030f, 0.30009f
0.00000f, 0.00000f, -1.00000f, 0.00000f
These are clearly different, so I'm wondering which additional calculations Unity performs before sending the matrix to the GPU. If I knew that, I could implement those steps myself and be able to override the projection matrix for VR.
Answer by Marco_Stone · Nov 25, 2019 at 02:34 PM
I found the function GL.GetGPUProjectionMatrix, which might be where the conversion happens.
[EDIT] I feel really stupid right now, but the way I managed to reproduce the matrix was by passing true for the renderIntoTexture parameter, like this:
var proj = cam.GetStereoNonJitteredProjectionMatrix(Camera.StereoscopicEye.Left);
var gpuProj = GL.GetGPUProjectionMatrix(proj, true);
Hopefully this helps anyone else who stumbles across this thread.
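The conversion can also be checked numerically. On D3D-style platforms with reversed z, GL.GetGPUProjectionMatrix with renderIntoTexture set to true flips the y axis and remaps clip-space z. The following sketch (plain Python with NumPy, not Unity code, and only an approximation of what Unity does on this particular platform) applies that transformation to the left-eye matrix from the question and reproduces the readback matrix:

```python
import numpy as np

# Left-eye matrix returned by GetStereoNonJitteredProjectionMatrix
# (OpenGL clip-space convention, z in [-w, w]).
proj = np.array([
    [0.77925, 0.00000, -0.05615,  0.00000],
    [0.00000, 0.70090,  0.00480,  0.00000],
    [0.00000, 0.00000, -1.00060, -0.60018],
    [0.00000, 0.00000, -1.00000,  0.00000],
])

def gpu_projection(m):
    """Approximates GL.GetGPUProjectionMatrix(m, renderIntoTexture=True)
    on a D3D-style platform that uses reversed z."""
    out = m.copy()
    out[1] = -m[1]                     # flip y when rendering into a texture
    out[2] = -0.5 * m[2] + 0.5 * m[3]  # remap z: [-w, w] -> [w, 0] (reversed z)
    return out

print(np.round(gpu_projection(proj), 5))
# Matches the "GPU Readback, left eye" matrix from the question.
```

The same two steps (negate row 1, then combine rows 2 and 3) turn the right-eye camera matrix into its readback counterpart as well.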