Can I see the calculation of unity_MatrixVP?
I would like to know what unity_MatrixVP is.
I searched for it in the built-in shader files (version 2017.1.2f1) and found the following.

UnityShaderVariables.cginc (line 20):

#if defined(USING_STEREO_MATRICES)
#define unity_MatrixVP unity_StereoMatrixVP[unity_StereoEyeIndex]

UnityShaderVariables.cginc (line 178):

#if defined(USING_STEREO_MATRICES)
CBUFFER_START(UnityStereoGlobals)
float4x4 unity_StereoMatrixVP[2];

and UnityShaderVariables.cginc (line 226):

#if !defined(USING_STEREO_MATRICES)
float4x4 unity_MatrixVP;
But I could not find where Unity calculates unity_StereoMatrixVP and unity_MatrixVP.
Is it possible to see not just the declaration but the actual definition of unity_MatrixVP?
Answer by Bunny83 · Oct 18, 2017 at 11:19 AM
The VP matrix is the combination of the ViewMatrix (also known as the camera matrix) and the ProjectionMatrix of that camera. So this matrix transforms a point from world space directly into viewport space (-1 to 1, after the perspective divide by w).
The View or camera matrix is basically just the WorldToLocal matrix of the GameObject the camera is attached to. However, the camera matrix also performs the conversion from left-handed to right-handed space, so its scale is inverted on the z axis. You can access / set the "V" matrix through Camera.worldToCameraMatrix. For stereo rendering (VR / AR stuff) you actually have two of these, one for each eye.
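A minimal sketch of that relationship (assuming a Camera component on the same GameObject; the class name is made up for illustration):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class ViewMatrixDemo : MonoBehaviour
{
    void Start()
    {
        Camera cam = GetComponent<Camera>();

        // Unity's view matrix: world space -> camera space.
        Matrix4x4 view = cam.worldToCameraMatrix;

        // The same matrix rebuilt by hand: the camera's world-to-local
        // matrix with the z axis flipped (left-handed -> right-handed).
        Matrix4x4 rebuilt = Matrix4x4.Scale(new Vector3(1f, 1f, -1f))
                            * cam.transform.worldToLocalMatrix;

        // Both logs should print the same values.
        Debug.Log(view);
        Debug.Log(rebuilt);
    }
}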
The projection matrix converts positions from camera space into viewport space (or clip space). The final VP matrix is calculated by simply combining the two matrices. Keep in mind that the order in which you combine matrices matters:
Matrix4x4 VP = ProjectionMatrix * ViewMatrix;
The matrices are taken from the camera(s) involved. Unity sets the corresponding shader variables automatically before rendering with each camera. That's actually the main point of a "Camera": a Camera just represents a set of rendering parameters.
An important note: because Unity supports so many platforms, this answer is only mostly correct. Before you combine the matrices you have to run the projection matrix through GL.GetGPUProjectionMatrix() to convert it from the default OpenGL convention Unity uses in the scene to the proper convention for whatever graphics API is being used under the hood. If you forget that step, you can get things like weird upside-down or inside-out rendering on certain platforms.
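Putting it together, here is a rough sketch of how you could reproduce the value that ends up in unity_MatrixVP for a given camera (Unity's internal code for this is native and not public, so this is an approximation; the class and the _ExampleMatrixVP property name are made up for illustration):

using UnityEngine;

[RequireComponent(typeof(Camera))]
public class MatrixVPDemo : MonoBehaviour
{
    // Called just before this camera renders (built-in render pipeline).
    void OnPreRender()
    {
        Camera cam = GetComponent<Camera>();

        // V: world space -> camera space.
        Matrix4x4 v = cam.worldToCameraMatrix;

        // P: convert the OpenGL-convention projection matrix to the
        // convention of the graphics API actually in use.
        // (false = rendering to the screen, not into a render texture)
        Matrix4x4 p = GL.GetGPUProjectionMatrix(cam.projectionMatrix, false);

        // Order matters: the view transform is applied first.
        Matrix4x4 vp = p * v;

        // _ExampleMatrixVP is a hypothetical property name; in a real
        // shader you would simply use the built-in unity_MatrixVP.
        Shader.SetGlobalMatrix("_ExampleMatrixVP", vp);
    }
}

For the stereo case the same combination is presumably done once per eye, with that eye's view and projection matrices, which is what fills the unity_StereoMatrixVP[2] array from the question.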