I’m implementing a custom camera class, which of course involves calculating a view matrix to pass to my shaders.
I am using the following code to calculate this matrix:
viewMatrix = Matrix4x4.LookAt(
    camera.transform.position,
    camera.transform.position + camera.transform.forward,
    camera.transform.up
);
I’ve set up an instance of my custom camera and a regular Unity camera with identical transforms for testing: position = (-8, 5, -12), rotation = (0, 0, 0), scale = (1, 1, 1).
However, when I compare the two view matrices in the frame debugger, mine comes out slightly different from Unity’s.
The values at (3, 0), (3, 1) and (3, 2) have the wrong sign for some reason, probably because I’m using the LookAt function incorrectly. Does anyone know what I’m doing wrong?
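In case it helps, here is a stripped-down sketch of how I’m dumping both matrices for comparison (class and field names are placeholders; in the actual project the comparison happens in the frame debugger rather than the console):

using UnityEngine;

// Minimal repro: compute the LookAt-based matrix for the custom camera
// and log it next to Unity's own view matrix for a reference camera.
public class ViewMatrixCompare : MonoBehaviour
{
    public Camera unityCamera;      // the regular Unity camera
    public Transform customCamera;  // transform of the custom camera

    void Update()
    {
        // My current approach: LookAt from the camera position toward its forward direction.
        Matrix4x4 mine = Matrix4x4.LookAt(
            customCamera.position,
            customCamera.position + customCamera.forward,
            customCamera.up);

        // Unity's own view matrix for the reference camera.
        Matrix4x4 unitys = unityCamera.worldToCameraMatrix;

        Debug.Log("custom:\n" + mine + "\nunity:\n" + unitys);
    }
}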
I've figured out why the scale caused problems: Unity cameras ignore the Transform's scale entirely, and Unity's view matrices also negate the z axis (camera space follows the OpenGL convention, with the camera looking down negative z). Multiplying the inverted local-to-world matrix by a scale matrix built from lossyScale cancels the scale back out, and the negated z component handles the axis flip. The following code should produce a view matrix identical to Unity's, no matter the camera's Transform settings:

Vector3 antiScale = new Vector3(
    camera.transform.lossyScale.x,
    camera.transform.lossyScale.y,
    -camera.transform.lossyScale.z
);
viewMatrix = Matrix4x4.TRS(Vector3.zero, Quaternion.identity, antiScale)
    * camera.transform.localToWorldMatrix.inverse;
– Betteanne
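A minimal way to wire this up as a component, assuming a script that feeds the matrix to a material (the CustomCamera class name and the _CustomViewMatrix shader property are illustrative, not from the posts above):

using UnityEngine;

// Sketch of the scale-cancelling view matrix in a self-contained component.
// CustomCamera and _CustomViewMatrix are placeholder names.
public class CustomCamera : MonoBehaviour
{
    public Material targetMaterial;  // material whose shader consumes the view matrix

    void LateUpdate()
    {
        // Cancel the Transform's (lossy) scale, and negate z so the result
        // matches Unity's OpenGL-style camera space (forward = -z).
        Vector3 antiScale = new Vector3(
            transform.lossyScale.x,
            transform.lossyScale.y,
            -transform.lossyScale.z);

        Matrix4x4 viewMatrix =
            Matrix4x4.TRS(Vector3.zero, Quaternion.identity, antiScale) *
            transform.localToWorldMatrix.inverse;

        if (targetMaterial != null)
        {
            targetMaterial.SetMatrix("_CustomViewMatrix", viewMatrix);
        }
    }
}

For a sanity check, the result can be compared against worldToCameraMatrix on a regular Unity Camera that shares the same position and rotation.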