First, we will introduce orthogonal matrices, which are those corresponding to linear transformations that preserve all angles and lengths; in that sense they define a rigid motion of space. For example, any rotation of $\mathbb{R}^2$ around the origin doesn't stretch any vector or change the angle between two vectors. So rotation matrices are examples of orthogonal matrices.
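This preservation of lengths and angles can be checked numerically. The sketch below (plain Python; the angle and vectors are illustrative choices, not taken from the text) applies a $2 \times 2$ rotation matrix and compares inner products before and after:

```python
import math

def rotation(theta):
    """2x2 rotation matrix for angle theta, as a list of rows."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(A, v):
    """Matrix-vector product A v."""
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def dot(v, w):
    """Standard inner product on R^n."""
    return sum(x * y for x, y in zip(v, w))

R = rotation(0.7)                 # an arbitrary illustrative angle
v, w = [3.0, 1.0], [-2.0, 5.0]    # arbitrary illustrative vectors
Rv, Rw = apply(R, v), apply(R, w)

# Squared lengths and inner products (hence angles) are unchanged:
print(abs(dot(Rv, Rv) - dot(v, v)) < 1e-12)   # True: |Rv| = |v|
print(abs(dot(Rv, Rw) - dot(v, w)) < 1e-12)   # True: angle preserved
```

Up to floating-point rounding, the rotation leaves every inner product, and therefore every length and angle, unchanged.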
Let $A$ be a square $n \times n$ matrix. Then the following conditions are equivalent.
(i) $A A^T = I_n$,
(ii) $A^T A = I_n$,
(iii) The rows of $A$ form an orthonormal basis of $\mathbb{R}^n$,
(iv) The columns of $A$ form an orthonormal basis of $\mathbb{R}^n$,
(v) The linear transformation defined by $A$ preserves the inner product; in other words, $\langle Av, Aw \rangle = \langle v, w \rangle$ for any $v, w \in \mathbb{R}^n$.
An orthogonal matrix is one which satisfies any of the conditions in Theorem 5.1. To check whether a given matrix is orthogonal, only one of the above conditions (and it does not matter which one) needs to be checked, since they are all equivalent.
For the given matrix, check each condition of Theorem 5.1.
[End of Exercise]
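As a sketch of how such a verification might look numerically — using an illustrative rotation matrix rather than the exercise's own — conditions (i)–(iv) of Theorem 5.1 can each be checked in plain Python:

```python
import math

def matmul(A, B):
    """Matrix product A B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def is_identity(M, tol=1e-12):
    """Check M = I_n up to floating-point tolerance."""
    return all(abs(M[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(len(M)) for j in range(len(M)))

def rows_orthonormal(A, tol=1e-12):
    """<row_i, row_j> should be 1 when i == j and 0 otherwise."""
    return all(abs(sum(x * y for x, y in zip(A[i], A[j]))
                   - (1.0 if i == j else 0.0)) < tol
               for i in range(len(A)) for j in range(len(A)))

c, s = math.cos(0.3), math.sin(0.3)
A = [[c, -s], [s, c]]            # illustrative rotation matrix

print(is_identity(matmul(A, transpose(A))))   # condition (i):  A A^T = I
print(is_identity(matmul(transpose(A), A)))   # condition (ii): A^T A = I
print(rows_orthonormal(A))                    # condition (iii): rows
print(rows_orthonormal(transpose(A)))         # condition (iv): columns
```

All four checks print `True`; as the theorem guarantees, confirming any single one would already suffice.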
To prove that several different statements are equivalent, there are many different proof strategies that are logically valid. For example, a strategy different from the one used below would be to prove (i) $\Rightarrow$ (ii) $\Rightarrow$ (iii) $\Rightarrow$ (iv) $\Rightarrow$ (v) $\Rightarrow$ (i). Whichever strategy is used, if one of the statements is assumed to be true, then all of the other statements must follow from it.
(i) $\Leftrightarrow$ (ii): This equivalence follows from the well-known fact that $AB = I$ implies $BA = I$ for square matrices over a field.
(i) $\Leftrightarrow$ (iii): Let $a_{ij}$ be the coefficients of the matrix $A$; then $A = (a_{ij})$. Using the formula for matrix multiplication (see Exercise 1.25), we obtain an expression for the $(i,j)$ entry of the matrix product:
$$(A A^T)_{ij} = \sum_{k=1}^{n} a_{ik} a_{jk}.$$
Since the $i$th row of $A$ is the vector $(a_{i1}, \dots, a_{in})$, the above sum is exactly the inner product of the $i$th and $j$th rows. Therefore the rows are all orthonormal (and hence a basis, being $n$ linearly independent vectors in $\mathbb{R}^n$) if and only if $(A A^T)_{ij} = 1$ when $i = j$ and $0$ otherwise; this is the same as $A A^T = I_n$.
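The agreement between the $(i,j)$ entry of $A A^T$ and the inner product of the $i$th and $j$th rows can be spot-checked in plain Python; the $3 \times 3$ matrix below is an arbitrary illustration (it need not be orthogonal for the identity to hold):

```python
def matmul(A, B):
    """Matrix product A B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1.0, 2.0, 0.0],
     [0.0, 1.0, 3.0],
     [2.0, 0.0, 1.0]]        # arbitrary example, not orthogonal
n = len(A)

prod = matmul(A, transpose(A))                       # (A A^T)_{ij}
row_dot = [[sum(x * y for x, y in zip(A[i], A[j]))   # <row_i, row_j>
            for j in range(n)] for i in range(n)]

print(prod == row_dot)   # True: entries agree, (A A^T)_{ij} = <row_i, row_j>
```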
(ii) $\Leftrightarrow$ (iv): The columns of $A$ are orthonormal if and only if the rows of $A^T$ are orthonormal, so we apply the same argument as above with $A$ replaced by $A^T$; note that $A^T (A^T)^T = A^T A$.
(ii) $\Leftrightarrow$ (v): By the definition of the standard inner product, for any $v, w \in \mathbb{R}^n$:
$$\langle Av, Aw \rangle = (Av)^T (Aw) = v^T A^T (Aw) = v^T (A^T A) w.$$
The second equality used that $(Av)^T = v^T A^T$, which is an elementary property of the transpose, seen in MATH105; the third equality used associativity of matrix multiplication. Now it is clear that if $A^T A = I_n$ then (v) is true. For the reverse implication, we use Exercise 5.3 to see that if $\langle Av, Aw \rangle = \langle v, w \rangle$ for all $v, w$, then $A^T A = I_n$, as required. ∎
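The chain of equalities above can also be sanity-checked numerically; the matrix and vectors below are arbitrary illustrative choices, and the matrix is deliberately not orthogonal, since the identity $\langle Av, Aw \rangle = v^T (A^T A) w$ holds for every square matrix:

```python
def apply(M, x):
    """Matrix-vector product M x."""
    return [sum(M[i][k] * x[k] for k in range(len(x))) for i in range(len(M))]

def dot(x, y):
    """Standard inner product on R^n."""
    return sum(a * b for a, b in zip(x, y))

A = [[1.0, 2.0], [0.0, 3.0]]     # arbitrary, not orthogonal
v, w = [1.0, 4.0], [3.0, -2.0]   # arbitrary vectors

# Entries of A^T A: (A^T A)_{ij} = sum_k a_{ki} a_{kj}
AtA = [[sum(A[k][i] * A[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

lhs = dot(apply(A, v), apply(A, w))   # <Av, Aw>
rhs = dot(v, apply(AtA, w))           # v^T (A^T A) w
print(abs(lhs - rhs) < 1e-12)         # True: the two sides agree
```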
[Hint: The columns of $A$ are $Ae_1, \dots, Ae_n$, where $e_i$ is a standard basis vector.]
[End of Exercise]
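Following the hint, a quick numerical sketch (plain Python, with an arbitrary illustrative matrix) confirms that $\langle Ae_i, Ae_j \rangle$ recovers exactly the $(i,j)$ entry of $A^T A$ — which is why preserving the inner product on the standard basis vectors forces $A^T A = I_n$:

```python
def matmul(A, B):
    """Matrix product A B."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def apply(A, v):
    """Matrix-vector product A v."""
    return [sum(A[i][k] * v[k] for k in range(len(v))) for i in range(len(A))]

def dot(v, w):
    return sum(x * y for x, y in zip(v, w))

A = [[0.0, 1.0, 2.0],
     [1.0, 0.0, 1.0],
     [3.0, 1.0, 0.0]]    # arbitrary example matrix
n = len(A)

def e(i):
    """i-th standard basis vector of R^n."""
    return [1.0 if k == i else 0.0 for k in range(n)]

# A e_i is the i-th column of A, so <A e_i, A e_j> = (A^T A)_{ij}.
gram = [[dot(apply(A, e(i)), apply(A, e(j))) for j in range(n)]
        for i in range(n)]
print(gram == matmul(transpose(A), A))   # True: entries agree
```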