5 Spectral decomposition


5.A Orthogonal matrices

First, we will introduce orthogonal matrices, which are those corresponding to linear transformations that preserve all angles and lengths; in that sense they define a rigid motion in space. For example, any rotation of $\mathbb{R}^n$ around the origin neither stretches any vector nor changes the angle between two vectors, so rotation matrices are examples of orthogonal matrices.
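This preservation of lengths and angles can be checked numerically. The following Python sketch (illustrative only; the angle and vectors are arbitrary choices) applies a $2\times 2$ rotation to two vectors and verifies that norms and inner products are unchanged.

```python
import math

def rotate(theta, v):
    """Apply the 2x2 rotation matrix R_theta to a vector v = (x, y)."""
    x, y = v
    return (math.cos(theta) * x - math.sin(theta) * y,
            math.sin(theta) * x + math.cos(theta) * y)

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

u, v = (1.0, 2.0), (-3.0, 0.5)
theta = 0.7  # an arbitrary angle

Ru, Rv = rotate(theta, u), rotate(theta, v)

# Lengths (squared norms) and inner products, hence angles, are preserved:
assert math.isclose(dot(Ru, Ru), dot(u, u))
assert math.isclose(dot(Ru, Rv), dot(u, v))
```

Since both the norm and the inner product are preserved, the angle $\cos^{-1}\big(\langle u,v\rangle / (\|u\|\,\|v\|)\big)$ between the vectors is preserved too.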

Theorem 5.1.

Let $A \in M_n(\mathbb{R})$ be a square matrix. Then the following conditions are equivalent.

  i.

    $AA^T = I_n$,

  ii.

    $A^TA = I_n$,

  iii.

    the rows of $A$ form an orthonormal basis of $\mathbb{R}^n$,

  iv.

    the columns of $A$ form an orthonormal basis of $\mathbb{R}^n$,

  v.

    the linear transformation defined by $x \mapsto Ax$ preserves the inner product; in other words, $\langle Ax, Ay\rangle = \langle x, y\rangle$ for any $x, y \in \mathbb{R}^n$.

An orthogonal matrix is one which satisfies any of the conditions in Theorem 5.1. To check whether a given matrix is orthogonal, it suffices to verify any single one of the above conditions, since they are all equivalent.

Exercise 5.2:

For $R_\theta := \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$, check each condition of Theorem 5.1.

[End of Exercise]
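Alongside the algebraic verification asked for in the exercise, condition (i) can be sanity-checked numerically. The sketch below (plain Python, with an arbitrary angle) builds $R_\theta$ and confirms that $R_\theta R_\theta^T$ is the identity matrix up to floating-point error.

```python
import math

theta = 1.2  # an arbitrary angle; any value should work
R = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def matmul(A, B):
    """Multiply two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][r] * B[r][j] for r in range(n)) for j in range(n)]
            for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

# Condition (i): R R^T should be the identity matrix I_2.
RRt = matmul(R, transpose(R))
I2 = [[1.0, 0.0], [0.0, 1.0]]
assert all(math.isclose(RRt[i][j], I2[i][j], abs_tol=1e-12)
           for i in range(2) for j in range(2))
```

The diagonal entries reduce to $\cos^2\theta + \sin^2\theta = 1$ and the off-diagonal entries to $\cos\theta\sin\theta - \sin\theta\cos\theta = 0$, which is exactly the algebra the exercise asks for.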

Proof of Theorem 5.1.

To prove that several different statements are equivalent, there are many different proof strategies that are logically valid. For example, a strategy different from the one used below would be to prove (i) $\Rightarrow$ (ii) $\Rightarrow$ (iii) $\Rightarrow$ (iv) $\Rightarrow$ (v) $\Rightarrow$ (i). Whichever strategy is used, if one of the statements is assumed to be true, then all of the other statements must follow from it.

(i) $\Leftrightarrow$ (ii): This equivalence follows from the well-known fact that $AB = I_n$ implies $BA = I_n$ for square matrices over a field.

(i) $\Leftrightarrow$ (iii): Let $[A]_{ij} = a_{ij}$ be the coefficients of the matrix $A$; then $[A^T]_{ij} = a_{ji}$. Using the formula for matrix multiplication (see Exercise 1.25), we obtain an expression for the $(i,j)$ entry of the matrix product:

$$[AA^T]_{ij} = \sum_{r=1}^n [A]_{ir}[A^T]_{rj} = \sum_{r=1}^n a_{ir}a_{jr}.$$

Since the $i$th row of $A$ is the vector $(a_{i1}, \dots, a_{in})$, the above sum is exactly the inner product of the $i$th and $j$th rows. Therefore the rows are all orthonormal (and hence a basis) if and only if $[AA^T]_{ij} = 1$ when $i = j$ and $0$ otherwise; this is the same as $AA^T = I_n$.
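The identification of $[AA^T]_{ij}$ with the inner product of rows $i$ and $j$ can be illustrated concretely. This Python sketch uses a hypothetical example matrix (built from a 3-4-5 right triangle, so its rows are orthonormal) and checks both claims: each entry of $AA^T$ equals a row inner product, and for an orthogonal matrix these entries form the identity.

```python
import math

# An example 2x2 orthogonal matrix: rows are unit vectors at right angles.
A = [[0.6, 0.8],
     [-0.8, 0.6]]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

for i in range(2):
    for j in range(2):
        # The (i, j) entry of A A^T, computed from the multiplication formula.
        entry = sum(A[i][r] * A[j][r] for r in range(2))
        # It equals the inner product of row i with row j...
        assert math.isclose(entry, dot(A[i], A[j]))
        # ...and, since the rows are orthonormal, it is 1 if i == j, else 0.
        assert math.isclose(entry, 1.0 if i == j else 0.0, abs_tol=1e-12)
```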

(ii) $\Leftrightarrow$ (iv): The columns of $A$ are orthonormal if and only if the rows of $A^T$ are orthonormal, so we apply the same argument as above, with $A$ replaced by $A^T$.

(ii) $\Leftrightarrow$ (v): By the definition of the standard inner product, for any $x, y \in \mathbb{R}^n$:

$$\langle Ax, Ay\rangle = (Ax)^T(Ay) = (x^T A^T)(Ay) = x^T(A^T A)y.$$

The second equality used that $(AB)^T = B^T A^T$, which is an elementary property of the transpose, seen in MATH105; the third equality used associativity of matrix multiplication. Now it is clear that if $A^T A = I_n$ then (v) is true. For the reverse implication, note that (v) gives $x^T(A^T A - I_n)y = \langle Ax, Ay\rangle - \langle x, y\rangle = 0$ for all $x, y \in \mathbb{R}^n$, so Exercise 5.3 shows that $A^T A - I_n = \mathbf{0}$, as required. ∎
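The chain of equalities above holds for any square matrix, orthogonal or not; only the final step uses $A^T A = I_n$. The following Python sketch checks the identity $\langle Ax, Ay\rangle = x^T(A^T A)y$ on a deliberately non-orthogonal example matrix and arbitrary vectors.

```python
import math

# The identity <Ax, Ay> = x^T (A^T A) y holds for ANY square A,
# not just orthogonal ones; only the last step of the proof needs A^T A = I.
A = [[1.0, 2.0],
     [3.0, 4.0]]  # an arbitrary, non-orthogonal matrix
x, y = [0.5, -1.0], [2.0, 3.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

Ax, Ay = matvec(A, x), matvec(A, y)

# Build A^T A: its (i, j) entry is sum_r A[r][i] * A[r][j].
AtA = [[sum(A[r][i] * A[r][j] for r in range(2)) for j in range(2)]
       for i in range(2)]

# Left-hand side <Ax, Ay> versus right-hand side x^T (A^T A) y:
assert math.isclose(dot(Ax, Ay), dot(x, matvec(AtA, y)))
```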

Exercise 5.3:

For $A \in M_n(\mathbb{R})$ assume that

$$x^T A y = 0$$

for all $x, y \in \mathbb{R}^n$. Prove that $A$ is the zero matrix. [See also: Theorem 3.6]

Exercise 5.4:

In Theorem 5.1, prove directly that (v) implies (iv).

[Hint: The columns of $A$ are $Ae_i$, where $e_i \in \mathbb{R}^n$ is a standard basis vector.]

[End of Exercise]