In this subsection, we consider arbitrary rectangular matrices. A concept that turns out to be useful in practice is the transpose of a matrix. Roughly, starting with a matrix $A$, we flip it about the diagonal terms of $A$, so that the rows of $A$ form the columns of $A^T$ and vice-versa.
Let $A \in \mathbb{R}^{m \times n}$, for some integers $m, n \geq 1$. The transpose of $A$ is the matrix $A^T \in \mathbb{R}^{n \times m}$ with coefficients
\[
(A^T)_{ij} = A_{ji}, \qquad 1 \le i \le n, \ 1 \le j \le m.
\]
Example 1.5.2.
If
\[
A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix} \in \mathbb{R}^{2 \times 3},
\qquad \text{then} \qquad
A^T = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix} \in \mathbb{R}^{3 \times 2}.
\]
Some authors write the transpose of a matrix $A$ as $A^t$ or $A'$, or even ${}^t\!A$.
The next result outlines the main properties of the transpose. See Section 3 for the definition of the inverse $A^{-1}$ of a matrix $A$.
Theorem 1.5.4.
Let $A \in \mathbb{R}^{m \times n}$. The following properties hold.
$(A^T)^T = A$.
If $B \in \mathbb{R}^{n \times p}$ for some integer $p \geq 1$, then $AB$ and $B^T A^T$ are defined, and $(AB)^T = B^T A^T$.
For the proof, we write $M_{ij}$ for the $(i,j)$ coefficient of a matrix $M$.
By definition of the transpose, we have $(A^T)_{ij} = A_{ji}$. By iterating the transpose, we obtain
\[
\big((A^T)^T\big)_{ij} = (A^T)_{ji} = A_{ij},
\]
and so, $(A^T)^T = A$.
We have $A \in \mathbb{R}^{m \times n}$ and $B \in \mathbb{R}^{n \times p}$, so $AB \in \mathbb{R}^{m \times p}$ is defined and both $(AB)^T$ and $B^T A^T$ belong to $\mathbb{R}^{p \times m}$. We need to check that $\big((AB)^T\big)_{ij} = (B^T A^T)_{ij}$ for all indices $1 \le i \le p$ and $1 \le j \le m$. We have, by definition of the transpose and of matrix multiplication,
\[
\big((AB)^T\big)_{ij} = (AB)_{ji} = \sum_{k=1}^{n} A_{jk} B_{ki},
\]
which we compare with
\[
(B^T A^T)_{ij} = \sum_{k=1}^{n} (B^T)_{ik} (A^T)_{kj} = \sum_{k=1}^{n} B_{ki} A_{jk}.
\]
Since $A_{jk} B_{ki} = B_{ki} A_{jk}$ for all indices $k$ (the coefficients are real numbers, which commute), the two sums are equal, which shows that $(AB)^T = B^T A^T$.
∎
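As a quick numerical sanity check of the two identities above, here is a minimal NumPy sketch (an illustration added for convenience, not part of the argument); the sizes m, n, p and the random entries are arbitrary choices.

import numpy as np

rng = np.random.default_rng(0)
m, n, p = 2, 3, 4                      # arbitrary sizes
A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

# Property 1: (A^T)^T = A
assert np.allclose(A.T.T, A)

# Property 2: (AB)^T = B^T A^T
assert np.allclose((A @ B).T, B.T @ A.T)
print("Both transpose identities hold on this example.")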
Many matrices that arise naturally, such as the correlation matrix in statistics, have a special property: they are symmetric.
Let $A \in \mathbb{R}^{n \times n}$, for some integer $n \geq 1$. We say that $A$ is symmetric if $A^T = A$. We say that $A$ is skew-symmetric if $A^T = -A$.
The terms symmetric and skew-symmetric are defined for square matrices only.
If $A$ is skew-symmetric, then the elements on the diagonal of $A$ are all zero: indeed, $A_{ii} = (A^T)_{ii} = -A_{ii}$ forces $A_{ii} = 0$ for every index $i$.
Most matrices are neither symmetric nor skew-symmetric.
Compare the notions of (skew-)symmetry of matrices with the concept of parity of functions in analysis.
Example 1.5.7.
The matrix $\begin{pmatrix} 1 & 2 \\ 2 & 3 \end{pmatrix}$ is symmetric.
The matrix $\begin{pmatrix} 0 & 1 \\ -1 & 0 \end{pmatrix}$ is skew-symmetric.
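For readers who like to experiment, the following small NumPy sketch tests a matrix for symmetry or skew-symmetry on the two matrices of Example 1.5.7; the helper names is_symmetric and is_skew_symmetric are ad hoc choices, not standard library functions.

import numpy as np

def is_symmetric(A: np.ndarray) -> bool:
    # A must be square and equal to its transpose.
    return A.shape[0] == A.shape[1] and np.array_equal(A, A.T)

def is_skew_symmetric(A: np.ndarray) -> bool:
    # A must be square and equal to minus its transpose.
    return A.shape[0] == A.shape[1] and np.array_equal(A, -A.T)

S = np.array([[1, 2], [2, 3]])    # symmetric
K = np.array([[0, 1], [-1, 0]])   # skew-symmetric
print(is_symmetric(S), is_skew_symmetric(K))  # True True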
Let $A$ be any square matrix. Then $A A^T$ is symmetric. Indeed, by Theorem 1.5.4, we have
\[
(A A^T)^T = (A^T)^T A^T = A A^T.
\]
Thus, $A A^T$ is a symmetric matrix.
Let $A$ be a symmetric matrix and $B$ a skew-symmetric matrix of the same size. Then $A B A$ is skew-symmetric. Indeed, by Theorem 1.5.4, we have
\[
(A B A)^T = A^T B^T A^T = A (-B) A = -A B A.
\]
So, $A B A$ is skew-symmetric.
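These last two facts can also be checked numerically; the sketch below is illustrative only, with an arbitrary size n and random test matrices (the combinations M + M^T and M - M^T are simply a convenient way to produce symmetric and skew-symmetric inputs).

import numpy as np

rng = np.random.default_rng(1)
n = 4  # arbitrary size

# A A^T is symmetric for any square A.
A = rng.standard_normal((n, n))
assert np.allclose((A @ A.T).T, A @ A.T)

# For S symmetric and K skew-symmetric, S K S is skew-symmetric.
M = rng.standard_normal((n, n))
S = M + M.T          # a symmetric matrix
K = M - M.T          # a skew-symmetric matrix
assert np.allclose((S @ K @ S).T, -(S @ K @ S))
print("Both identities verified numerically.")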