$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\span}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$
In these notes, we discuss some special classes of (square) matrices that naturally arise from the operations that we have mentioned.
A square matrix $A$ is nilpotent if $A^k = \vz$ for some $k \ge 1$.
Checkpoint. Certainly $\vz$ is nilpotent, but there are non-zero nilpotent matrices:
Verify that $A = \bm{0 & 1 & -1 \\ 0 & 0 & 2 \\ 0 & 0 & 0}$ is nilpotent.
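This checkpoint can be verified numerically; here is a minimal sketch using NumPy (the library choice is ours, not part of the notes):

```python
import numpy as np

# The checkpoint matrix from above.
A = np.array([[0, 1, -1],
              [0, 0,  2],
              [0, 0,  0]])

A2 = A @ A   # still non-zero: its (1, 3) entry is 2
A3 = A2 @ A  # the zero matrix, so A is nilpotent with k = 3

print(A3)
```

In fact every strictly upper triangular $n \times n$ matrix is nilpotent with $A^n = \vz$: each multiplication pushes the non-zero entries one diagonal further from the main diagonal.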
As we will see, shift operators give rise to nilpotent matrices. See the page for more examples.
A square matrix $A$ is idempotent if $A^2 = A$.
Checkpoint. Certainly, $\vz$ and $\vo$ are idempotent. But there are more:
Check that $A = \left[\begin{array}{rrr} 2 & 1 & -2 \\ 0 & 1 & 0 \\ 1 & 1 & -1 \end{array}\right]$ is idempotent.
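A quick numerical check of this checkpoint (a sketch using NumPy):

```python
import numpy as np

# The checkpoint matrix from above.
A = np.array([[2, 1, -2],
              [0, 1,  0],
              [1, 1, -1]])

print(np.array_equal(A @ A, A))  # True: A^2 = A, so A is idempotent
```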
Idempotent matrices correspond to projections in vector spaces. See the page for more information.
Exercise. Show that an invertible idempotent matrix must be the identity matrix.
A square matrix $A$ is upper triangular if all its entries below the main diagonal are zero, i.e. $a_{ij} = 0$ whenever $i > j$, and lower triangular if all its entries above the main diagonal are zero, i.e. $a_{ij} = 0$ whenever $i < j$.
A matrix is triangular if it is either upper or lower triangular.
The transpose of an upper triangular matrix is a lower triangular matrix and vice versa.
A matrix is diagonal if it is both upper and lower triangular. In other words, a diagonal matrix is a matrix whose off-diagonal entries are all zero, i.e. $a_{ij} = 0$ whenever $i \ne j$.
Example
The matrix $\bm{1 & 1 & 2 \\ 0 & 0 & 0 \\ 0 & 0 & -2}$ is upper triangular,
$\bm{1 & 0 & 0 \\ -1 & 1 & 0 \\ 0 & -1 & 0}$ is lower triangular, and $\bm{0 & 0 & 0 \\ 0 & -1 & 0 \\ 0 & 0 & 1}$ is diagonal.
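These shape conditions are easy to test numerically; a sketch using NumPy's `triu`/`tril`, which zero out the entries below (respectively above) the main diagonal:

```python
import numpy as np

# The three example matrices from above.
U = np.array([[1, 1, 2],
              [0, 0, 0],
              [0, 0, -2]])
L = np.array([[ 1,  0, 0],
              [-1,  1, 0],
              [ 0, -1, 0]])
D = np.array([[0,  0, 0],
              [0, -1, 0],
              [0,  0, 1]])

print(np.array_equal(U, np.triu(U)))  # True: U is upper triangular
print(np.array_equal(L, np.tril(L)))  # True: L is lower triangular
# D is both upper and lower triangular, hence diagonal:
print(np.array_equal(D, np.triu(D)) and np.array_equal(D, np.tril(D)))  # True
```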
A symmetric matrix is a matrix $A$ that equals its own transpose, i.e. $A^T = A$; in particular, a symmetric matrix must be square.
The diagonal entries of a square matrix are fixed by transpose, so diagonal matrices are symmetric.
They show up naturally as representations of (real) self-adjoint operators.
One obvious way of producing symmetric matrices is to take the product of a matrix with its transpose: $A^TA$ is always symmetric, since $(A^TA)^T = A^T(A^T)^T = A^TA$.
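This holds for any matrix $A$, square or not; a sketch using a random NumPy matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))  # A need not be square

S = A.T @ A                      # a 4 x 4 product
print(np.allclose(S, S.T))       # True: S is symmetric
```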
The complex counterparts of symmetric matrices are Hermitian matrices: complex matrices $A$ that equal their Hermitian (conjugate) transpose, i.e. $A^* = A$.
Certainly a real symmetric matrix, when treated as a complex matrix, is Hermitian.
Example The matrix $\left[\begin{array}{rrr} 1 & i & 0 \\ -i & -1 & -2 i + 1 \\ 0 & 2 i + 1 & 3 \end{array}\right]$ is Hermitian. And the matrix $\left[\begin{array}{rrr} -2 & -7 & 0 \\ -7 & 1 & -4 \\ 0 & -4 & -1 \end{array}\right]$ is symmetric.
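A numerical check of the Hermitian example (in NumPy, `1j` denotes $i$ and `.conj().T` computes the Hermitian transpose):

```python
import numpy as np

# The Hermitian example matrix from above.
H = np.array([[ 1,      1j,     0     ],
              [-1j,    -1,      1 - 2j],
              [ 0,      1 + 2j, 3     ]])

print(np.array_equal(H.conj().T, H))  # True: H^* = H
```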
An orthogonal matrix is a real square matrix $A$ such that $A^TA = I_n$, i.e. $A^T = A^{-1}$. They arise as the group of matrices that preserve the dot product on $\Rr^n$. Note that the defining equation makes sense over an arbitrary field $K$; over $\Rr$, a matrix is orthogonal exactly when its columns form an orthonormal basis of $\Rr^n$.
The complex counterparts of orthogonal matrices are unitary matrices: invertible complex square matrices whose inverse is their Hermitian transpose. That is, $A$ is unitary if $A^*A = I_n$.
Checkpoint. Check that the matrix $U = \frac{1}{2}\left[\begin{array}{rrr} 1 & -i & i - 1 \\ i & 1 & i + 1 \\ i + 1 & i - 1 & 0 \end{array}\right]$ is unitary.
Check that the matrix $R = \left[\begin{array}{rrr} \frac{1}{2} & -\frac{\sqrt{3}}{2} & 0 \\ \frac{\sqrt{3}}{2} & \frac{1}{2} & 0 \\ 0 & 0 & 1 \end{array}\right]$ is orthogonal.
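Both checkpoints can be verified at once; a NumPy sketch (using `allclose` to absorb floating-point error in the entries of $R$):

```python
import numpy as np

# The two checkpoint matrices from above.
U = 0.5 * np.array([[1,      -1j,     -1 + 1j],
                    [1j,      1,       1 + 1j],
                    [1 + 1j, -1 + 1j,  0     ]])

R = np.array([[0.5,          -np.sqrt(3)/2, 0],
              [np.sqrt(3)/2,  0.5,          0],
              [0,             0,            1]])

I3 = np.eye(3)
print(np.allclose(U.conj().T @ U, I3))  # True: U is unitary
print(np.allclose(R.T @ R, I3))         # True: R is orthogonal
```

Note that $R$ is the rotation of $\Rr^3$ by $60^\circ$ about the $z$-axis, so it visibly preserves the dot product.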