$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\span}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$

Matrix Operations II

Inverses

A square matrix is a matrix with the same number of rows and columns.

If $A,B$ are both in $\NN$ (short for $M_{n \times n}(K)$) then so is $AB$. In other words, $\NN$ is closed under matrix multiplication.

The entries $a_{ii}$ ($1 \le i \le n$) of a square matrix $A \in \NN$ are its diagonal entries.

The $n \times n$ identity matrix, denoted by $I_n$, is the $n \times n$ matrix whose diagonal entries are all $1$ and whose off-diagonal entries are all $0$.

For example, \begin{equation*} I_3 = \bm{1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1} \end{equation*}

We simply write $I_n$ as $I$ (or $\vo$) if $n$ is understood or need not be specified. It is clear that $AI_n = A = I_nA$ for all $A \in \NN$.
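As a quick sanity check of $AI_n = A = I_nA$, here is a minimal numpy sketch (the matrix $A$ below is an arbitrary example chosen for illustration):

```python
import numpy as np

n = 3
I = np.eye(n, dtype=int)          # the identity matrix I_3
A = np.array([[ 1, -1,  0],
              [ 1, -1,  1],
              [-1,  0,  1]])      # an arbitrary square matrix

# Multiplying by I on either side leaves A unchanged.
print(np.array_equal(A @ I, A) and np.array_equal(I @ A, A))  # True
```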

One verifies directly that $\NN$ equipped with matrix addition and multiplication is a ring with $\vz$ and $\vo$ as the additive and multiplicative identities, respectively.

In fact, it is a $K$-algebra if each scalar $c \in K$ is identified with $cI_n$.

A square matrix $A \in \NN$ is invertible if $AB = I_n = BA$ for some $B \in \NN$. Such a $B$ is called an inverse of $A$.

Checkpoint

Verify that $A=\bm{1 & -1 & 0 \\ 1 & -1 & 1\\ -1 & 0 & 1}$ and $B=\bm{-1 & 1 & -1 \\ -2 & 1 & -1 \\ -1 & 1 & 0}$ are inverses of each other.
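One way to carry out this checkpoint numerically (a numpy sketch, in place of the intended hand computation):

```python
import numpy as np

A = np.array([[ 1, -1,  0],
              [ 1, -1,  1],
              [-1,  0,  1]])
B = np.array([[-1,  1, -1],
              [-2,  1, -1],
              [-1,  1,  0]])
I = np.eye(3, dtype=int)

# AB = I = BA confirms that A and B are inverses of each other.
print(np.array_equal(A @ B, I), np.array_equal(B @ A, I))  # True True
```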

Let us verify that if $B$ and $C$ are inverses of $A$, then $B=C$.

Proof. By assumption we have $AB= BA = I = CA = AC$. So, $$ B = BI = B(AC) = (BA)C = IC = C.$$

If $A$ is invertible, its (unique) inverse is denoted by $A^{-1}$.

The invertible matrices in $\NN$ form a group under matrix multiplication with $I_n$ as the identity element. This group, denoted by $\GL_n(K)$, is called the general linear group over $K$ of order $n$.
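In particular, closure of $\GL_n(K)$ under multiplication reflects the identity $(AB)^{-1} = B^{-1}A^{-1}$. A quick numerical sketch (the matrices below are arbitrary invertible examples):

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 1.]])   # invertible: det(A) = 1
B = np.array([[1., 3.],
              [0., 1.]])   # invertible: det(B) = 1

# The inverse of a product is the product of the inverses in reverse order.
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```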

Questions Given a square matrix, how can we tell whether or not it is invertible? And if it is invertible, how can we find its inverse? We will answer these questions when we study systems of linear equations.

Trace

The trace of a square matrix $A$, denoted by $\tr(A)$, is the sum of its diagonal entries.

This definition may seem contrived at first sight; however, as we shall see, the trace is an important invariant of the operator represented by the matrix.

Example 1. Let

Checkpoint. What are $\tr(A)$ and $\tr(B)$? Check that $\tr(AB) = \tr(BA)$ for $A,B$ in the example above.
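Since the matrices of Example 1 are not reproduced here, note that the identity $\tr(AB) = \tr(BA)$ can be checked on any pair of square matrices; a numpy sketch with arbitrary integer matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(-5, 5, size=(3, 3))
B = rng.integers(-5, 5, size=(3, 3))

# The trace is the sum of the diagonal entries.
print(np.trace(A), np.trace(B))
print(np.trace(A @ B) == np.trace(B @ A))  # True for any square A, B
```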

Conjugate

For a square matrix $A$ over the complex numbers, its conjugate, denoted by $\ol{A}$, is the matrix whose entries are the complex conjugates of the entries of $A$.

Since complex conjugation preserves addition and multiplication of complex numbers (i.e. it is an automorphism of $\Cc$), conjugation on $M_n(\Cc)$ preserves matrix addition and multiplication, i.e. \begin{equation*} \ol{A + B} = \ol{A} + \ol{B}, \qquad \ol{AB} = \ol{A}\,\ol{B}. \end{equation*}

In other words, $A \mapsto \ol{A}$ is a ring homomorphism of $M_n(\Cc)$. In fact, it is an involution, i.e. $\ol{\ol{A}} = A$, since complex conjugation itself is one such map.

Moreover, since the fixed field of complex conjugation is $\Rr$, $\ol{A} = A$ if and only if $A$ is over $\Rr$, i.e. all its entries are real.

However, conjugation of complex matrices does not preserve scalar multiplication:

In general, $\ol{cA} = \ol{c}\,\ol{A}$, which need not equal $c\ol{A}$ unless $c \in \Rr$.

More generally, every automorphism $\sigma$ of $K$ extends entry-wise to a ring homomorphism of $\NN$, i.e. $\sigma([a_{ij}]) = [\sigma(a_{ij})]$.

The Hermitian transpose (or conjugate transpose) of a complex square matrix $A$, denoted by $A^*$ (or $A^{\dagger}$), is $\ol{A^T} = (\ol{A})^T$.
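The conjugate, its interaction with scalars, and the Hermitian transpose can all be illustrated in numpy (a sketch; `.conj()` and `.T` are the standard numpy conjugation and transpose operations, and the matrix $A$ below is an arbitrary example):

```python
import numpy as np

A = np.array([[1 + 2j, 3 + 0j],
              [0 + 0j, 4 - 1j]])
c = 1j

A_bar = A.conj()                     # entry-wise complex conjugate
A_star = A.conj().T                  # Hermitian transpose A*

print(np.array_equal(A_bar.conj(), A))                         # True: involution
print(np.array_equal((c * A).conj(), c.conjugate() * A_bar))   # True: conj(cA) = conj(c) conj(A)
print(np.array_equal((c * A).conj(), c * A_bar))               # False: not scalar-linear
```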