%display latex
latex.matrix_delimiters(left='[', right=']')
$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\span}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\vh}{\mathbf{h}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$
The determinant function has the following properties:
(1) $\det\begin{bmatrix} &\vdots & \\ \cdots &\mathbf{\color{blue} u} &\cdots \\ &\vdots & \\ \cdots &\mathbf{\color{red} v} &\cdots \\ &\vdots &\end{bmatrix} = (-1)\det\begin{bmatrix} &\vdots & \\ \cdots &\mathbf{\color{red} v} &\cdots \\ &\vdots & \\ \cdots &\mathbf{\color{blue} u} &\cdots \\ &\vdots &\end{bmatrix}$
(2) $\det\begin{bmatrix} &\vdots & \\ \cdots &{\color{red} c}\ {\mathbf u} &\cdots \\ &\vdots &\end{bmatrix}= {\color{red} c}\det\begin{bmatrix} &\vdots & \\ \cdots &{\mathbf u} &\cdots \\ &\vdots &\end{bmatrix} \quad (c \in K)$
(3) $\det\begin{bmatrix} &\vdots & \\ \cdots &{\mathbf u+ \mathbf v} &\cdots \\ &\vdots &\end{bmatrix} = \det\begin{bmatrix} &\vdots & \\ \cdots &\mathbf u &\cdots \\ &\vdots &\end{bmatrix} + \det\begin{bmatrix} &\vdots & \\ \cdots &\mathbf v &\cdots \\ &\vdots &\end{bmatrix}$
(4) $\det I = 1$.
Property (1) says that $\det$ is alternating; Properties (2) and (3) together say that $\det$ is linear in each row.
For the next three checkpoints, let
$$A =\left[\begin{array}{rrr} -2 & 0 & 2 \\ 1 & 1 & -1 \\ 1 & -1 & 0 \end{array}\right]$$

Checkpoint. Pick any two rows of $A$ and swap them; call the result $B$. Check that $\det(B) = -\det(A)$. (Property (1))
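One way to carry out this check in Sage (a minimal sketch; the choice of rows to swap is arbitrary, and row indices in Sage are 0-based):

```python
# Hypothetical check of Property (1): swap two rows of A and compare determinants
A = matrix(QQ, [[-2, 0, 2], [1, 1, -1], [1, -1, 0]])
B = copy(A)
B.swap_rows(0, 1)               # swap the first two rows (an arbitrary choice)
print(A.det(), B.det())         # -2 and 2
print(B.det() == -A.det())      # True
```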
Checkpoint. Pick a scalar $c$ and a row of $A$, and let $B$ be the matrix obtained by multiplying that row by $c$. Check that $\det(B) = c\det(A)$. (Property (2))
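A similar sketch for this checkpoint, assuming the (arbitrary) choices $c = 5$ and the third row:

```python
# Hypothetical check of Property (2): rescale one row of A by c
A = matrix(QQ, [[-2, 0, 2], [1, 1, -1], [1, -1, 0]])
c = 5
B = copy(A)
B.rescale_row(2, c)             # multiply row index 2 (the third row) by c
print(B.det() == c * A.det())   # True
```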
Checkpoint. Let
$$ B = \left[\begin{array}{rrr} -2 & 0 & 2 \\ 0 & 3 & 0 \\ 1 & -1 & 0 \end{array}\right]\quad \text{and} \quad C = \bm{-2 & 0 & 2 \\ 1+0 & 1 + 3 & (-1) + 0 \\ 1 & -1 & 0} = \bm{-2 & 0 & 2 \\1& 4 & -1 \\ 1 & -1 & 0} $$

Note that $A$, $B$ and $C$ share all but one row (the 2nd row), and that the 2nd row of $C$ is the sum of the 2nd row of $A$ and the 2nd row of $B$.
Check that $\det(C) = \det(A) + \det(B)$. This verifies Property (3).
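This can also be verified in Sage (a sketch using the matrices $A$, $B$ and $C$ above):

```python
# Check of Property (3) with the specific matrices A, B, C defined above
A = matrix(QQ, [[-2, 0, 2], [1, 1, -1], [1, -1, 0]])
B = matrix(QQ, [[-2, 0, 2], [0, 3, 0], [1, -1, 0]])
C = matrix(QQ, [[-2, 0, 2], [1, 4, -1], [1, -1, 0]])
print(C.det() == A.det() + B.det())   # True
```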
In general, the equality $\det(A+B) = \det(A) + \det(B)$ is *not* true. Give an example.
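One possible counterexample (among many): take $A = B = I_2$, so that $\det(A+B) = \det(2I_2) = 4$ while $\det(A) + \det(B) = 2$. In Sage:

```python
# One possible counterexample: A = B = the 2x2 identity matrix
A = identity_matrix(QQ, 2)
B = identity_matrix(QQ, 2)
print((A + B).det(), A.det() + B.det())   # 4 versus 2
```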
Proposition 1. Let $A$ be a square matrix.
i. If $A$ has two identical rows, then $\det(A) = 0$.
ii. If $A$ has a zero row, then $\det(A) = 0$.
Proof. For (i), suppose $A$ has two identical rows, i.e. $A$ looks like $\begin{bmatrix} \phantom{--} &\vdots &\phantom{--} \\ -- &\mathbf{\color{red} u} &-- \\ \phantom{--} &\vdots &\phantom{--} \\ -- &\mathbf{\color{blue} u} &-- \\ \phantom{--} &\vdots &\phantom{--} \end{bmatrix}$. Swapping these two rows gives
$A' = \begin{bmatrix} \phantom{--} &\vdots &\phantom{--} \\ -- &\mathbf{\color{blue} u} &-- \\ \phantom{--} &\vdots &\phantom{--} \\ -- &\mathbf{\color{red} u} &-- \\ \phantom{--} &\vdots &\phantom{--} \end{bmatrix} = A.$
So, by Property (1), $\det(A) = \det(A') = (-1)\det(A)$. Hence $2\det(A) = 0$, and therefore $\det(A) = 0$.
For (ii), if $A = \begin{bmatrix} \phantom{--} &\vdots &\phantom{--} \\ -- &\mathbf{0} &-- \\ \phantom{--} &\vdots &\phantom{--} \end{bmatrix}$, then since $\mathbf{0}=\mathbf{0} + \mathbf{0}$, Property (3) gives $\det(A) = \det(A) + \det(A)$, so $\det(A) = 0$.
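A quick numerical illustration in Sage (the matrices below are chosen arbitrarily, just to exhibit a repeated row and a zero row):

```python
# Proposition 1: a repeated row or a zero row forces the determinant to be 0
A1 = matrix(QQ, [[1, 2, 3], [4, 5, 6], [1, 2, 3]])   # first and third rows identical
A2 = matrix(QQ, [[1, 2, 3], [0, 0, 0], [7, 8, 9]])   # second row is zero
print(A1.det(), A2.det())                            # 0 0
```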
Corollary 2. If $A'$ is obtained from $A$ by adding a multiple of a row to another, then $\det(A') = \det(A)$.
In other words, the determinant of a matrix is unchanged by the third type of elementary row operation (ERO).
Proof. Suppose $A = \bm{ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & \\ \cdots &\vv &\cdots \\ &\vdots & }$ and $A' = \bm{ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & \\ \cdots &c\vu +\vv &\cdots \\ &\vdots &}$. Then it follows from Properties (2) and (3) and Proposition 1(i) that
$$ \det A' = \det \bm{ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & \\ \cdots &c\vu &\cdots \\ &\vdots & } + \det \bm{ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & \\ \cdots &\vv &\cdots \\ &\vdots & } = c\det\bm{ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & \\ \cdots &\vu &\cdots \\ &\vdots & } + \det A = 0 + \det A = \det A. $$
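A sketch of a Sage check of Corollary 2, using the matrix $A$ from the checkpoints and an arbitrary choice of rows and scalar:

```python
# Corollary 2: adding a multiple of one row to another does not change det
A = matrix(QQ, [[-2, 0, 2], [1, 1, -1], [1, -1, 0]])
Ap = copy(A)
Ap.add_multiple_of_row(2, 0, 7)   # row 2 += 7 * row 0 (arbitrary choice)
print(Ap.det() == A.det())        # True
```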
Proposition 3. For $A, B \in M_n(K)$:
i. $\det(A) = \det(A^T)$;
ii. $\det(AB) = \det(A)\det(B)$.
We will prove them in the next lecture.
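In the meantime, here is a possible numerical sanity check of (i) and (ii) in Sage, using randomly generated integer matrices:

```python
# Sanity check of Proposition 3 on random 3x3 integer matrices
A = random_matrix(ZZ, 3)
B = random_matrix(ZZ, 3)
print(A.det() == A.transpose().det())        # (i)
print((A * B).det() == A.det() * B.det())    # (ii)
```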
Proposition 3(ii) has a number of consequences:
(a) $\det(A^k) = \det(A)^k$ ($k \ge 0$)
(b) If $A$ is invertible then $\det(A^{-1}) = \dfrac{1}{\det(A)}$. In particular, $\det(A) \neq 0$ if $A$ is invertible.
Proof. (a) follows from (ii) by induction on $k$.
(b) If $A$ is invertible, then $A^{-1}A = I$. Applying the determinant to both sides, we have $$ \det(A^{-1})\det(A) = \det(A^{-1}A) = \det(I_n) = 1 $$ by Property (4) and (ii). Thus, $\det(A^{-1}) = \dfrac{1}{\det(A)}$.
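A sketch of a Sage check of (a) and (b), reusing the invertible matrix $A$ from the checkpoints and the (arbitrary) exponent $k = 4$:

```python
# (a) and (b): powers and inverses, using A from the checkpoints (det(A) = -2)
A = matrix(QQ, [[-2, 0, 2], [1, 1, -1], [1, -1, 0]])
print((A**4).det() == A.det()**4)            # (a) with k = 4
print(A.inverse().det() == 1 / A.det())      # (b)
```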
The determinant is not an arbitrary, cooked-up function: it is completely characterized by Properties (1) to (4).
Theorem 4. For every $n$, there is a unique function from $M_n(K)$ to $K$ that satisfies properties (1) to (4).
We call this unique function the determinant of order $n$ and denote it by $\det_n$ or simply $\det$.