$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\sp}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\vy}{\mathbf{y}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\vh}{\mathbf{h}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$

Basis and Dimension III

In this part, we discuss how to find a basis for various kinds of subspaces of $K^n$.

Nullspace (Kernel)

We already know how to find a basis of the nullspace of a matrix $A$ since nullspace($A$) is simply the solution space of $A\vx = \vz$.

The nullity of a matrix $A$ is the dimension of its nullspace (kernel).

**Example 1.** Let $A$ be a matrix whose reduced row echelon form is $$R_A = \bm{1 & -2 & 0 & 1 \\ 0 & 0 & 1 & 4}.$$

Finding a basis of the nullspace of $A$ is equivalent to solving $A\vx = \vz$, which has the same solution set as $R_A\vx = \vz$. Reading off $R_A$, the pivotal variables are $x_1, x_3$ and the free variables are $x_2, x_4$.

Using $s_1, s_2$ to denote the two free variables, we can view the general solution as a parametrization of the nullspace by $K^2$.

\begin{equation}\label{eq:nullsp} \bm{ s_1 \\ s_2} \in K^2 \longmapsto \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix}= \begin{bmatrix} 2s_1 - s_2 \\ s_1 \\ -4s_2 \\ s_2 \end{bmatrix}= s_1 \begin{bmatrix} 2 \\ \color{red}{1} \\ 0 \\ \color{blue}{0} \end{bmatrix} + s_2 \begin{bmatrix} -1 \\ \color{red}{0} \\ -4 \\ \color{blue}{1} \end{bmatrix} \end{equation}

This expresses the nullspace of $A$ as the span of the following two vectors, which are linearly independent since their entries in the rows corresponding to the free variables (shown in red and blue) form an identity matrix:

\begin{equation} \label{eq:null_basis} \begin{bmatrix} 2 \\ \color{red}{1} \\ 0 \\ \color{blue}{0} \end{bmatrix} , \ \begin{bmatrix} -1 \\ \color{red}{0} \\ -4 \\ \color{blue}{1} \end{bmatrix} \end{equation}

So the two vectors above form a basis of the nullspace of $A$, and the nullity of $A$ is $2$.
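As a sanity check, here is a minimal sketch in Python using SymPy. It assumes only the reduced matrix $R_A$ above, since $A\vx = \vz$ and $R_A\vx = \vz$ have the same solutions.

```python
from sympy import Matrix

# The reduced row echelon form R_A from Example 1; using A itself
# would give the same answer, since row equivalent matrices have
# the same nullspace.
R_A = Matrix([
    [1, -2, 0, 1],
    [0,  0, 1, 4],
])

# nullspace() returns a list of column vectors spanning the kernel,
# parametrized by the free variables exactly as in Example 1.
basis = R_A.nullspace()
for v in basis:
    print(v.T)       # Matrix([[2, 1, 0, 0]]) and Matrix([[-1, 0, -4, 1]])

print(len(basis))    # 2, the nullity of A
```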

Span

We now discuss how to produce a basis of the span of a finite set of vectors $A \subseteq K^n$.

If we do not require that the basis produced be a subset of $A$, then we can simply identify the elements of $A$ with the rows of a matrix, still called $A$; the nonzero rows of $R_A$ then form a basis of the row space of $A$ (Notes 14 Theorem 3, Notes 19 Proposition 4).
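SymPy implements this recipe in `rowspace()`, which returns the nonzero rows of an echelon form of the matrix. A minimal sketch, using a made-up matrix (not one from these notes):

```python
from sympy import Matrix

# Identify the given vectors with the rows of a matrix.
A = Matrix([
    [1, 2, 3],
    [2, 4, 6],   # twice row 1, so it adds nothing to the span
    [0, 1, 1],
])

# The nonzero rows of an echelon form of A span the row space,
# and they are linearly independent, hence a basis.
print(A.rowspace())  # [Matrix([[1, 2, 3]]), Matrix([[0, 1, 1]])]
```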

Checkpoint 1. Find a basis for the span of the following set of vectors:

$$\left\{\left[\begin{array}{c} -2 \\ 0 \\ -1 \end{array}\right], \left[\begin{array}{c} -2 \\ 0 \\ -2 \end{array}\right], \left[\begin{array}{c} 2 \\ -2 \\ -2 \end{array}\right], \left[\begin{array}{c} -1 \\ 0 \\ 0 \end{array}\right]\right\}$$

However, if we want to find a subset of $A$ that forms a basis of $\sp{A}$ (such a subset exists by Theorem 2, Notes 20), or, equivalently, a subset of the columns of the matrix $A$ that forms a basis of its column space, then we can resort to the following result.

Theorem 1. Suppose $A$ and $A'$ are row equivalent, and write $\va_j$, $\va'_j$ for the $j$-th columns of $A$, $A'$, respectively. Then $\{\va_{i_1}, \ldots, \va_{i_r}\}$ is a basis of the column space of $A$ if and only if $\{\va'_{i_1}, \ldots, \va'_{i_r}\}$ is a basis of the column space of $A'$.

Corollary 2. The columns of $A$ corresponding to the pivotal columns of $R_A$ form a basis of the column space of $A$. In particular, column rank of $A$ = column rank of $R_A$.

Proof. The pivotal columns of $R_A$ clearly form a basis of its column space (Proposition 4 (2), Notes 19). Since $A$ and $R_A$ are row equivalent, Corollary 2 now follows from Theorem 1.
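Corollary 2 is also how column space bases are computed in practice; for instance, SymPy's `columnspace()` returns precisely the columns of $A$ sitting at the pivotal positions of $R_A$. A minimal sketch, again with a made-up matrix:

```python
from sympy import Matrix

A = Matrix([
    [1, 2, 0],
    [2, 4, 1],
    [3, 6, 1],
])  # column 2 is twice column 1

# rref() returns the pair (R_A, indices of the pivotal columns).
R_A, pivots = A.rref()
print(pivots)  # (0, 2): the 1st and 3rd columns are pivotal (0-indexed)

# Corollary 2: the corresponding columns of A itself form a basis
# of the column space of A; columnspace() returns exactly these.
print(A.columnspace() == [A.col(0), A.col(2)])  # True
```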

Theorem 3. For any matrix $A$, row rank of $A$ = column rank of $A$. We call this common value the rank of $A$.

Proof. Row rank of $A$ = row rank of $R_A$ = column rank of $R_A$ = column rank of $A$.

The first equality follows from the fact that $A$ and $R_A$ have the same row space (Theorem 3, Notes 14); the second equality was proved in Proposition 4, Notes 19; and the last equality is Corollary 2 above.
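A quick numerical spot check of Theorem 3 (a sketch; `randMatrix` just generates an arbitrary integer matrix):

```python
from sympy import randMatrix

# Row rank and column rank agree for any matrix (Theorem 3).
A = randMatrix(4, 6, min=-3, max=3)

row_rank = len(A.rowspace())     # number of nonzero rows of an echelon form
col_rank = len(A.columnspace())  # number of pivotal columns
assert row_rank == col_rank == A.rank()
print(A.rank())
```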

Before proving Theorem 1, let's see an example.

Example. Find a subset of the following set of vectors that forms a basis of its span. $$ \left\{\left(\begin{array}{r} -3 \\ -1 \\ 1 \\ 1 \end{array}\right), \left(\begin{array}{r} 6 \\ 2 \\ -2 \\ -2 \end{array}\right), \left(\begin{array}{r} -4 \\ -13 \\ 0 \\ -1 \end{array}\right), \left(\begin{array}{r} 0 \\ 3 \\ 3 \\ 0 \end{array}\right), \left(\begin{array}{r} -7 \\ -14 \\ 1 \\ 0 \end{array}\right)\right\} $$

Let $A$ be the matrix with these vectors as its columns. Row reduction gives $$ R_A = \bm{1 & -2 & 0 & 0 & 1 \\ 0 & 0 & 1 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 0}. $$ Since the 1st, 3rd and 4th columns of $R_A$ are pivotal, the corresponding columns of $A$, i.e. $$ \left\{\left(\begin{array}{r} -3 \\ -1 \\ 1 \\ 1 \end{array}\right), \left(\begin{array}{r} -4 \\ -13 \\ 0 \\ -1 \end{array}\right), \left(\begin{array}{r} 0 \\ 3 \\ 3 \\ 0 \end{array}\right)\right\} $$ form a basis of the span of $A$ (Corollary 2). Can you read off from $R_A$ the linear combinations of these basis vectors that give the other two vectors in $A$?
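The row reduction above can be verified in SymPy; a minimal sketch, with $A$ as described:

```python
from sympy import Matrix

# The five given vectors as the columns of A.
A = Matrix([
    [-3,  6,  -4, 0,  -7],
    [-1,  2, -13, 3, -14],
    [ 1, -2,   0, 3,   1],
    [ 1, -2,  -1, 0,   0],
])

R_A, pivots = A.rref()
print(pivots)  # (0, 2, 3): the 1st, 3rd and 4th columns are pivotal

# The entries of each non-pivotal column of R_A give the coefficients
# expressing the corresponding column of A in terms of the basis.
print(R_A)
```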

Proof (of Theorem 1). By symmetry, we only need to show that if $\{\va_{i_1}, \ldots, \va_{i_r}\}$ is a basis of the column space of $A$ then $\{\va'_{i_1}, \ldots, \va'_{i_r}\}$ is a basis of the column space of $A'$. Let $I = \{i_1, \ldots, i_r\}$. Since $\{\va_{i_1}, \ldots, \va_{i_r}\}$ is a basis of the column space of $A$, for every $j$ we can write $\va_j = A\vv$ for some $\vv$ with $v_k = 0$ for all $k \notin I$ (only the columns with indices in $I$ are used). Since $A$ and $A'$ are row equivalent, there exists an invertible matrix $M$ such that $A' = MA$. Therefore, $$ \va'_j = M\va_j = MA\vv = A'\vv.$$ Since $v_k = 0$ for $k \notin I$, this shows that every column of $A'$ is a linear combination of $\va'_{i_1}, \ldots, \va'_{i_r}$; that is, $\{\va'_{i_1}, \ldots, \va'_{i_r}\}$ spans the column space of $A'$.

To show that $\{\va'_{i_1}, \ldots, \va'_{i_r}\}$ is linearly independent, suppose $A'\vv = \vz$ for some $\vv$ with $v_j = 0$ for all $j \notin I$. We need to show that $v_j = 0$ for $j \in I$ as well. Note that $A\vv = M^{-1}A'\vv = M^{-1}\vz = \vz$. Since $\va_{i_1}, \ldots, \va_{i_r}$ are linearly independent, this equation implies that the $v_j$ ($j \in I$) must all be zero.
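Finally, Theorem 1 itself can be illustrated numerically: multiplying by any invertible $M$ leaves the pivotal positions unchanged, so the same index sets pick out bases of both column spaces. A minimal sketch with made-up matrices:

```python
from sympy import Matrix

A = Matrix([
    [1, 2, 0],
    [2, 4, 1],
    [3, 6, 1],
])

# Any invertible M yields a row equivalent matrix A' = M A.
M = Matrix([
    [1, 1, 0],
    [0, 1, 0],
    [2, 0, 1],
])
assert M.det() != 0
A_prime = M * A

# Row equivalent matrices have the same reduced row echelon form,
# hence pivotal columns in the same positions (Theorem 1).
assert A.rref()[1] == A_prime.rref()[1]
print(A.rref()[1])  # (0, 2)
```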