$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\sp}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\vy}{\mathbf{y}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\vh}{\mathbf{h}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$

Basis and Dimension II

Theorem 1. Every vector space has a basis.

Proving this in general relies on a set-theoretic principle called the Axiom of Choice.

Theorem 2. Suppose $X$ is a linearly independent subset of $V$ contained in a finite spanning set $Y$ of $V$. Then there is a basis $\cB$ of $V$ between $X$ and $Y$, i.e. $X \subseteq \cB \subseteq Y$.

Proof. Let $\cB = X$.

While $Y \not \subseteq \sp{\cB}$, add to $\cB$ a vector $\vv \in Y \setminus \sp{\cB}$. By Lemma 3 below, the extended $\cB$ is still a linearly independent subset of $Y$. Since $Y$ is finite, this loop terminates in finitely many steps, resulting in a linearly independent subset $\cB$ of $Y$ with $\sp{\cB} \supseteq Y$. Since $Y$ spans $V$, so does $\cB$. Therefore, $X \subseteq \cB \subseteq Y$ is a basis of $V$.
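The loop in the proof is constructive. Here is a sketch of it in Python (my own illustration, not part of the notes), for vectors in $\Qq^n$ with exact `Fraction` arithmetic; the helper `rank` and the sample vectors are assumptions of this sketch. Testing whether $\vv \notin \sp{\cB}$ amounts to checking that adjoining $\vv$ raises the rank.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of rational row vectors, via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue                       # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            if M[i][c]:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def extend_to_basis(X, Y):
    """Grow the independent list X (a sublist of Y) into a basis of span(Y)."""
    B = list(X)
    for v in Y:
        # v not in span(B)  <=>  B + [v] is still independent (Lemma 3)
        if rank(B + [v]) == len(B) + 1:
            B.append(v)
    return B                               # X ⊆ B ⊆ Y, and B spans span(Y)

# Example: X = {e1} inside a spanning list Y of Q^3
X = [[1, 0, 0]]
Y = [[1, 0, 0], [1, 1, 0], [2, 2, 0], [0, 0, 1]]
B = extend_to_basis(X, Y)
assert len(B) == 3 and rank(B) == 3        # B is a basis of Q^3
```

Iterating over all of $Y$ in a fixed order is equivalent to the while-loop: once a vector of $Y$ lies in $\sp{\cB}$, it stays there, since $\cB$ only grows.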

Under the assumption that $V$ has a finite spanning set $Y$, Theorem 1 follows from Theorem 2, since $Y$ contains the empty set, which is linearly independent.

In our proof of Theorem 2, we used a lemma which is interesting in its own right.

Lemma 3. If $X$ is linearly independent and $\vv \notin \sp{X}$, then $X \cup \{\vv\}$ is also linearly independent.

Proof. Suppose $c\vv + c_1\vx_1 + \cdots + c_k\vx_k = \vz$ with $\vx_i \in X$ ($1 \le i \le k$). Then $c$ must be zero; otherwise,

$$ \vv = -\frac{1}{c}(c_1\vx_1 + \cdots + c_k\vx_k) \in \sp{X} $$

contradicting the assumption (note that the sum on the right can be the empty sum). But then

$$ c_1\vx_1 + \cdots + c_k\vx_k = \vz$$

which is either an empty sum or has all $c_i$'s zero, since $X$ is linearly independent. In either case, the original linear combination $c\vv + c_1\vx_1 + \cdots + c_k\vx_k$ is trivial. This completes the proof.

More characterizations of bases

Proposition 4. Let $V$ be a vector space.

  1. A minimal spanning set of $V$ is a basis.
  2. A maximal linearly independent subset of $V$ is a basis.

Elaborating on this,

(1) means if $Y$ is a spanning set of $V$ and no proper subset of $Y$ spans $V$, then $Y$ is a basis of $V$.

(2) means if $X$ is a linearly independent subset of $V$ and $X$ is not properly contained in any linearly independent subset of $V$ then $X$ is a basis of $V$.

Proof. For (1), suppose $Y$ is a minimal spanning set of $V$ but $Y$ is linearly dependent. Then there is some $\vv \in Y$ such that $\vv \in \sp{Y \setminus \{\vv\}}$. But then $Y \setminus \{\vv\}$ would still be a spanning set of $V$, contradicting the minimality of $Y$.

For (2), suppose $X$ is a maximal linearly independent set in $V$. If $\sp{X}$ is not the whole $V$, pick $\vv \in V \setminus \sp{X}$. By Lemma 3, $X \cup \{\vv\}$ is linearly independent, contradicting the maximality assumption on $X$.

Statement (1) of Proposition 4 suggests the following way to build, from a finite set $A$, a basis of $\sp{A}$ that is a subset of $A$. The existence of such a basis is guaranteed by Theorem 2.

Start with $\cB_0 = \emptyset$ and $A_0 = A$. At step $i \ge 0$,

if $A_i = \emptyset$, then stop.

Otherwise, pick an element $\va \in A_i$ and set $A_{i+1} = A_i \setminus \{\va\}$.

if $\cB_i \cup \{\va\}$ is linearly dependent, then $\cB_{i+1} = \cB_i$.

Otherwise $\cB_{i+1} = \cB_{i} \cup \{\va\}$.

Since $A$ is finite, the algorithm above will terminate, say at step $k$.

Each $\cB_i$ ($0 \le i \le k$) is linearly independent; in particular $\cB_k$ is. Moreover, when $\va$ is removed from $A_i$ at step $i$, either $\va$ joins $\cB_{i+1}$, or $\cB_i \cup \{\va\}$ is linearly dependent, which by Lemma 3 means $\va \in \sp{\cB_i}$. So every vector of $A$ is in some $\sp{\cB_i}$. Since $\cB_k$ contains all the $\cB_i$'s, it follows that $\sp{\cB_k} \supseteq A$, and hence $\cB_k$ is a spanning set of $\sp{A}$. Thus, $\cB_k$ is a subset of $A$ that is a basis of $\sp{A}$.
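The sifting procedure above can be sketched in code. This is my own illustration (not from the notes), again for vectors in $\Qq^n$: the dependence test for $\cB_i \cup \{\va\}$ is done with an exact rank computation, and the sample list `A` is an assumption of the sketch.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of rational row vectors, via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            if M[i][c]:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def sift(A):
    """Extract from the finite list A a basis of span(A)."""
    B = []                                 # B_0 = ∅
    for a in A:                            # a: the element removed from A_i
        if rank(B + [a]) == len(B) + 1:    # B_i ∪ {a} still independent
            B = B + [a]                    # B_{i+1} = B_i ∪ {a}
        # otherwise B_{i+1} = B_i
    return B                               # B_k: independent and spans span(A)

A = [[1, 1, 0], [2, 2, 0], [0, 1, 0], [1, 0, 0]]
B = sift(A)
assert len(B) == rank(A) == 2              # a two-element basis of span(A)
```

Here $[2,2,0]$ is discarded (it is twice $[1,1,0]$), then $[1,0,0]$ is discarded (it is $[1,1,0] - [0,1,0]$), leaving the basis $\{[1,1,0], [0,1,0]\}$ of the plane $\sp{A}$.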

Dimension

Theorem 4. The size of any linearly independent subset of $V$ is at most the size of any spanning set of $V$.

Proof. Suppose $X = \{\vx_1, \ldots, \vx_n\} \subseteq V$ is linearly independent and $Y = \{\vy_1, \ldots, \vy_m\}$ is a spanning set. Then for each $1 \le j \le n$, there exists $a_{ij} \in K$ ($1 \le i \le m$) such that

$$ \vx_j = \sum_{i=1}^m \vy_i a_{ij}. $$

If $m < n$, then, by the fundamental theorem of linear algebra, there would be a non-zero $\mathbf{c} = \bm{c_1 \\ \vdots \\ c_n}$ such that $A \mathbf{c} = \vz$, where $A$ is the $m \times n$ matrix $[a_{ij}]$, i.e. $\sum_{j=1}^n a_{ij}c_j = 0$ for each $1 \le i \le m$. But then

$$ \sum_{j=1}^n \vx_j c_j = \sum_{j=1}^n \sum_{i=1}^m \vy_i a_{ij}c_j = \sum_{i=1}^m \vy_i \sum_{j=1}^n a_{ij}c_j = \sum_{i=1}^m \vy_i0 = \vz.$$

This contradicts the linear independence of $X$. Therefore, we must have $n \le m$.
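As a concrete instance of Theorem 4 (my own illustration): $Y = \{\ve_1, \ve_2\}$ spans $\Qq^2$, so $m = 2$, and any $n = 3$ vectors of $\Qq^2$ must be linearly dependent. An exact rank computation over $\Qq$ confirms this for a sample triple.

```python
from fractions import Fraction

def rank(rows):
    """Rank of a list of rational row vectors, via Gaussian elimination."""
    M = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c]), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            if M[i][c]:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

# Y = {e1, e2} spans Q^2, so m = 2; any three vectors of Q^2 are dependent
X = [[1, 2], [3, 4], [5, 6]]
assert rank(X) < len(X)          # rank ≤ m = 2 < n = 3, so X is dependent
```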

Corollary. Any two bases of $V$ have the same size.

Proof. If $\cB_1, \cB_2$ are two bases of a vector space $V$, then $|\cB_1| \le|\cB_2|$ (by Theorem 4) because $\cB_1$ is linearly independent and $\cB_2$ is a spanning set.

Reversing the role of $\cB_1$ and $\cB_2$, we see that $|\cB_2| \le |\cB_1|$ hence $|\cB_1| = |\cB_2|$.

The dimension of $V$ (over $K$), denoted by $\dim_K V$ (or simply by $\dim V$), is the common size of its bases.

Example. $\dim_K K^n = n$. (We have already seen this in Notes 19.)

Remarks

  1. The fact that bases of a vector space have the same cardinality is true even if the vector space has no finite spanning set.
  2. Since the vector space structure of a set depends on the scalar field, so does its dimension. For example, $\dim_{\Cc} \Cc = 1$ ($\{1\}$ is a basis of $\Cc$ over $\Cc$) but $\dim_{\Rr} \Cc = 2$ ($\{1,i\}$ is a basis of $\Cc$ over $\Rr$).