$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\sp}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\vh}{\mathbf{h}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$

Linear Combinations I

A linear combination of $\vv_1, \ldots, \vv_n \in V$ is an expression of the form

$$ c_1\vv_1 + \cdots + c_n\vv_n \tag{1} $$

where $c_i \in K$ ($1 \le i \le n$). The scalar $c_i$ is called the coefficient of $\vv_i$ ($1 \le i \le n$) in the linear combination (1).

A linear combination is trivial if all its coefficients are zero. Clearly, a trivial linear combination sums to $\vz$.

Checkpoint 1. Two different linear combinations could have the same sum.

Take $\vv_1 = \left[\begin{array}{c} 1 \\ 2 \\ -1 \end{array}\right]$, $ \vv_2 = \left[\begin{array}{c} 1 \\ 1 \\ 0 \end{array}\right] $ and $\vv_3 = \left[\begin{array}{c} 1 \\ 0 \\ 1 \end{array}\right] $

Check that $\vv_1 + (-2)\vv_2 + \vv_3 = \vz$.

So $\vv_1+(-2)\vv_2 +\vv_3$ and $0\vv_1 + 0\vv_2 + 0\vv_3$, the trivial combination of $\vv_1,\vv_2$ and $\vv_3$, both sum to $\vz$.
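
For readers who like to double-check such computations by machine, here is a minimal SymPy sketch verifying the relation (it assumes the `sympy` package is available):

```python
# A machine check of Checkpoint 1 (a sketch; assumes sympy is installed).
from sympy import Matrix

v1 = Matrix([1, 2, -1])
v2 = Matrix([1, 1, 0])
v3 = Matrix([1, 0, 1])

# v1 + (-2) v2 + v3 should sum to the zero vector
print(v1 + (-2)*v2 + v3)  # Matrix([[0], [0], [0]])
```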

Span

Let $\emptyset \neq X \subseteq V$. The span of $X$, denoted by span($X$) or $\sp{X}$, consists of the sums of linear combinations of elements of $X$. That is,

$$\sp{X} = \left\{\vv \in V \colon \vv = \sum_{i=1}^n c_i\vx_i\ \text{for some}\ n \ge 1, \vx_i \in X, c_i \in K, (1 \le i \le n)\right\}.$$

The only linear combination of the empty set is the empty combination, which has no coefficients (and is hence vacuously trivial). The span of $\emptyset$ is therefore defined to be $\{\vz\}$.

Note that for any $X$, $\vz \in \sp{X}$; hence $\sp{X}$ is never empty.

A linear relation among elements of $X$ is an equation of the form $c_1\vx_1 + \cdots + c_n\vx_n = \vz$. The linear relation is non-trivial if $c_i \neq 0$ for some $i$.

From the non-trivial linear relation $\vv_1 - 2\vv_2 + \vv_3 = \vz$ in Checkpoint 1, we get that $\vv_1 = 2\vv_2 - \vv_3$ is a linear combination of $\vv_2$ and $\vv_3$. In other words, $\vv_1 \in \sp{\{\vv_2,\vv_3\}}$.

Checkpoint 2. Is $\vv_2$ in span$(\vv_1,\vv_3)$? Is $\vv_3$ in span$(\vv_1,\vv_2)$?

Example. Let $A = \bm{1 & -1 \\ 0 & 1}$, $B=\bm{0 & 3 \\ 2 & -1}$. Then $\sp{\{A,B\}} = \left\{c_1\bm{1 & -1 \\ 0 & 1} + c_2\bm{0 & 3 \\ 2 & -1} \colon c_1,c_2 \in K\right\}$ and

$$\bm{2 & -5 \\ -2 & 3} = 2A -B \in \sp{\{A,B\}}.$$
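
A quick machine check of this computation (a sketch, assuming the `sympy` package is available):

```python
# Verify that 2A - B equals the matrix displayed above.
from sympy import Matrix

A = Matrix([[1, -1], [0, 1]])
B = Matrix([[0, 3], [2, -1]])

print(2*A - B)  # Matrix([[2, -5], [-2, 3]])
```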

Checkpoint 3. Give another nonzero vector in the span of $\{A,B\}$. Can the zero vector be obtained as a non-trivial linear combination of $A$ and $B$? Why or why not?

Proposition 1. $\sp{X}$ is the smallest vector subspace of $V$ that contains $X$. That is, $X \subseteq \sp{X} \le V$, and if $X \subseteq W \le V$, then $\sp{X} \subseteq W$.

Proof. Clearly, $\sp{X} \supseteq X$. Suppose $W$ is a subspace of $V$ containing $X$. Since $W$ is closed under addition and scalar multiplication, $W \supseteq \sp{W} \supseteq \sp{X}$. It remains to show that $\sp{X}$ is a subspace of $V$. First, $\vz \in \sp{X}$, so it is non-empty. If $\vv = \sum c_i \vx_i$ and $\vv' = \sum c_i'\vx_i'$ are in $\sp{X}$ and $c \in K$, then $c\vv + \vv'$ is the sum of the linear combination $\sum (cc_i)\vx_i + \sum c_i'\vx_i'$ of elements of $X$; therefore $\sp{X}$ is a subspace.

Corollary. For any $X \subseteq V$, $\sp{\sp{X}} = \sp{X}$.

Proof. $\sp{X}$ is a vector subspace of $V$ containing $\sp{X}$, hence $\sp{\sp{X}} \subseteq \sp{X}$. The reverse inclusion is clear.

Column space and Row space

The column space (resp. row space) of a matrix is the span of its columns (resp. rows).

The column space of $A \in \MN$ is thus a subspace of $K^m$, and the row space of $A$ is a subspace of $K^n$.
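
Both spaces are easy to compute by machine. The following SymPy sketch (assuming `sympy` is available) extracts bases for the column space and row space of a $2 \times 3$ example, which live in $K^2$ and $K^3$ respectively:

```python
# Column space vs. row space of a 2 x 3 matrix.
from sympy import Matrix

A = Matrix([[1, 2, 0], [0, 1, 1]])

print(A.columnspace())  # basis of the column space: vectors in K^2
print(A.rowspace())     # basis of the row space: vectors in K^3
```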

The fundamental question is:

Given a finite set of vectors $A$ and a vector $\vb$, is $\vb \in \sp{A}$? That is, can $\vb$ be written as a linear combination of vectors from $A$? If so, write $\vb$ as a linear combination of the elements of $A$.

This question can be answered by row reduction.

Recall that $$ A\vx = \gm{\va_1}{\va_n}\bm{x_1 \\ \vdots \\ x_n} = x_1\va_1 + x_2\va_2 + \cdots + x_n\va_n. $$ Therefore, by identifying the (ordered) set $A = \{\va_1, \ldots, \va_n\}$ with the matrix, also named $A$, whose $i$-th column is $\va_i$, we have

Proposition 2.

$\vb \in \sp{A}$ (the column space of $A$) $\iff$ the equation $A\vx = \vb$ is consistent.

Moreover, the solutions of $A\vx = \vb$ are in one-to-one correspondence with the linear combinations of $\va_1,\ldots,\va_n$ that express $\vb$.
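
In computational terms, Proposition 2 looks as follows; this SymPy sketch (assuming `sympy` is available) retests the fact from Checkpoint 1 that $\vv_1 \in \sp{\{\vv_2,\vv_3\}}$ by solving the corresponding system:

```python
# v1 is in span{v2, v3} iff the system [v2 v3] c = v1 is consistent.
from sympy import Matrix, linsolve, symbols

v1 = Matrix([1, 2, -1])
v2 = Matrix([1, 1, 0])
v3 = Matrix([1, 0, 1])

A = v2.row_join(v3)                 # matrix whose columns are v2, v3
c1, c2 = symbols('c1 c2')
print(linsolve((A, v1), [c1, c2]))  # {(2, -1)}, i.e. v1 = 2 v2 - v3
```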

Example 3. Let us decide if the vector $ \vb= \bm{-5 \\ -1 \\3} $ is in the span of $ \va_1 = \left[\begin{array}{c} 2\cr 1\cr -1 \end{array}\right], \va_2 = \left[\begin{array}{c} -2\cr -2\cr 1 \end{array}\right], $ and $ \va_3 = \left[\begin{array}{c} 1\cr 2\cr -1 \end{array}\right]. $

By the discussion above, we can answer this by deciding whether the system

$$ \bm{ 2 & -2 &1 \\ 1 &-2 &2 \\ -1 & 1 & -1} \bm{x_1 \\x_2 \\x_3} = \bm{-5 \\ -1 \\3} $$

is consistent or not.

As before, to answer this, we form the augmented matrix $M$ of the system and compute its rref:

$$ M = \left[\begin{array}{ccc|c} 2 & -2 & 1 & -5 \\ 1 & -2 & 2 & -1 \\ -1 & 1 & -1 & 3 \end{array}\right] \longrightarrow \left[\begin{array}{ccc|c} 1 & 0 & 0 & -5 \\ 0 & 1 & 0 & -3 \\ 0 & 0 & 1 & -1 \end{array}\right]. $$

From this we know that the system is consistent and has a unique solution, namely $x_1 = -5$, $x_2 = -3$ and $x_3 = -1$.

That means

$$ \vb = \begin{bmatrix} -5 \\ -1 \\3 \end{bmatrix} = \begin{bmatrix} 2 & -2 &1 \\ 1 &-2 &2 \\ -1 & 1 & -1 \end{bmatrix} \begin{bmatrix} \color{red}{-5} \\ \color{blue}{-3} \\ \color{green}{-1} \end{bmatrix} = (\color{red}{-5}) \begin{bmatrix} 2 \\ 1 \\ -1 \end{bmatrix} +(\color{blue}{-3}) \begin{bmatrix} -2\cr -2\cr 1 \end{bmatrix} +(\color{green}{-1}) \begin{bmatrix} 1\cr 2\cr -1 \end{bmatrix} $$

which expresses $\vb$ as a linear combination of the $\va_i$'s.
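
The row reduction above is easily reproduced by machine (a sketch, assuming `sympy` is available):

```python
# Example 3: row reduce the augmented matrix and read off the coefficients.
from sympy import Matrix

A = Matrix([[2, -2, 1], [1, -2, 2], [-1, 1, -1]])
b = Matrix([-5, -1, 3])

M = A.row_join(b)  # the augmented matrix [A | b]
R, pivots = M.rref()
print(R)           # its last column holds the solution (-5, -3, -1)
print(A * Matrix([-5, -3, -1]) == b)  # True
```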

Example 4. Decide if $\vb = \bm{1 \\ 1 \\ 2} $ is in the span of $\va_1=\bm{1 \\ -2 \\ 1}, \va_2 = \bm{2 \\ -3\\ -1}, \va_3 = \bm{1 \\ 0 \\-5}$

As before, we form the augmented matrix $M = [A|\vb]$ of the system and compute its rref:

$$ M = \left[\begin{array}{ccc|c} 1 & 2 & 1 & 1 \\ -2 & -3 & 0 & 1 \\ 1 & -1 & -5 & 2 \end{array}\right] \longrightarrow \left[\begin{array}{ccc|c} 1 & 0 & -3 & 0 \\ 0 & 1 & 2 & 0 \\ 0 & 0 & 0 & 1 \end{array}\right]. $$

The last row corresponds to the equation $0x_1 + 0x_2 + 0x_3 = 1$, which is inconsistent.

So we conclude that $\vb=\bm{1\\1\\2}$ is not in the span of $\va_1,\va_2, \va_3$.
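
An equivalent machine test (a sketch, assuming `sympy` is available): the system is inconsistent exactly when appending $\vb$ to $A$ increases the rank.

```python
# Example 4: b is not in the column space since the rank jumps.
from sympy import Matrix

A = Matrix([[1, 2, 1], [-2, -3, 0], [1, -1, -5]])
b = Matrix([1, 1, 2])

print(A.rank(), A.row_join(b).rank())  # 2 3 -> inconsistent
```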

Example 5. Decide if $\vb = \bm{2\\-1\\-7}$ is in the span of the vectors $\va_1,\va_2,\va_3$ as in Example 4.

This time, row reducing the augmented matrix gives

$$ M = \left[\begin{array}{ccc|c} 1 & 2 & 1 & 2 \\ -2 & -3 & 0 & -1 \\ 1 & -1 & -5 & -7 \end{array}\right] \longrightarrow \left[\begin{array}{ccc|c} 1 & 0 & -3 & -4 \\ 0 & 1 & 2 & 3 \\ 0 & 0 & 0 & 0 \end{array}\right]. $$

The system is consistent, hence $\vb = \bm{2\\-1\\-7}$ is in the span of $\va_1, \va_2, \va_3$.

Moreover, by solving the system: $$ \begin{cases} x_1 - 3s_1 &= -4 \\ x_2 + 2s_1 & = 3 \\ x_3 & =s_1 \end{cases} $$

we get a solution $\bm{-4 \\3 \\0}$ (by setting the free variable $s_1$ to 0).

So we find a linear combination of $\va_1,\va_2,\va_3$ that sum to $\vb$: $$ \vb = \bm{2\\-1\\-7} = (-4)\va_1 + 3\va_2 + 0\va_3 = (-4)\bm{1 \\ -2\\1} + (3)\bm{2\\-3\\-1} +(0)\bm{1\\0\\-5}. $$

Checkpoint 4. Find another way of expressing $\vb$ as a linear combination of $\va_1, \va_2, \va_3$.

Describe all linear combinations of these vectors expressing $\vb$.
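
One way to check an answer to this checkpoint (a sketch, assuming `sympy` is available): ask for the whole solution set, with $x_3$ as the free parameter.

```python
# Example 5: every choice of x3 gives one expression of b.
from sympy import Matrix, linsolve, symbols

A = Matrix([[1, 2, 1], [-2, -3, 0], [1, -1, -5]])
b = Matrix([2, -1, -7])

x1, x2, x3 = symbols('x1 x2 x3')
print(linsolve((A, b), [x1, x2, x3]))  # {(3*x3 - 4, 3 - 2*x3, x3)}
```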

Nullspace, Row space and row equivalence

Theorem 3. Two matrices are row equivalent if and only if they have the same row space.

Proof. The rows of $EA$ are clearly linear combinations of the rows of $A$. That means the row space of $EA$ is a subspace of the row space of $A$. If $E$ is invertible, the reverse inclusion is true as well. Therefore, $EA$ and $A$ have the same row space when $E$ is an elementary matrix. Consequently, any two row equivalent matrices have the same row space.

Conversely, suppose $A$ and $A'$ have the same row space. By the implication that we have just proved, the rrefs $R_A$ and $R_{A'}$ have the same row spaces as $A$ and $A'$, respectively, and hence the same row space as each other. Since they are matrices in rref, having the same row space means they are equal. Thus, $A$ and $A'$ are row equivalent because both are row equivalent to the same matrix.
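
A small illustration of Theorem 3 (a sketch, assuming `sympy` is available): performing row operations changes the matrix but not its row space, as witnessed by the shared rref.

```python
# Row operations preserve the row space (and hence the rref).
from sympy import Matrix

A = Matrix([[1, 2, 1], [-2, -3, 0], [1, -1, -5]])

B = A.copy()
B.row_swap(0, 2)               # swap the first and last rows
B[1, :] = B[1, :] + 2*B[0, :]  # add twice row 1 to row 2

print(A.rref()[0] == B.rref()[0])  # True: same row space
```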

You may skip the proof of the following result on a first reading. In MAT 331, we will learn that the row space and the null space of a matrix are orthogonal complements of each other. In particular, either one of them determines the other; thus,

Theorem 4. Two matrices are row equivalent if and only if they have the same nullspace.

Proof. Suppose $A$ and $A'$ have the same nullspace, say $H$. We claim that the row space of $A$ (and $A'$) is the space

$$W = \{\vv \colon \vv\vh = 0 \ \forall \vh \in H\} = \bigcap_{\vh \in H} \{\vv \colon \vv \vh = 0\}.$$

Since $H$ is the nullspace of $A$, $W$ contains the rows of $A$, and hence the row space of $A$, because $W$ is a subspace itself. Indeed, for any linear combination $\vv = c_1(A)^1 + \cdots + c_m(A)^m$ of rows of $A$ and any $\vh \in H$, we have $(A)^i\vh = 0$ for each $1 \le i \le m$. Therefore,

$$ \vv \vh = (c_1(A)^1 + \cdots + c_m(A)^m)\vh = \sum_{i=1}^m c_i (A)^i \vh = \sum_{i=1}^m c_i \cdot 0 = 0. $$

Hence $\vv \in W$. The reverse inclusion (that $W$ is contained in the row space of $A$) follows from a dimension count, $\dim W = n - \dim H = \rk A$, which we take for granted here; cf. the remark before the theorem.

Conversely, suppose $A$ and $A'$ have the same row space, say $W$. Then, by a similar computation,

$$ H = \bigcap_{\vw \in W} \{\vh \colon \vw\vh = 0\} $$

is their common nullspace.
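
Finally, a small illustration of the theorem (a sketch, assuming `sympy` is available): row-equivalent matrices have the same nullspace.

```python
# Row-equivalent matrices share their nullspace.
from sympy import Matrix

A = Matrix([[1, 2, 1], [-2, -3, 0], [1, -1, -5]])

B = A.copy()
B.row_swap(0, 1)               # a row swap ...
B[2, :] = B[2, :] - 3*B[1, :]  # ... and a row replacement

print(A.nullspace() == B.nullspace())  # True: same nullspace basis
```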