%display latex
latex.matrix_delimiters(left='[', right=']')
latex.matrix_column_alignment('c')
$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\sp}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\vh}{\mathbf{h}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$
Here comes the most important concept in linear algebra.
A set of vectors $X$ is linearly dependent if some $\vv \in X$ is in the span of $X \setminus\{\vv\}$.
A set of vectors is linearly independent if it is not linearly dependent.
Here is an analogy: the set of colors $C =\{ \color{red}{red}, \color{blue}{blue}, \color{purple}{purple} \}$ is "dependent".
This is because $\color{purple}{purple} = \color{red}{red} + \color{blue}{blue}$, which verifies that $\color{purple}{purple}$ is in the span of $C \setminus \{\color{purple}{purple}\} = \{\color{red}{red}, \color{blue}{blue}\}$.
If colors can be "subtracted", then the equation $\color{blue}{blue} = \color{purple}{purple} - \color{red}{red}$ also shows that $\color{blue}{blue}$ is in the span of $\{\color{red}{red},\color{purple}{purple}\}$, again verifying that $C$ is a "dependent" set.
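The analogy can be made literal by treating colors as RGB triples and the set operations as componentwise vector arithmetic. This is a minimal sketch; the particular triples chosen for $\color{red}{red}$, $\color{blue}{blue}$, and $\color{purple}{purple}$ are an illustrative assumption, not part of the text.

```python
# Colors as vectors: an illustrative choice of RGB triples (an assumption).
def add(u, v):
    """Componentwise sum of two vectors given as tuples."""
    return tuple(a + b for a, b in zip(u, v))

def sub(u, v):
    """Componentwise difference of two vectors given as tuples."""
    return tuple(a - b for a, b in zip(u, v))

red    = (1, 0, 0)
blue   = (0, 0, 1)
purple = (1, 0, 1)

# purple = red + blue, so purple is in the span of {red, blue} ...
print(add(red, blue) == purple)   # True
# ... and blue = purple - red, so blue is in the span of {red, purple}.
print(sub(purple, red) == blue)   # True
```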
Checkpoint. Can you give another equation showing that $C$ is "dependent"?
Checkpoint. Is $\emptyset$ linearly dependent or not?
Checkpoint. Is $\{\vz\}$ linearly dependent or not?
Proposition 1. Any set of vectors that contains a linearly dependent set is also linearly dependent.
Equivalently, every subset of a linearly independent set is linearly independent.
Consequently, every set that contains $\vz$ is linearly dependent. Also, a singleton $\{\vv\}$ is linearly dependent if and only if $\vv = \vz$.
Proposition 2. Every linearly dependent set contains a finite linearly dependent set.
In other words, if every finite subset of a set of vectors $A$ is linearly independent then so is $A$.
Proof. Suppose $X$ is linearly dependent. Then some $\vv \in X$ can be written as a linear combination $\sum_{i=1}^k c_i \vx_i$ of vectors $\vx_1, \ldots, \vx_k \in X \setminus\{\vv\}$. Hence, $\{\vv, \vx_1, \ldots, \vx_k\}$ is a finite linearly dependent subset of $X$.
Proposition 3. Let $A =\{\va_1,\ldots, \va_m\}$ ($m \ge 2$) be a finite set of vectors from $K^n$, and let $A$ also denote the matrix $\gm{\va_1}{\va_m}$. The following statements are equivalent.

1. $A$ is linearly independent.
2. Every vector in $\sp{A}$ can be expressed as a linear combination of $\va_1, \ldots, \va_m$ in exactly one way.
3. The only way of expressing $\vz$ as a linear combination of $\va_1, \ldots, \va_m$ is the trivial one, with all coefficients $0$.
4. $\vx = \vz$ is the only solution of the homogeneous system $A\vx = \vz$.
5. The rref of $A$ has no free columns.
Proof. (1) $\implies$ (2). We prove the contrapositive. Suppose two distinct linear combinations of vectors in $A$ express the same vector:
$$ c_1 \va_1 + \cdots + c_m \va_m = c'_1 \va_1 + \cdots + c'_m \va_m. $$
Since the two combinations are distinct, WLOG we may assume $c_1 \neq c'_1$.
So, $$ (c_1 -c'_1)\va_1 = (c'_2-c_2)\va_2 + \cdots + (c'_m-c_m)\va_m $$
and,
$$\va_1 = \frac{c_2'-c_2}{c_1 - c_1'}\va_2 + \cdots + \frac{c_m'-c_m}{c_1-c_1'}\va_m.$$
This shows that $\va_1 \in \sp{A \setminus \{\va_1\}}$, i.e. $A$ is linearly dependent.

(2) $\implies$ (3). By (2), there is only one way to express $\vz \in \sp{A}$ as a linear combination of vectors in $A$, namely the trivial one: $$ \vz = 0\va_1 + \cdots + 0\va_m. $$
(3) $\implies$ (4). Since $A\vx = x_1\va_1 + x_2\va_2 + \cdots + x_m\va_m$, (4) is simply a restatement of (3).
(4) $\iff$ (5). If the rref of $A$ has a free column, then $A\vx = \vz$ must have a non-trivial solution (obtained by setting a free variable to $1$).
Conversely, if the rref of $A$ has no free columns, then it must be $\bm{I_m \\ 0}$, where $0$ is the $(n-m) \times m$ zero matrix, and hence $\vz \in K^m$ is the only solution of $A\vx = \vz \in K^n$.
(4) $\implies$ (1). We prove the contrapositive. If $A$ is linearly dependent, then some vector in $A$ is a linear combination of the rest. WLOG, we may assume $\va_1 = \sum_{j=2}^m c_j \va_j$ for some $c_j \in K$.
Therefore, $$ \vz = \va_1 - c_2\va_2 - \cdots - c_m\va_m = \gm{\va_1}{\va_m}\bm{1 \\ -c_2 \\ \vdots \\ -c_m} $$ and so $\vx_0 = \bm{1 \\ -c_2 \\ \vdots \\ -c_m}$ is a nontrivial solution to the homogeneous system $A\vx = \vz$.
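Statement (5) of Proposition 3 gives a concrete computational test for independence. Here is a minimal sketch in plain Python (the helper names `rref` and `is_independent` are ours, and `fractions.Fraction` stands in for exact arithmetic over $\Qq$):

```python
from fractions import Fraction

def rref(rows):
    """Row-reduce a matrix (a list of rows) over the rationals.
    Returns the reduced rows and the list of pivot column indices."""
    rows = [[Fraction(x) for x in row] for row in rows]
    pivots = []
    r = 0
    for c in range(len(rows[0]) if rows else 0):
        # find a row at or below r with a nonzero entry in column c
        piv = next((i for i in range(r, len(rows)) if rows[i][c] != 0), None)
        if piv is None:
            continue  # column c is a free column
        rows[r], rows[piv] = rows[piv], rows[r]
        rows[r] = [x / rows[r][c] for x in rows[r]]      # scale pivot to 1
        for i in range(len(rows)):
            if i != r and rows[i][c] != 0:               # clear column c
                rows[i] = [a - rows[i][c] * b for a, b in zip(rows[i], rows[r])]
        pivots.append(c)
        r += 1
    return rows, pivots

def is_independent(vectors):
    """Proposition 3(5): a_1, ..., a_m are linearly independent iff the
    rref of the matrix A = [a_1 ... a_m] has no free columns."""
    A = [list(row) for row in zip(*vectors)]  # the vectors become columns
    _, pivots = rref(A)
    return len(pivots) == len(vectors)

# {(1,0,0), (0,0,1)} is independent; adjoining their sum makes it dependent.
print(is_independent([(1, 0, 0), (0, 0, 1)]))             # True
print(is_independent([(1, 0, 0), (0, 0, 1), (1, 0, 1)]))  # False
```

In Sage itself one would instead build `matrix(QQ, A)` and inspect its `rref()` or `pivots()` directly; the sketch above only illustrates the criterion.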
Corollary 4. A linearly independent subset of $K^n$ has size at most $n$.
Proof. If $A \subseteq K^n$ has more than $n$ vectors, then its corresponding matrix has more columns than rows, so its rref must have a free column. That means $A$ is linearly dependent by Proposition 3.
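To see Corollary 4 in action, take any three vectors in $K^2$ and exhibit a nontrivial dependence. A minimal sketch (the vectors and the relation $2\va_1 + 3\va_2 - \va_3 = \vz$ are an illustrative choice of ours):

```python
# Three vectors in Q^2 must be dependent (Corollary 4).
def lincomb(coeffs, vectors):
    """Return the linear combination sum_i c_i * v_i as a tuple."""
    n = len(vectors[0])
    return tuple(sum(c * v[j] for c, v in zip(coeffs, vectors)) for j in range(n))

v1, v2, v3 = (1, 0), (0, 1), (2, 3)
# The nontrivial relation 2*v1 + 3*v2 - v3 = 0 exhibits the dependence.
print(lincomb([2, 3, -1], [v1, v2, v3]))  # (0, 0)
```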