%display latex
latex.matrix_delimiters(left='[', right=']')
latex.matrix_column_alignment('c')
$\newcommand{\la}{\langle}$ $\newcommand{\ra}{\rangle}$ $\newcommand{\vu}{\mathbf{u}}$ $\newcommand{\vv}{\mathbf{v}}$ $\newcommand{\vw}{\mathbf{w}}$ $\newcommand{\vzz}{\mathbf{z}}$ $\newcommand{\nc}{\newcommand}$ $\nc{\Cc}{\mathbb{C}}$ $\nc{\Rr}{\mathbb{R}}$ $\nc{\Qq}{\mathbb{Q}}$ $\nc{\Nn}{\mathbb{N}}$ $\nc{\cB}{\mathcal{B}}$ $\nc{\cE}{\mathcal{E}}$ $\nc{\cC}{\mathcal{C}}$ $\nc{\cD}{\mathcal{D}}$ $\nc{\mi}{\mathbf{i}}$ $\nc{\span}[1]{\langle #1 \rangle}$ $\nc{\ol}[1]{\overline{#1} }$ $\nc{\norm}[1]{\left\| #1 \right\|}$ $\nc{\abs}[1]{\left| #1 \right|}$ $\nc{\vz}{\mathbf{0}}$ $\nc{\vo}{\mathbf{1}}$ $\nc{\DMO}{\DeclareMathOperator}$ $\DMO{\tr}{tr}$ $\DMO{\nullsp}{nullsp}$ $\nc{\va}{\mathbf{a}}$ $\nc{\vb}{\mathbf{b}}$ $\nc{\vx}{\mathbf{x}}$ $\nc{\ve}{\mathbf{e}}$ $\nc{\vd}{\mathbf{d}}$ $\nc{\ds}{\displaystyle}$ $\nc{\bm}[1]{\begin{bmatrix} #1 \end{bmatrix}}$ $\nc{\gm}[2]{\bm{\mid & \cdots & \mid \\ #1 & \cdots & #2 \\ \mid & \cdots & \mid}}$ $\nc{\MN}{M_{m \times n}(K)}$ $\nc{\NM}{M_{n \times m}(K)}$ $\nc{\NP}{M_{n \times p}(K)}$ $\nc{\MP}{M_{m \times p}(K)}$ $\nc{\PN}{M_{p \times n}(K)}$ $\nc{\NN}{M_n(K)}$ $\nc{\im}{\mathrm{Im\ }}$ $\nc{\ev}{\mathrm{ev}}$ $\nc{\Hom}{\mathrm{Hom}}$ $\nc{\com}[1]{[\phantom{a}]^{#1}}$ $\nc{\rBD}[1]{ [#1]_{\cB}^{\cD}}$ $\DMO{\id}{id}$ $\DMO{\rk}{rk}$ $\DMO{\nullity}{nullity}$ $\DMO{\End}{End}$ $\DMO{\proj}{proj}$ $\nc{\GL}{\mathrm{GL}}$
A vector space over $K$ (or a $K$-vector space) is a set $V$ together with two operations called addition $+:V \times V \to V$ and scalar multiplication $\cdot : K \times V \to V$ satisfying a certain set of rules called the axioms of vector space.
We have checked that $M_{m \times n}(K)$ is a $K$-vector space, i.e. matrix addition and scalar multiplication of matrices satisfy the vector space axioms.
When $K^n$ is identified with the $K$-vector space $M_{n \times 1}(K)$ (resp. $M_{1 \times n}(K)$), we call its elements column (resp. row) vectors.
In MAT 247, we focus only on matrix spaces (which include $K^n$).
A subset $W$ of $V$ is a subspace, written as $W \le V$, if
(1) $\vz \in W$;
(2) for any $\vv, \vv' \in W$, $\vv + \vv' \in W$ (closed under addition); and
(3) for any $c \in K$ and $\vv \in W$, $c\vv \in W$ (closed under scalar multiplication).
There is a more succinct way to check whether a subset of $V$ is a subspace.
Proposition 1. A subset $W$ of $V$ is a subspace if and only if $W$ is non-empty and for any $\vv, \vv' \in W$ and $c \in K$, $$ c \vv + \vv' \in W. $$
Proof. If $W \le V$, then $\vz \in W$, so $W$ is non-empty. Also, for any $c \in K$ and $\vv, \vv' \in W$, $c\vv \in W$ (by (3)) and so $c\vv + \vv' \in W$ (by (2) in the definition of subspace).
Conversely, suppose $W$ is a non-empty subset of $V$ satisfying the property stated in the proposition. Since $W$ is non-empty, there is a vector $\vv \in W$ and so $(-1)\vv + \vv = -\vv + \vv = \vz \in W$.
Now pick any $c \in K$ and $\vv, \vv' \in W$. We have $(1)\vv + \vv' = \vv + \vv' \in W$ and $c\vv + \vz = c\vv \in W$. Therefore, we have established $W \le V$.
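The one-step criterion of Proposition 1 is easy to spot-check numerically. Below is a minimal sketch in plain Python (the membership test, the example subspace $W = \{(x, 2x) : x \in \Rr\} \subseteq \Rr^2$, and the helper `axpy` are all assumptions for illustration, not from the notes); a finite check is of course only evidence, not a proof.

```python
# Spot-check the criterion of Proposition 1 on the (assumed) example
# W = {(x, 2x) : x in R}, a line through the origin in R^2.

def in_W(v):
    """Membership test for W = {(x, 2x)}."""
    return v[1] == 2 * v[0]

def axpy(c, v, w):
    """Return c*v + w, computed componentwise."""
    return tuple(c * vi + wi for vi, wi in zip(v, w))

# W is non-empty: it contains the zero vector.
assert in_W((0, 0))

# Check c*v + v' stays in W for several choices of c, v, v'.
samples = [(1, 2), (-3, -6), (5, 10)]
for c in [0, 1, -2, 7]:
    for v in samples:
        for w in samples:
            assert in_W(axpy(c, v, w))
```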
Example 0. $\{\vz\} \le V$ and $V \le V$.
Example 1.
Solution space of a homogeneous linear system. For any $A \in M_{m\times n}(K)$, $W = \{\vv \in K^n \colon A\vv = \mathbf{0}\}$ is a subspace of $K^n$.
This is called the nullspace of the matrix $A$ (aka kernel of $A$). Denote it by $\nullsp(A)$ (or $\ker(A)$).
Let's check that $\nullsp(A)$ (where $A \in M_{m \times n}(K)$) is a subspace of $K^n$.
Since $A \vz = \vz$, $W$ contains $\vz$; in particular, $W$ is non-empty.
Take any $\lambda \in K$, $\vv, \vv' \in W$. Then $A\vv = \vz$ and $A\vv' = \vz$. So, $A(\lambda\vv+\vv') = A \lambda \vv + A\vv' = \lambda A\vv + A\vv' = \lambda \vz + \vz = \vz$.
Thus, $W$ is a subspace of $K^n$ according to Proposition 1.
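The computation $A(\lambda\vv + \vv') = \vz$ can be replayed exactly in software. Here is a sketch using SymPy (a tool choice of mine; the notebook itself runs Sage, whose `A.right_kernel()` plays the same role) with an assumed sample matrix $A$ of rank 1:

```python
# Sketch: compute nullsp(A) for a sample A and verify the closure
# computation A(lambda*v + v') = lambda*Av + Av' = 0 in exact arithmetic.
from sympy import Matrix, Rational

A = Matrix([[1, 2, 3],
            [2, 4, 6]])          # rank 1, so nullsp(A) has dimension 2

basis = A.nullspace()            # basis of {v in K^3 : Av = 0}
assert len(basis) == 2

v, vp = basis
lam = Rational(5, 7)             # an arbitrary scalar lambda
combo = lam * v + vp             # the combination lambda*v + v'
assert A * combo == Matrix.zeros(2, 1)   # combo is still in nullsp(A)
```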
Checkpoint. More generally, fix $A \in \MN$. Check that
$\{X \in \NP \colon AX = \vz \in \MP\}$ and $\{Y \in M_{k \times m}(K) \colon YA = \vz \in M_{k \times n}(K)\}$ are subspaces of $\NP$ and $M_{k \times m}(K)$, respectively.
Example 2. Let $W$ be the set of trace-zero $2\times 2$ matrices, i.e. $W = \{A \in M_{2}(K) \colon \tr(A) = 0\}$. Check that $W \le M_2(K)$.
Clearly, the trace of the zero matrix in $M_2(K)$ is 0, so $W$ contains the zero of $M_2(K)$.
Pick any $c \in K$ and $A, B \in W$. We have $\tr(A) = a_{11} + a_{22} =0$ and $\tr(B) = b_{11} + b_{22} = 0$.
The diagonal entries of $cA+B$ are $ca_{11}+b_{11}$ and $ca_{22} + b_{22}$. So, $\tr(cA+B) = (ca_{11}+b_{11}) + (ca_{22}+b_{22}) = c(a_{11} + a_{22}) + (b_{11}+b_{22}) = c \cdot 0 + 0 = 0$. Thus, $cA + B \in W$.
Since $c \in K$ and $A, B \in W$ were chosen arbitrarily, Proposition 1 shows that $W \le M_2(K)$.
The same argument shows that the trace-zero $n\times n$ matrices form a subspace of $M_n(K)$ (nothing special about the matrices involved being $2\times 2$).
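The trace computation above can be spot-checked with random trace-zero matrices. A minimal sketch in plain Python (the list-of-lists matrix representation and the random sampling are assumptions for illustration):

```python
# Spot-check: tr(cA + B) = 0 whenever tr(A) = tr(B) = 0, for 2x2 matrices.
import random

def trace(M):
    return M[0][0] + M[1][1]

def random_trace_zero():
    """Random 2x2 integer matrix with a22 = -a11, forcing trace 0."""
    a, b, c = (random.randint(-9, 9) for _ in range(3))
    return [[a, b], [c, -a]]

def axpy(c, A, B):
    """Entrywise cA + B."""
    return [[c * A[i][j] + B[i][j] for j in range(2)] for i in range(2)]

for _ in range(100):
    A, B = random_trace_zero(), random_trace_zero()
    c = random.randint(-5, 5)
    assert trace(A) == 0 and trace(B) == 0
    assert trace(axpy(c, A, B)) == 0
```

Nothing in the check depends on the size $2$: replacing the index range by `range(n)` and summing all diagonal entries gives the same spot-check for $M_n(K)$.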
Checkpoint.
Do the trace-one $2 \times 2$ matrices form a subspace of $M_2(K)$? Why or why not?
Example 3. Let's demonstrate that the $2\times 2$ idempotent matrices, i.e. matrices $A$ such that $A^2 = A$, do not form a subspace of $M_2(K)$.
Let $H = \{A \in M_2(K) \colon A^2 = A\}$ be the set of all $2\times 2$ idempotent matrices. Since $\bm{0 & 0 \\ 0 & 0}^2 = \bm{0 & 0 \\ 0 & 0} = \vz$, the zero matrix is idempotent, and hence $H$ contains the zero vector of $M_2(K)$.
However, $H$ is not closed under scalar multiplication. This can be seen from that fact that $I_2 \in H$ ($I_2^2 = I_2$) but $2I_2 = \bm{2 & 0 \\ 0 & 2} \notin H$ because $(2I_2)^2 = 4I_2^2 = 4I_2 \neq 2I_2$. So $H$ is not a subspace (not closed under scalar multiplication).
$H$ is also not closed under addition. If it were, then for any $A, B \in H$ we would have $A+B \in H$, and so $$A+B = (A+B)^2 = A^2 + AB + BA + B^2 = A + AB + BA + B.$$
That means $AB = -BA$. So to find a counterexample, we can search for two idempotent matrices $A$ and $B$ with $AB \neq -BA$. One such pair is $A = \bm{1 & 0 \\ 0 & 1}$ and $B = \bm{1 & 0 \\ 0 & 0}$. I will leave it to you to verify that $A, B \in H$ but $A+B \notin H$.
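The verification left to the reader can be sketched in plain Python (the list-of-lists matrices and helper functions below are assumptions for illustration):

```python
# Verify the counterexample: A = I_2 and B = [[1,0],[0,0]] are both
# idempotent, yet A + B is not, so H is not closed under addition.

def matmul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def add(X, Y):
    """Entrywise sum."""
    return [[X[i][j] + Y[i][j] for j in range(2)] for i in range(2)]

def idempotent(X):
    return matmul(X, X) == X

A = [[1, 0], [0, 1]]        # the identity I_2
B = [[1, 0], [0, 0]]

assert idempotent(A) and idempotent(B)

S = add(A, B)               # S = [[2, 0], [0, 1]]
assert not idempotent(S)    # S^2 = [[4, 0], [0, 1]] != S

# Consistent with the criterion: here AB + BA = 2B, which is nonzero,
# i.e. AB != -BA for this pair.
assert add(matmul(A, B), matmul(B, A)) != [[0, 0], [0, 0]]
```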