A vector space is a set whose elements can be added together and multiplied by scalars. $\text{Mat}(n,m)$ is an example of an abstract vector space. So is the space of polynomials.

Definition 2.6.1 (Vector space). A vector space $V$ is a set of vectors such that any two vectors can be added to form another vector in $V$, and any vector can be multiplied by a scalar to form another vector in $V$. These operations must satisfy the following eight axioms.

  1. Additive identity. There exists a vector $0 \in V$ such that for any $v\in V$, we have $0+v = v$.
  2. Additive inverse. For any $v\in V$, there exists a vector $-v \in V$ such that $v + (-v) = 0$.
  3. Commutative law for addition. For all $v,w\in V$, we have $v+w = w+v$.
  4. Associative law for addition. For all $v_1, v_2, v_3 \in V$, $v_1 + (v_2+ v_3) = (v_1 + v_2) + v_3$.
  5. Multiplicative identity. For all $v\in V$, we have $1v = v$.
  6. Associative law for multiplication. For all scalars $a, b$ and all $v \in V$, we have $a(bv) = (ab)v$.
  7. Distributive law for scalar addition. For all scalars $a, b$ and all $v \in V$, we have $(a+b)v = av + bv$.
  8. Distributive law for vector addition. For all scalars $a$ and all $v, w \in V$, we have $a(v+w) = av + aw$.
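The eight axioms can be spot-checked numerically for $\text{Mat}(2,3)$ with entrywise addition and scalar multiplication. A minimal sketch (a check on random samples, not a proof; the matrices and scalars here are arbitrary choices):

```python
import numpy as np

# Sample elements of Mat(2, 3) and two scalars.
rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((2, 3))
C = rng.standard_normal((2, 3))
a, b = 2.0, -3.0

zero = np.zeros((2, 3))
assert np.allclose(zero + A, A)                 # 1. additive identity
assert np.allclose(A + (-A), zero)              # 2. additive inverse
assert np.allclose(A + B, B + A)                # 3. commutative law for addition
assert np.allclose(A + (B + C), (A + B) + C)    # 4. associative law for addition
assert np.allclose(1.0 * A, A)                  # 5. multiplicative identity
assert np.allclose(a * (b * A), (a * b) * A)    # 6. associative law for multiplication
assert np.allclose((a + b) * A, a * A + b * A)  # 7. distributive law for scalar addition
assert np.allclose(a * (A + B), a * A + a * B)  # 8. distributive law for vector addition
print("all eight axioms hold on this sample")
```

Passing these checks for one sample does not establish the axioms in general; here they hold for all matrices because each axiom reduces to the corresponding property of real numbers, applied entrywise.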

Definition 2.6.12 (Concrete-to-abstract function). Let $\{v\} = v_1,\dots, v_n$ be a finite, ordered collection of $n$ vectors in a vector space $V$. The concrete-to-abstract function $\Phi_{\{v\}}$ is the linear transformation $\Phi_{\{v\}}: \R^n \to V$ that translates coordinates in $\R^n$ into vectors in $V$:

$$ \Phi_{\{v\}} (\vec a) = \Phi_{\{v\}}\begin{pmatrix}\begin{bmatrix}a_1\\\vdots \\ a_n\end{bmatrix}\end{pmatrix} = a_1v_1 + \dots + a_nv_n. $$

$\vec v_1, \dots, \vec v_n$ are linearly independent if and only if $\Phi_{\{v\}}$ is injective.

$\vec v_1, \dots, \vec v_n$ span $V$ if and only if $\Phi_{\{v\}}$ is surjective.

$\vec v_1, \dots, \vec v_n$ form a basis of $V$ if and only if $\Phi_{\{v\}}$ is bijective.
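When $V = \R^m$, the function $\Phi_{\{v\}}$ is just multiplication by the matrix whose $i$-th column is $v_i$, so injectivity and surjectivity can be read off from the rank. A sketch with two hypothetical vectors in $\R^3$:

```python
import numpy as np

# Assumed example: V = R^3, with two specific vectors chosen for illustration.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
M = np.column_stack([v1, v2])  # Phi_{v}(a) = M @ a = a1*v1 + a2*v2

# Phi applied to a coordinate vector is the corresponding linear combination.
a = np.array([2.0, -1.0])
assert np.allclose(M @ a, 2.0 * v1 - 1.0 * v2)

rank = np.linalg.matrix_rank(M)
n, m = M.shape[1], M.shape[0]
print("injective (columns linearly independent):", rank == n)  # rank 2 == n, so True
print("surjective (columns span R^3):", rank == m)             # rank 2 < 3, so False
```

Here $\Phi_{\{v\}}$ is injective (the two columns are linearly independent) but not surjective (two vectors cannot span $\R^3$), so $v_1, v_2$ are not a basis of $\R^3$.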

Matrix with respect to a basis and change of basis