We explore matrix-vector multiplication.
Given a matrix $A \in \R ^{m \times n}$ and
a vector $x \in \R ^{n}$, the
product of $A$
with $x$ is the vector
$y \in \R ^{m}$ defined by
\[
y_i = \sum_{j = 1}^{n} A_{ij} x_j, \quad i = 1, \dots , m.
\]
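As an illustrative sketch (not part of the text), the entrywise definition above can be computed directly with two nested loops; the helper name \texttt{matvec} is our own, and NumPy is used only for array storage and comparison:

```python
import numpy as np

def matvec(A, x):
    """Matrix-vector product via the entrywise definition
    y_i = sum_{j=1}^{n} A_{ij} x_j."""
    m, n = A.shape
    y = np.zeros(m)
    for i in range(m):
        for j in range(n):
            y[i] += A[i, j] * x[j]
    return y

A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
x = np.array([1.0, 0.0, -1.0])
print(matvec(A, x))  # agrees with A @ x
```

In practice one would call the built-in product `A @ x`, but the loop makes the definition explicit.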
We denote the product of $A$ with $x$ by $Ax$, which lets us write the system of linear equations $(A, b)$ concisely as $b = Ax$.
This notation suggests both algebraic and geometric interpretations of solving systems of linear equations. The algebraic interpretation is that we ask whether the function $x \mapsto Ax$ is invertible; in other words, whether $A$ has an inverse. The geometric interpretation is that $A$ transforms the vector $x$.
Conversely, we can view $x$ as transforming
(acting on) $A$.
Let $a^j \in \R ^m$ denote the $j$th column of
$A$. Then
\[
Ax = \sum_{j = 1}^{n} x_j a^j.
\]
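The column interpretation can also be sketched numerically: summing the columns $a^j$ weighted by the entries $x_j$ reproduces the product (the specific matrix and vector below are arbitrary examples):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# Ax as a linear combination of the columns a^j of A,
# weighted by the entries x_j of x.
y = sum(x[j] * A[:, j] for j in range(A.shape[1]))
print(y)  # equals A @ x
```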
We call the function $f: \R ^n \to \R ^m$
defined by $f(x) = Ax$ the
matrix multiplication
function (or matrix-vector
product function) associated with $A$.
$f$ satisfies the following two important
properties: for all $x, y \in \R ^n$ and all $\alpha \in \R$,
\[
f(x + y) = f(x) + f(y), \qquad f(\alpha x) = \alpha f(x).
\]
We call such a function $f$
linear.
In other words, the matrix multiplication
function is linear.
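A quick numerical check of the two linearity properties (the random test data and the name \texttt{f} are our own choices; equality holds up to floating-point rounding):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
x, y = rng.standard_normal(4), rng.standard_normal(4)
alpha = 2.5

f = lambda v: A @ v  # the matrix multiplication function of A

# Additivity: f(x + y) = f(x) + f(y)
print(np.allclose(f(x + y), f(x) + f(y)))
# Homogeneity: f(alpha x) = alpha f(x)
print(np.allclose(f(alpha * x), alpha * f(x)))
```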
Conversely, if $g: \R ^n \to \R ^m$ is linear,
there exists a matrix $A \in \R ^{m \times n}$ such that $g(x) = Ax$ for all $x \in \R ^n$.
Moreover, this matrix representation of $g$ is unique.
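The standard construction behind this converse can be sketched as follows: applying $g$ to the standard basis vectors $e_1, \dots, e_n$ recovers the columns of the matrix, since the $j$th column is $g(e_j)$. The helper \texttt{matrix\_of} and the example map \texttt{g} below are illustrative names of our own:

```python
import numpy as np

def matrix_of(g, n, m):
    """Recover the unique matrix A with g(x) = A x:
    the j-th column of A is g(e_j), where e_j is the
    j-th standard basis vector of R^n."""
    A = np.zeros((m, n))
    for j in range(n):
        e = np.zeros(n)
        e[j] = 1.0
        A[:, j] = g(e)
    return A

# Example: a linear map given without an explicit matrix.
g = lambda x: np.array([x[0] + 2 * x[1], 3 * x[0] - x[2]])
A = matrix_of(g, n=3, m=2)
print(A)  # [[1. 2. 0.], [3. 0. -1.]]
```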