We discuss a decomposition using eigenvalues and eigenvectors.
An eigenvalue decomposition of a matrix $A \in \R ^{n \times n}$ is an ordered pair $(X, \Lambda )$ in which $X$ is invertible, $\Lambda $ is diagonal, and $A = X\Lambda X^{-1}$.
In this case, $AX = X\Lambda $, in other
words,
\[
\bmat{ && \\ & A & \\ && }
\bmat{x_1 & \cdots & x_n} =
\bmat{x_1 & \cdots & x_n}
\bmat{\lambda _1 && \\ & \ddots & \\&& \lambda _n}.
\]
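As a concrete illustration (this example is not from the text), consider
\[
A = \bmat{2 & 1 \\ 1 & 2}, \qquad
X = \bmat{1 & 1 \\ 1 & -1}, \qquad
\Lambda = \bmat{3 & 0 \\ 0 & 1}.
\]
Here $A\bmat{1 \\ 1} = 3\bmat{1 \\ 1}$ and $A\bmat{1 \\ -1} = 1 \cdot \bmat{1 \\ -1}$, so $AX = X\Lambda $, and since $X$ is invertible, $(X, \Lambda )$ is an eigenvalue decomposition of $A$.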
If $X$ is orthogonal, so that $X^{-1} = X^\top $, then we can interpret such a decomposition as a change of basis to eigenvector coordinates. If $Ax = b$ and $A = X\Lambda X^{-1}$, then $X^{-1}b = \Lambda (X^{-1}x)$. Here, $X^{-1}x$ expands $x$ in the basis of the columns of $X$. So to compute $Ax$, we first expand $x$ in the basis of the columns of $X$, scale by $\Lambda $, and then interpret the result as the coefficients of a linear combination of the columns of $X$.
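The change-of-basis interpretation can be checked numerically. The sketch below (the matrix and vectors are illustrative choices, not from the text) uses NumPy's `eigh`, which for a symmetric matrix returns an orthogonal eigenvector matrix, and verifies that $X^{-1}b = \Lambda (X^{-1}x)$, i.e. that applying $A$ is just scaling in eigenvector coordinates.

```python
import numpy as np

# Illustrative symmetric matrix; eigh then gives orthonormal eigenvectors.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, X = np.linalg.eigh(A)  # lam holds the lambda_i; columns of X are x_i

x = np.array([1.0, 2.0])
b = A @ x

# Since X is orthogonal, X^{-1} = X^T. Compare X^T b with Lambda (X^T x):
lhs = X.T @ b           # b expanded in eigenvector coordinates
rhs = lam * (X.T @ x)   # x expanded, then scaled by the eigenvalues
print(np.allclose(lhs, rhs))  # True: A acts as a diagonal scaling
```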
In the case that $A = X\Lambda X^\top $ for
an eigenvalue decomposition $(X, \Lambda )$ of
$A$ with $X$ orthogonal, we can also write
\[
A = X\Lambda X^\top = \sum_{i = 1}^{n}
\lambda _ix_ix_i^\top .
\]
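This rank-one expansion can also be verified numerically. The sketch below (reusing an illustrative symmetric matrix, not one from the text) rebuilds $A$ as the sum $\sum_i \lambda _i x_i x_i^\top $ of outer products of its eigenvectors.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, X = np.linalg.eigh(A)  # symmetric A, so A = X diag(lam) X^T

# Sum of rank-one pieces lambda_i * x_i x_i^T over the eigenpairs.
A_rebuilt = sum(lam[i] * np.outer(X[:, i], X[:, i]) for i in range(len(lam)))
print(np.allclose(A, A_rebuilt))  # True: the outer products reassemble A
```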