Needs:
Real Matrices
Real Vectors
Norms
Matrix Determinants
Monic Polynomials
Needed by:
Eigenvalue Decomposition
Symmetric Real Matrix Eigenvalues

Eigenvalues and Eigenvectors

Why

We discuss vectors for which the action of a matrix is scalar multiplication.1

Definition

Let $A \in \R ^{n \times n}$ be a square matrix. A nonzero vector $x \in \R ^{n}$ is an eigenvector of $A$, and $\lambda \in \R $ is its corresponding eigenvalue, if $A x = \lambda x$. In other words, $x \neq 0$ is an eigenvector if the action of $A$ on $x$ is to mimic scalar multiplication.
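The defining identity $Ax = \lambda x$ is easy to check numerically. A minimal sketch with NumPy; the matrix and eigenpair below are illustrative choices, not taken from the text:

```python
import numpy as np

# An illustrative diagonal matrix: each standard basis vector is an eigenvector.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
x = np.array([1.0, 0.0])  # nonzero, so it qualifies as an eigenvector
lam = 2.0                  # its corresponding eigenvalue

# The action of A on x mimics scalar multiplication by lam.
print(np.allclose(A @ x, lam * x))
```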

Speaking of eigenvalues is sensible only when the matrix involved is square, that is, when the domain and codomain of the corresponding linear map are the same. We often care about eigenvalue computations when a matrix is compounded iteratively: if $A x = \lambda x$, then $A^k x = \lambda ^k x$.
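Under iteration, the matrix compounds while the eigenvalue compounds as a scalar power: applying $A x = \lambda x$ repeatedly gives $A^k x = \lambda ^k x$. A small NumPy check, with an illustrative matrix chosen so that $(1, 1)$ is an eigenvector with eigenvalue $3$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
x = np.array([1.0, 1.0])  # A x = (3, 3) = 3 x, so the eigenvalue is 3
lam = 3.0
k = 5

# Compounding the matrix k times matches compounding the scalar k times.
print(np.allclose(np.linalg.matrix_power(A, k) @ x, lam**k * x))
```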

Eigenspaces

If $x$ is an eigenvector with eigenvalue $\lambda $, then for any nonzero $\alpha \in \R $, $\alpha x$ is also an eigenvector with eigenvalue $\lambda $, since $A (\alpha x) = \alpha (Ax) = \alpha (\lambda x) = \lambda (\alpha x)$. In other words, if $A$ has an eigenvector then the action of $A$ on some subspace $S \subset \R ^n$ is to mimic scalar multiplication. In this case, we call the subspace $S$ an eigenspace, and any nonzero $x \in S$ an eigenvector.

An eigenspace is an invariant subspace of $A$. In other words, if $E$ is an eigenspace corresponding to eigenvalue $\lambda $ then $AE \subset E$.

The dimension of $E$ is the maximum number of linearly independent eigenvectors which have the same eigenvalue $\lambda $. We call this number the geometric multiplicity of $\lambda $.
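Since the eigenspace for $\lambda $ is the nullspace of $A - \lambda I$, the geometric multiplicity can be computed by rank-nullity as $n - \operatorname{rank}(A - \lambda I)$. A NumPy sketch with an illustrative matrix whose eigenvalue $2$ has a two-dimensional eigenspace:

```python
import numpy as np

# lambda = 2 appears in two independent directions; lambda = 5 in one.
A = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 5.0]])
lam = 2.0
n = A.shape[0]

# dim null(A - lam I) = n - rank(A - lam I), by rank-nullity.
geom_mult = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(geom_mult)  # 2
```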

Characteristic polynomial

If $x$ is an eigenvector for $A$ associated with $\lambda $ then $Ax = \lambda x$, so $Ax - \lambda x = 0$ and $(A - \lambda I)x = 0$. In other words, $x$ is an element of the nullspace of $A - \lambda I$, or equivalently, of the nullspace of $\lambda I - A$.

The characteristic polynomial of $A \in \R ^{n \times n}$ is the polynomial $p: \R \to \R $ defined by

\[ p(z) = \det (zI - A). \]

$p$ is monic: the coefficient of the degree $n$ term is 1.
$\lambda $ is an eigenvalue of $A$ if and only if $p(\lambda ) = 0$.
This holds since $\lambda $ is an eigenvalue if and only if there is a nonzero vector $x$ such that $\lambda x - Ax = 0$, which holds if and only if $\lambda I - A$ is singular, which holds if and only if $\det(\lambda I - A) = 0$.
A simple consequence of the above proposition is that a (real) matrix may have complex eigenvalues, since a polynomial with real coefficients may have complex roots.
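The standard illustration is a rotation by $90$ degrees: no real vector keeps its direction, so there are no real eigenvectors, and the characteristic polynomial $z^2 + 1$ has only complex roots. A NumPy sketch (`np.poly` of a square matrix returns the monic characteristic polynomial's coefficients):

```python
import numpy as np

# Rotation by 90 degrees in the plane: no real eigenvectors.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

coeffs = np.poly(A)  # approximately [1, 0, 1], i.e. z^2 + 1; note it is monic
eigs = np.linalg.eigvals(A)  # the roots of z^2 + 1, namely +i and -i
print(coeffs)
print(eigs)
```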


  1. Future editions will expand.
Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc