Needs:
Real Matrix-Matrix Products
Inverse Elements
Needed by:
Affine MMSE Estimators
Least Squares Linear Regressors
Matrix Determinant of Inverses
Matrix Inverses
Matrix Product Inverses
Matrix Similarity
Monotonic Functions of Real Matrices
Multivariate Normals
Permutation Matrices
Probabilistic Errors Linear Model
Real General Linear Groups

Real Matrix Inverses

Why

Let $A \in \R^{m \times n}$ and define $f: \R^n \to \R^m$ by $f(x) = Ax$. Then $f$ is a linear function from $\R^n$ to $\R^m$. Conversely, suppose $g: \R^n \to \R^m$ is a linear function. Then there exists a matrix $B \in \R^{m \times n}$ so that $g(z) = Bz$. When does such a function have an inverse?
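
As a small illustration (the particular numbers are chosen only for this example), take $m = n = 2$ and

$$A = \begin{bmatrix} 1 & 2 \\ 0 & 1 \end{bmatrix}, \qquad f(x) = Ax = \begin{bmatrix} x_1 + 2x_2 \\ x_2 \end{bmatrix}.$$

This $f$ is undone by $g(y) = \begin{bmatrix} 1 & -2 \\ 0 & 1 \end{bmatrix} y$: one checks $g(f(x)) = x$ and $f(g(y)) = y$ for all $x, y \in \R^2$, so this particular $f$ does have an inverse.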

Derivation

If $A \in \R^{m \times n}$ with $m \neq n$, then $f$ cannot have an inverse: an invertible $f$ would be a linear bijection between $\R^n$ and $\R^m$, which forces $m = n$. For a square matrix $A \in \R^{n \times n}$, a matrix $B \in \R^{n \times n}$ is a left inverse of $A$ if $BA = I$. In other words, $B$ is a left inverse element of $A$ in the algebra of matrices under multiplication. Likewise, $C \in \R^{n \times n}$ is a right inverse of $A$ if $AC = I$.
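
As a concrete instance of the first claim (the matrix is chosen only for illustration), take $A = \begin{bmatrix} 1 & 0 \end{bmatrix} \in \R^{1 \times 2}$, so $f(x) = Ax = x_1$. Both $(0, 0)$ and $(0, 1)$ map to $0$, so $f$ is not injective and cannot have an inverse.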

Definition

We call a square matrix $A \in \R^{n \times n}$ invertible if there is a matrix $B \in \R^{n \times n}$ so that $BA = I$.

Now suppose that $A \in \R^{n \times n}$. Of course, an inverse may not exist; consider, for example, the case in which $A$ is the $n \times n$ matrix of zeros, so that $BA = 0 \neq I$ for every $B$. If there exists a matrix $B$ so that $BA = I$, we call $B$ a left inverse of $A$, and likewise if $AC = I$ we call $C$ a right inverse of $A$. When $A$ is square, a left inverse and a right inverse must coincide.
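
For example (the entries are again chosen only for illustration), with

$$A = \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & -1 \\ 0 & 1 \end{bmatrix},$$

a direct computation gives $BA = I$ and $AB = I$, so $B$ is simultaneously a left and a right inverse of $A$. In general: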

Suppose $A, B, C \in \R ^{n \times n}$. If $BA = I$ and $AC = I$, then $B = C$.
Indeed, $B = BI = B(AC) = (BA)C = IC = C$, where the second equality uses $AC = I$, the third uses associativity of matrix multiplication, and the fourth uses $BA = I$.
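
The same argument can be checked mechanically. The following is a minimal Lean 4 sketch, assuming Mathlib is available; it states the claim for real $n \times n$ matrices and mirrors the calculation above.

import Mathlib

-- A left inverse and a right inverse of the same square matrix agree.
example {n : ℕ} (A B C : Matrix (Fin n) (Fin n) ℝ)
    (hB : B * A = 1) (hC : A * C = 1) : B = C :=
  calc B = B * 1       := (mul_one B).symm
       _ = B * (A * C) := by rw [hC]
       _ = B * A * C   := (mul_assoc B A C).symm
       _ = 1 * C       := by rw [hB]
       _ = C           := one_mul C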