\(\DeclarePairedDelimiterX{\Set}[2]{\{}{\}}{#1 \nonscript\;\delimsize\vert\nonscript\; #2}\) \( \DeclarePairedDelimiter{\set}{\{}{\}}\) \( \DeclarePairedDelimiter{\parens}{\left(}{\right)}\) \(\DeclarePairedDelimiterX{\innerproduct}[1]{\langle}{\rangle}{#1}\) \(\newcommand{\ip}[1]{\innerproduct{#1}}\) \(\newcommand{\bmat}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\barray}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\mat}[1]{\begin{matrix}#1\end{matrix}}\) \(\newcommand{\pmat}[1]{\begin{pmatrix}#1\end{pmatrix}}\) \(\newcommand{\mathword}[1]{\mathop{\textup{#1}}}\)
Needs:
Real Linear Equation Solutions
Needed by:
Ordinary Row Reductions
Row Reducer Matrices

Linear System Row Reductions

Why

We want to solve linear equations. Our approach is to “eliminate” variables from equations in our system. Once we reach an equation in one variable, we will back-substitute to solve.

Two-variable example

Suppose we want to find $x_1, x_2 \in \R $ to satisfy

\[ \begin{aligned} 3x_1 + 2x_2 &= 10, \text{ and} \\ 6x_1 + 5x_2 &= 20. \\ \end{aligned} \]

We can record the coefficients in a two-dimensional array $A = (3, 2; 6, 5)$ and the right-hand sides in $b = (10, 20)$. We can eliminate $x_1$ from the second equation by subtracting twice the first equation from the second. In doing so we obtain the system of equations

\[ \begin{aligned} 3x_1 + 2x_2 &= 10 \text{ and } \\ x_2 &= 0. \end{aligned} \]

The key insight is that this new system has the same solution set as the original. We call the process of moving between these two systems a row reduction.
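To make the bookkeeping concrete, here is a minimal Python sketch of this reduction and back substitution (the code and variable names are ours, not part of the development):

# Two-variable system: 3*x1 + 2*x2 = 10 and 6*x1 + 5*x2 = 20.
A = [[3.0, 2.0], [6.0, 5.0]]
b = [10.0, 20.0]

# Eliminate x1 from the second equation: subtract (6/3) = 2 times row 1.
m = A[1][0] / A[0][0]
A[1] = [A[1][j] - m * A[0][j] for j in range(2)]
b[1] = b[1] - m * b[0]
# Now A[1] == [0.0, 1.0] and b[1] == 0.0, i.e. x2 = 0.

# Back-substitute.
x2 = b[1] / A[1][1]
x1 = (b[0] - A[0][1] * x2) / A[0][0]
print(x1, x2)  # 3.333... (= 10/3) and 0.0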

Four-variable example

What if instead we have four unknowns? Suppose

\[ A = \barray{2 & 1 & 1 & 0 \\ 4 & 3 & 3 & 1 \\ 8 & 7 & 9 & 5 \\ 6 & 7 & 9 & 8} \text{ and } b = \barray{ 1 \\ 2 \\ 3 \\ 4}. \]

We might first eliminate $x_1$ (the variable associated with the first column of coefficients) from the remaining three equations to obtain the linear system $S_1 = (A^1, b^1)$ in which

\[ A^1 = \barray{ 2 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 3 & 5 & 5 \\ 0 & 4 & 6 & 8 \\ } \text{ and } b^1 = \barray{ 1 \\ 0 \\ -1 \\ 1 \\ }. \]
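This step can be carried out mechanically. Here is a short Python sketch in the same spirit as the one above, with the multipliers $4/2 = 2$, $8/2 = 4$, and $6/2 = 3$ computed from the first column:

A = [[2.0, 1.0, 1.0, 0.0],
     [4.0, 3.0, 3.0, 1.0],
     [8.0, 7.0, 9.0, 5.0],
     [6.0, 7.0, 9.0, 8.0]]
b = [1.0, 2.0, 3.0, 4.0]

# Eliminate x1 from equations 2-4 using the pivot A[0][0] = 2.
for s in range(1, 4):
    m = A[s][0] / A[0][0]  # multipliers 2, 4, 3
    A[s] = [A[s][t] - m * A[0][t] for t in range(4)]
    b[s] = b[s] - m * b[0]
# A and b now hold A^1 and b^1 above.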

The trick is that, since $A^1_{22} \neq 0$, we can take the same route to eliminate $x_2$, obtaining the system $S_2 = (A^2, b^2)$ in which

\[ A^2 = \barray{ 2 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & 2 & 4 \\ } \text{ and } b^2 = \barray{ 1 \\ 0 \\ -1 \\ 1 \\ }. \]

Likewise for $x_3$, we obtain $S_3 = (A^3, b^3)$ in which

\[ A^3 = \barray{ 2 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & 0 & 2 \\ } \text{ and } b^3 = \barray{ 1 \\ 0 \\ -1 \\ 2 \\ }. \]
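Since the three steps repeat one pattern, a short loop reproduces the whole chain $S_1, S_2, S_3$. Here is a sketch, assuming every pivot $A_{ii}$ it meets is nonzero:

def forward_eliminate(A, b):
    # Apply the row reduction for i = 0, 1, ..., zeroing out each
    # column below its pivot. Requires nonzero pivots A[i][i].
    for i in range(len(A) - 1):
        for s in range(i + 1, len(A)):
            alpha = A[s][i] / A[i][i]
            A[s] = [A[s][t] - alpha * A[i][t] for t in range(len(A[s]))]
            b[s] = b[s] - alpha * b[i]
    return A, b

A = [[2.0, 1.0, 1.0, 0.0],
     [4.0, 3.0, 3.0, 1.0],
     [8.0, 7.0, 9.0, 5.0],
     [6.0, 7.0, 9.0, 8.0]]
b = [1.0, 2.0, 3.0, 4.0]
forward_eliminate(A, b)  # A becomes A^3; b becomes [1.0, 0.0, -1.0, 2.0]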

Here, as in the two-variable case, the key insight is that all these systems have the same solution set, and the last one, $(A^3, b^3)$, is easy to solve. We solve it by back substitution. First, since $2x_4 = 2$, we find $x_4 = 1$. Second, since $2x_3 + 2x_4 = -1$, we find $x_3 = -3/2$. Similarly we find $x_2 = 1/2$ and $x_1 = 1$.
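Back substitution also translates directly into code. Here is a minimal sketch for an upper-triangular system with nonzero diagonal (names ours):

def back_substitute(U, c):
    # Solve U x = c from the last equation upward.
    n = len(c)
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        tail = sum(U[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (c[i] - tail) / U[i][i]
    return x

U = [[2.0, 1.0, 1.0, 0.0],
     [0.0, 1.0, 1.0, 1.0],
     [0.0, 0.0, 2.0, 2.0],
     [0.0, 0.0, 0.0, 2.0]]
c = [1.0, 0.0, -1.0, 2.0]
print(back_substitute(U, c))  # [1.0, 0.5, -1.5, 1.0]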

Definition

Let $S = (A, b)$ be a linear system with $A \in \R ^{m \times n}$ and $b \in \R ^{m}$, and suppose $A_{ii} \neq 0$. The lower row reduction of $S$ for index $i$ (or the $i$-row reduction) is the linear system $(\tilde{A}, \tilde{b})$ in which

\[ \tilde{A}_{st} = \begin{cases} A_{st} - (A_{si}/A_{ii})A_{it} & \text{if } i < s \leq m, \\ A_{st} & \text{otherwise,} \end{cases} \quad \text{and} \quad \tilde{b}_{s} = \begin{cases} b_{s} - (A_{si}/A_{ii})b_{i} & \text{if } i < s \leq m, \\ b_{s} & \text{otherwise.} \end{cases} \]

When these reductions can be carried out in succession for $i = 1, 2, \ldots$ (each pivot encountered being nonzero), we say that the system $(A, b)$ is ordinarily reducible.

Let $a^k, \tilde{a}^k \in \R ^{n}$ denote the $k$th rows of $A$ and $\tilde{A}$, respectively. Then $\tilde{a}^k = a^k - \alpha _k a^i$ where $\alpha _k = A_{ki}/A_{ii}$ for $k > i$, and $\tilde{a}^k = a^k$ for $k \leq i$. In other words, each row $k$ of $\tilde{A}$ below row $i$ is obtained by subtracting a multiple of the $i$th row of $A$ from row $k$ of $A$. We are “reducing” the rows of $A$.
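Read this way, the definition translates into a few lines of code. Here is a minimal Python sketch (0-indexed, with names of our choosing); the elimination loop in the example above is exactly this reduction applied for successive indices.

def lower_row_reduction(A, b, i):
    # The i-row reduction: for each row s below row i, subtract
    # alpha = A[s][i] / A[i][i] times row i from row s, in both
    # A and b. Requires A[i][i] != 0; rows 0..i are unchanged.
    for s in range(i + 1, len(A)):
        alpha = A[s][i] / A[i][i]
        A[s] = [A[s][t] - alpha * A[i][t] for t in range(len(A[s]))]
        b[s] = b[s] - alpha * b[i]
    return A, b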

Let $(A \in \R ^{m \times n}, b \in \R ^{m})$ be a linear system which row reduces to $(C, d)$. Then $x \in \R ^{n}$ is a solution of $(A, b)$ if and only if it is a solution of $(C, d)$.1

Example

Consider again the four-variable system $S = (A, b)$ above. First we reduce by subtracting twice row 1 from row 2, four times row 1 from row 3, and three times row 1 from row 4.

\[ S_1 = \parens{\barray{ 2 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 3 & 5 & 5 \\ 0 & 4 & 6 & 8 \\ }, \barray{ 1 \\ 0 \\ -1 \\ 1 \\ }}. \]

We then subtract three times row 2 from row 3 and four times row 2 from row 4 to obtain

\[ S_2 = \parens{\barray{ 2 & 1 & 1 & 0 \\ 0 & 1 & 1 & 1 \\ 0 & 0 & 2 & 2 \\ 0 & 0 & 2 & 4 \\ }, \barray{ 1 \\ 0 \\ -1 \\ 1 \\ }}. \]

Finally, we subtract row 3 from row 4 (the multiplier is $2/2 = 1$) to obtain $S_3$, which we can write as

\[ \begin{aligned} 2x_1 + x_2 + x_3 &= 1,& \\ x_2 + x_3 + x_4 &= 0,& \\ 2x_3 + 2x_4 &= -1,& \text{ and } \\ 2x_4 &= 2.& \end{aligned} \]

We can now back-substitute to find $x_4 = 1$, $x_3 = -3/2$, $x_2 = 1/2$, and $x_1 = 1$. The proposition above says that this is the only solution of $S$ as well.
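As a check on the proposition, the solution of $S_3$ should also solve the original system $S$. Here is a minimal sketch of that verification:

A = [[2.0, 1.0, 1.0, 0.0],
     [4.0, 3.0, 3.0, 1.0],
     [8.0, 7.0, 9.0, 5.0],
     [6.0, 7.0, 9.0, 8.0]]
b = [1.0, 2.0, 3.0, 4.0]
x = [1.0, 0.5, -1.5, 1.0]
residuals = [sum(A[i][j] * x[j] for j in range(4)) - b[i] for i in range(4)]
print(residuals)  # [0.0, 0.0, 0.0, 0.0]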


  1. Future editions will include an account.