\(\DeclarePairedDelimiterX{\Set}[2]{\{}{\}}{#1 \nonscript\;\delimsize\vert\nonscript\; #2}\) \( \DeclarePairedDelimiter{\set}{\{}{\}}\) \( \DeclarePairedDelimiter{\parens}{\left(}{\right)}\) \(\DeclarePairedDelimiterX{\innerproduct}[1]{\langle}{\rangle}{#1}\) \(\newcommand{\ip}[1]{\innerproduct{#1}}\) \(\newcommand{\bmat}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\barray}[1]{\left[\hspace{2.0pt}\begin{matrix}#1\end{matrix}\hspace{2.0pt}\right]}\) \(\newcommand{\mat}[1]{\begin{matrix}#1\end{matrix}}\) \(\newcommand{\pmat}[1]{\begin{pmatrix}#1\end{pmatrix}}\) \(\newcommand{\mathword}[1]{\mathop{\textup{#1}}}\)
Needs:
Real Linear Combinations
Needed by:
Orthonormal Set of Real Vectors
Real Vector Bases

Independent Set of Real Vectors

Why

We want to capture the useful properties of the standard basis vectors.

Definition

A set of vectors $\set{v_1, \dots , v_k} \subset \R ^n$ is independent if

\[ \alpha _1v_1 + \alpha _2v_2 + \cdots + \alpha _kv_k = 0 \Rightarrow \alpha _1 = \alpha _2 = \cdots = \alpha _k = 0. \]

Notice that independence is a property of the set of vectors, not of any vector in particular. Another way of saying this is that no vector in the set can be represented as a linear combination of the others.
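Numerically, independence can be checked by stacking the vectors as the columns of a matrix and comparing its rank to the number of vectors. A minimal sketch using NumPy (the helper name `is_independent` is ours, not from the text):

```python
import numpy as np

def is_independent(vectors, tol=1e-10):
    """Return True if the given vectors in R^n are linearly independent.

    Stack the vectors as columns of an n-by-k matrix; the set is
    independent exactly when that matrix has full column rank k.
    """
    A = np.column_stack(vectors)
    return bool(np.linalg.matrix_rank(A, tol=tol) == A.shape[1])

# The standard basis of R^3 is independent.
e1, e2, e3 = np.eye(3)
print(is_independent([e1, e2, e3]))        # True

# Appending e1 + e2 makes the set dependent.
print(is_independent([e1, e2, e1 + e2]))   # False
```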

Unique representation

Suppose $v_1, \dots , v_k$ are independent and we have

\[ x = \sum_{i = 1}^{k}\alpha _i v_i \quad \text{ and } \quad x = \sum_{i = 1}^{k} \beta _i v_i. \]

Then

\[ 0 = x - x = \sum_{i = 1}^{k} (\alpha _i - \beta _i)v_i. \]

Using the definition of independence, we conclude $\alpha _i - \beta _i = 0$ for $i = 1, \dots , k$. Consequently, $\alpha _i = \beta _i$. In other words, if $x$ can be represented as a linear combination of the vectors $v_1, \dots , v_k$, that representation is unique. We have shown that independence implies uniqueness. What of the converse?
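When $x$ lies in the span of independent vectors, the coefficients can be recovered by solving a linear system, and the argument above guarantees the answer does not depend on how we solve it. A small sketch, assuming NumPy; the vectors here are chosen purely for illustration:

```python
import numpy as np

# Two independent vectors in R^3.
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 2.0, 1.0])
A = np.column_stack([v1, v2])

# Build x as a known linear combination.
alpha = np.array([3.0, -1.0])
x = A @ alpha

# Recover the coefficients by least squares; since the columns are
# independent, the solution is the unique representation of x.
beta, *_ = np.linalg.lstsq(A, x, rcond=None)
print(np.allclose(alpha, beta))  # True
```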

We show that lack of independence gives a lack of uniqueness. Suppose there exist $\alpha _1, \dots , \alpha _k$, not all zero, so that

\[ \alpha _1v_1 + \alpha _2 v_2 + \cdots + \alpha _kv_k = 0. \]

In particular, suppose $\alpha _i \neq 0$. Then we have

\[ v_i = -(1/\alpha _i) \sum_{j \neq i}\alpha _j v_j. \]

Suppose $x$ can be written as a linear combination of $v_1, \dots , v_k$. In other words, there are $\beta _1, \dots , \beta _k$ so that

\[ x = \beta _1 v_1 + \beta _2 v_2 + \cdots + \beta _k v_k. \]

Since $\alpha _1v_1 + \alpha _2v_2 + \cdots + \alpha _kv_k = 0$, we also have

\[ x = (\beta _1 + \alpha _1)v_1 + (\beta _2 + \alpha _2)v_2 + \cdots + (\beta _k + \alpha _k)v_k, \]

and because the $\alpha _i$ are not all zero, these two representations of $x$ differ. So a dependent set never gives unique representations; uniqueness implies independence.
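Concretely, for a dependent set we can add any nonzero combination that sums to zero to a given set of coefficients and obtain a second representation of the same $x$. A sketch under the same NumPy assumption:

```python
import numpy as np

# A dependent set in R^2: v3 = v1 + v2, so alpha = (1, 1, -1)
# combines the set to zero.
v1 = np.array([1.0, 0.0])
v2 = np.array([0.0, 1.0])
v3 = v1 + v2
A = np.column_stack([v1, v2, v3])
alpha = np.array([1.0, 1.0, -1.0])
assert np.allclose(A @ alpha, 0)

# Two different coefficient vectors representing the same x.
beta = np.array([2.0, 3.0, 0.0])
x = A @ beta
beta2 = beta + alpha
print(np.allclose(A @ beta2, x))  # True: same x, different coefficients
```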

Copyright © 2023 The Bourbaki Authors — All rights reserved — Version 13a6779cc