# The Gram-Schmidt Process

Recall from the Orthonormal Bases of Vector Spaces page that orthonormal sets of vectors, and more specifically orthonormal bases of finite-dimensional inner product spaces, have some very nice properties. For that reason, we often want to be able to take a linearly independent list of vectors and convert it into an orthonormal list of vectors.

We will now look at a well-known method for doing just that, called the **Gram-Schmidt Process** (or Gram-Schmidt Procedure), which we outline in the following theorem.

**Theorem 1 (The Gram-Schmidt Process):** Let $V$ be an inner product space and let $\{ v_1, v_2, …, v_n \}$ be a set of linearly independent vectors in $V$. Then there exists an orthonormal set of vectors $\{ e_1, e_2, …, e_n \}$ in $V$ such that $\mathrm{span} (v_1, v_2, …, v_j) = \mathrm{span} (e_1, e_2, …, e_j)$ for each $j = 1, 2, …, n$.

**Proof:** Let $\{v_1, v_2, …, v_n \}$ be a set of linearly independent vectors in $V$. We need to construct a set of vectors $\{ e_1, e_2, …, e_n \}$ that is orthonormal. We will carry this proof out by induction.

- Consider the case when $j = 1$. Let $e_1 = \frac{v_1}{\| v_1 \|}$. (Note that $\| v_1 \| \neq 0$ because $v_1 \neq 0$ since a set of linearly independent vectors does NOT contain the zero vector). Now clearly $\mathrm{span} (v_1) = \mathrm{span} (e_1)$ since $v_1$ and $e_1$ differ only by $\| v_1 \|$ and are hence scalar multiples of each other. Furthermore, $\| e_1 \| = \frac{\| v_1 \|}{\| v_1 \|} = 1$.

- Now consider the case when $j > 1$, and suppose that we have constructed an orthonormal set of $j – 1$ vectors, $\{ e_1, e_2, …, e_{j-1} \}$ such that $\mathrm{span} (v_1, v_2, …, v_{j-1}) = \mathrm{span} (e_1, e_2, …, e_{j-1})$. Now since $\{ v_1, v_2, …, v_n \}$ is a linearly independent set of vectors, we have that $v_j \not \in \mathrm{span} (v_1, v_2, …, v_{j-1})$. Thus we define the vector $e_j$ as:

(1)

\begin{align} \quad e_j = \frac{v_j - \langle v_j, e_1 \rangle e_1 - \langle v_j, e_2 \rangle e_2 - \cdots - \langle v_j, e_{j-1} \rangle e_{j-1}}{\| v_j - \langle v_j, e_1 \rangle e_1 - \langle v_j, e_2 \rangle e_2 - \cdots - \langle v_j, e_{j-1} \rangle e_{j-1} \|} \end{align}

- Clearly we have that $\| e_j \| = 1$. We now need to show that $e_j$ is orthogonal to $e_1, e_2, …, e_{j-1}$. For any integer $k$ such that $1 ≤ k ≤ j - 1$, consider the inner product of $e_j$ with $e_k$. Let $N = \| v_j - \langle v_j, e_1 \rangle e_1 - \langle v_j, e_2 \rangle e_2 - \cdots - \langle v_j, e_{j-1} \rangle e_{j-1} \|$. Then we have that:

(2)

\begin{align} \quad \langle e_j, e_k \rangle &= \left \langle \frac{v_j - \langle v_j, e_1 \rangle e_1 - \cdots - \langle v_j, e_{j-1} \rangle e_{j-1}}{N}, e_k \right \rangle \\ &= \frac{1}{N} \left \langle v_j - \langle v_j, e_1 \rangle e_1 - \cdots - \langle v_j, e_{j-1} \rangle e_{j-1}, e_k \right \rangle \\ &= \frac{1}{N} \left ( \langle v_j, e_k \rangle - \langle v_j, e_k \rangle \right ) = 0 \end{align}

- Therefore the set of vectors $\{ e_1, e_2, …, e_j \}$ is orthonormal (in the last step, only the $k$-th term of the sum survives, since $\langle e_i, e_k \rangle = 0$ for $i \neq k$ and $\langle e_k, e_k \rangle = 1$). Now, rearranging the definition of $e_j$, we see that $v_j \in \mathrm{span} (e_1, e_2, …, e_j)$. Furthermore, by our induction hypothesis, we have that $v_1, v_2, …, v_{j-1} \in \mathrm{span} (e_1, e_2, …, e_{j-1}) \subseteq \mathrm{span} (e_1, e_2, …, e_j)$. Therefore $\mathrm{span} (v_1, v_2, …, v_j) \subseteq \mathrm{span} (e_1, e_2, …, e_j)$. Both sets of vectors $\{ v_1, v_2, …, v_j \}$ and $\{ e_1, e_2, …, e_j \}$ are linearly independent, so both spans have the same dimension $j$, and hence these spaces must be equal, that is:

(3)

\begin{align} \quad \mathrm{span} (v_1, v_2, …, v_j) = \mathrm{span} (e_1, e_2, …, e_j) \quad \blacksquare \end{align}
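The inductive construction in the proof translates directly into code. The following is a minimal sketch in Python with NumPy; the function name `gram_schmidt` and the example vectors are illustrative choices, not from the source.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalize a linearly independent list of vectors.

    Mirrors the construction in Theorem 1: from each v_j, subtract its
    projections <v_j, e_k> e_k onto the previously constructed
    e_1, ..., e_{j-1}, then normalize the remainder.
    """
    basis = []
    for v in vectors:
        v = np.array(v, dtype=float)
        w = v.copy()
        for e in basis:
            w = w - np.dot(v, e) * e  # remove the component along e_k
        basis.append(w / np.linalg.norm(w))  # remainder is nonzero by independence
    return basis

# Two linearly independent vectors in R^3
e1, e2 = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
```

By construction, $\mathrm{span} (v_1, …, v_j) = \mathrm{span} (e_1, …, e_j)$ holds after every step of the loop.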

**Corollary 1:** Let $V$ be a finite-dimensional inner product space. Then $V$ has an orthonormal basis.

**Proof:** If $V$ is finite-dimensional, then $V$ has a basis, say $\{ v_1, v_2, …, v_n \}$, of $\mathrm{dim}(V) = n$ vectors in $V$. This set of vectors is linearly independent since it is a basis, and thus, by applying the Gram-Schmidt process, we can obtain an orthonormal set of $n$ vectors $\{ e_1, e_2, …, e_n \}$. But this set of vectors is linearly independent and has length $n = \mathrm{dim}(V)$, so $\{ e_1, e_2, …, e_n \}$ is an orthonormal basis of $V$. $\blacksquare$

**Corollary 2:** Let $V$ be a finite-dimensional inner product space. Then any orthonormal set of vectors $\{ e_1, e_2, …, e_n \}$ in $V$ can be extended to an orthonormal basis of $V$.

**Proof:** Let $\{ e_1, e_2, …, e_n \}$ be an orthonormal set of vectors in $V$. We already know that any orthonormal set of vectors is linearly independent, and thus we can extend this set of linearly independent vectors to a basis $\{ e_1, e_2, …, e_n, f_1, f_2, …, f_m \}$ of $V$.

- We can then apply the Gram-Schmidt process to this set of vectors to obtain an orthonormal set $\{ e_1, e_2, …, e_n, e_{n+1}, …, e_{n+m} \}$; note that the process leaves the first $n$ vectors unchanged, since they are already orthonormal. This set of vectors is linearly independent and contains $\mathrm{dim} (V) = n + m$ vectors, so it is an orthonormal basis for $V$. $\blacksquare$
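The extension argument of Corollary 2 can also be sketched numerically. In this hedged sketch (Python with NumPy), the function name, the tolerance, and the choice of standard basis vectors as the adjoined $f_i$ are all illustrative assumptions; candidates that are linearly dependent on the current set leave a (numerically) zero remainder and are skipped.

```python
import numpy as np

def extend_to_orthonormal_basis(orthonormal, dim, tol=1e-10):
    """Extend an orthonormal set in R^dim to an orthonormal basis.

    Adjoins candidate vectors (here the standard basis of R^dim) and
    applies Gram-Schmidt; the already-orthonormal vectors are left
    unchanged, and dependent candidates are discarded.
    """
    basis = [np.array(e, dtype=float) for e in orthonormal]
    for f in np.eye(dim):
        w = f.copy()
        for e in basis:
            w = w - np.dot(f, e) * e
        norm = np.linalg.norm(w)
        if norm > tol:  # f was independent of the current set
            basis.append(w / norm)
        if len(basis) == dim:
            break
    return basis

# Extend the single orthonormal vector (1, 1, 0)/sqrt(2) to a basis of R^3
basis = extend_to_orthonormal_basis([np.array([1.0, 1.0, 0.0]) / np.sqrt(2)], 3)
```

Stacking the returned vectors as the rows of a matrix $M$ gives $M M^T = I$, which is exactly the orthonormal-basis condition.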