Gram-Schmidt Orthonormalization: the Idea

The Gram-Schmidt orthonormalization procedure might look a bit intimidating at first. However, there is just one simple, beautiful idea underlying it: the orthogonal vector decomposition, \(\Vect{x} = \Vect{x}_V + \Vect{x}_{\bot}\), of a vector \(\Vect{x}\) into the sum of a vector \(\Vect{x}_V\) in a subspace \(V\) and a vector \(\Vect{x}_{\bot}\) in the orthogonal complement of \(V\).
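
For instance, in \(\mathbb{R}^2\) (a toy illustration; these particular vectors are chosen here only for demonstration): take \(\Vect{x} = (3,4)\) and \(V = \span\Set{(1,0)}\). Then

\[\Vect{x}_V = (3,0), \qquad \Vect{x}_{\bot} = (0,4), \qquad \Vect{x} = \Vect{x}_V + \Vect{x}_{\bot}, \qquad \DotPr{\Vect{x}_V}{\Vect{x}_{\bot}} = 0\]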

So, how does this orthogonal vector decomposition help in orthonormalizing an ordered set of vectors \((\Vect{a}_1,\dots ,\Vect{a}_r)\)? Throughout, we assume the vectors \(\Vect{a}_1,\dots ,\Vect{a}_r\) are linearly independent, so that every vector we normalize has nonzero norm.

  1. We begin by turning \(\Vect{a}_1\) into a vector of length \(1\) in the direction of \(\Vect{a}_1\). This is accomplished by setting

    \[\Vect{v}_1 \DefEq \frac{ \Vect{a}_1 }{ \Norm{ \Vect{a}_1 } }\]

    Then \(\span\Set{\Vect{a}_1} = \span\Set{\Vect{v}_1} =: V_1\).

  2. Next we split \(\Vect{a}_2\) as

    \[\Vect{a}_2 = (\Vect{a}_2)_{V_1} + (\Vect{a}_2)_{\bot} = \OrthoPrjctn{\VSpc{V_1}}{\Vect{a}_2} + \left( \Vect{a}_2 - \OrthoPrjctn{\VSpc{V_1}}{\Vect{a}_2}\right)\]

    Thus \((\Vect{a}_2)_{V_1}\) is in \(V_1\), and \((\Vect{a}_2)_{\bot}\) is in the orthogonal complement of \(V_1\). Since \(\Vect{v}_1\) is a unit vector spanning \(V_1\), the projection is \(\OrthoPrjctn{\VSpc{V_1}}{\Vect{a}_2} = (\DotPr{\Vect{a}_2}{\Vect{v}_1})\cdot \Vect{v}_1\), so

    \[(\Vect{a}_2)_{\bot} = \Vect{a}_2 - (\DotPr{\Vect{a}_2}{\Vect{v}_1})\cdot \Vect{v}_1\]

    is perpendicular to \(V_1\). This vector need not have length \(1\), so we normalize it:

    \[\Vect{v}_2 \DefEq \frac{ \Vect{a}_2 - (\DotPr{\Vect{a}_2}{\Vect{v}_1})\cdot \Vect{v}_1 }{ \Norm{ \Vect{a}_2 - (\DotPr{\Vect{a}_2}{\Vect{v}_1})\cdot \Vect{v}_1 } }\]

    We then check that \(\span\Set{ \Vect{a}_1,\Vect{a}_2 } = \span\Set{ \Vect{v}_1,\Vect{v}_2 } =: V_2\). (A worked numeric example of these first two steps follows the list.)

  3. Next we split \(\Vect{a}_3\) as

    \[\Vect{a}_3 = (\Vect{a}_3)_{V_2} + (\Vect{a}_3)_{\bot} = \OrthoPrjctn{\VSpc{V_2}}{\Vect{a}_3} + \left( \Vect{a}_3 - \OrthoPrjctn{\VSpc{V_2}}{\Vect{a}_3}\right)\]

    Thus \((\Vect{a}_3)_{V_2}\) is in \(V_2\), and \((\Vect{a}_3)_{\bot}\) is in the orthogonal complement of \(V_2\). As in the previous step,

    \[(\Vect{a}_3)_{\bot} = \Vect{a}_3 - (\DotPr{\Vect{a}_3}{\Vect{v}_1})\cdot \Vect{v}_1 - (\DotPr{\Vect{a}_3}{\Vect{v}_2})\cdot \Vect{v}_2\]

    is perpendicular to \(V_2\). This vector need not have length \(1\), so we normalize it:

    \[\Vect{v}_3 \DefEq \frac{ \Vect{a}_3 - (\DotPr{\Vect{a}_3}{\Vect{v}_1})\cdot \Vect{v}_1 - (\DotPr{\Vect{a}_3}{\Vect{v}_2})\cdot \Vect{v}_2 }{ \Norm{ \Vect{a}_3 - (\DotPr{\Vect{a}_3}{\Vect{v}_1})\cdot \Vect{v}_1 - (\DotPr{\Vect{a}_3}{\Vect{v}_2})\cdot \Vect{v}_2 } }\]

    We then check that \(\span\Set{ \Vect{a}_1,\Vect{a}_2,\Vect{a}_3 } = \span\Set{ \Vect{v}_1,\Vect{v}_2,\Vect{v}_3 } =: V_3\).

  4. In general, suppose \(2\leq k\leq r\), and suppose the vectors \(\Vect{a}_1, \dots ,\Vect{a}_{k-1}\) have already been orthonormalized into vectors \(\Vect{v}_1,\dots ,\Vect{v}_{k-1}\) with

    \[\span\Set{\Vect{a}_1,\dots ,\Vect{a}_{k-1}} = \span\Set{\Vect{v}_1,\dots ,\Vect{v}_{k-1}} =: V_{k-1}\]

    Then we orthonormalize \(\Vect{a}_k\) by splitting it as

    \[\Vect{a}_k = (\Vect{a}_k)_{V_{k-1}} + (\Vect{a}_k)_{\bot} = \OrthoPrjctn{\VSpc{V_{k-1}}}{\Vect{a}_k} + \left( \Vect{a}_k - \OrthoPrjctn{\VSpc{V_{k-1}}}{\Vect{a}_k}\right)\]

    Thus \((\Vect{a}_k)_{V_{k-1}}\) is in \(V_{k-1}\), and \((\Vect{a}_k)_{\bot}\) is in the orthogonal complement of \(V_{k-1}\). Since \(\Vect{v}_1,\dots ,\Vect{v}_{k-1}\) form an orthonormal basis of \(V_{k-1}\), the projection onto \(V_{k-1}\) is the sum of the projections onto each \(\Vect{v}_i\), so

    \[(\Vect{a}_k)_{\bot} = \Vect{a}_k - (\DotPr{\Vect{a}_k}{\Vect{v}_1}) \cdot \Vect{v}_1 - \cdots - (\DotPr{\Vect{a}_k}{\Vect{v}_{k-1}})\cdot \Vect{v}_{k-1}\]

    is perpendicular to \(V_{k-1}\). This vector need not have length \(1\), so we normalize it:

    \[\Vect{v}_k \DefEq \frac{ \Vect{a}_k - (\DotPr{\Vect{a}_k}{\Vect{v}_1}) \cdot \Vect{v}_1 - \cdots - (\DotPr{\Vect{a}_k}{\Vect{v}_{k-1}})\cdot \Vect{v}_{k-1} }{ \Norm{ \Vect{a}_k - (\DotPr{\Vect{a}_k}{\Vect{v}_1}) \cdot \Vect{v}_1 - \cdots - (\DotPr{\Vect{a}_k}{\Vect{v}_{k-1}})\cdot \Vect{v}_{k-1} } }\]

    We then check that \(\span\Set{ \Vect{a}_1,\dots ,\Vect{a}_k } = \span\Set{ \Vect{v}_1,\dots ,\Vect{v}_k } =: V_k\).
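
To see the procedure in action, here are steps 1 and 2 carried out for the concrete vectors \(\Vect{a}_1 = (3,4)\) and \(\Vect{a}_2 = (1,0)\) in \(\mathbb{R}^2\) (an illustrative example, not part of the general argument):

\[\Vect{v}_1 = \frac{(3,4)}{\Norm{(3,4)}} = \left( \tfrac{3}{5}, \tfrac{4}{5} \right), \qquad \DotPr{\Vect{a}_2}{\Vect{v}_1} = \tfrac{3}{5}\]

\[(\Vect{a}_2)_{\bot} = (1,0) - \tfrac{3}{5}\left( \tfrac{3}{5}, \tfrac{4}{5} \right) = \left( \tfrac{16}{25}, -\tfrac{12}{25} \right), \qquad \Vect{v}_2 = \frac{(\Vect{a}_2)_{\bot}}{\Norm{(\Vect{a}_2)_{\bot}}} = \left( \tfrac{4}{5}, -\tfrac{3}{5} \right)\]

One checks directly that \(\DotPr{\Vect{v}_1}{\Vect{v}_2} = 0\), that both vectors have length \(1\), and that \(\span\Set{\Vect{v}_1,\Vect{v}_2} = \span\Set{\Vect{a}_1,\Vect{a}_2} = \mathbb{R}^2\).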

In this way we inductively orthonormalize the entire set of vectors \(\Vect{a}_1,\dots ,\Vect{a}_r\).
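
For readers who want to experiment numerically, here is a minimal sketch of the procedure in Python (added for illustration; it assumes NumPy, linearly independent input rows, and the function name `gram_schmidt` is our own choice):

```python
import numpy as np

def gram_schmidt(a):
    """Orthonormalize the rows a_1, ..., a_r of `a`, assumed linearly
    independent, so that span{v_1, ..., v_k} = span{a_1, ..., a_k}
    for every k."""
    v = []
    for a_k in a:
        # Subtract the projection of a_k onto V_{k-1} = span{v_1, ..., v_{k-1}};
        # for k = 1 the sum is empty and the orthogonal part is a_1 itself.
        a_perp = a_k - sum(np.dot(a_k, v_i) * v_i for v_i in v)
        # Normalize; a_perp is nonzero because the a_i are independent.
        v.append(a_perp / np.linalg.norm(a_perp))
    return np.array(v)

# The two vectors from the worked example above:
print(gram_schmidt(np.array([[3.0, 4.0], [1.0, 0.0]])))
# [[ 0.6  0.8]
#  [ 0.8 -0.6]]
```

Applied to the vectors of the worked example, it returns the rows \((0.6, 0.8)\) and \((0.8, -0.6)\), that is, \(\Vect{v}_1\) and \(\Vect{v}_2\).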