Linear Independence

Introduction

If we use a subset \(S\) of \(\RNrSpc{n}\) to span a subspace \(V\) of \(\RNrSpc{n}\), we know that every \(\Vect{x}\) in \(V\) can be expressed in some way as a linear combination of vectors in \(S\). However, there might be many ways of doing so. With the concept of ‘linear independence’ we single out those sets \(S\) for which each \(\Vect{x}\) in \(\span(S)\) can be expressed in exactly one way as a linear combination of vectors in \(S\).

Definition: Linearly independent set

A set of vectors \(S\) in \(\RNrSpc{n}\) is called linearly independent if, for any choice of pairwise distinct vectors \(\Vect{a}_1,\dots , \Vect{a}_r\) from \(S\), the vector equation

\[t_1 \Vect{a}_1 + \cdots + t_r \Vect{a}_r = \Vect{0}\]

has exactly one solution, namely \(t_1=\cdots = t_r=0\). If \(S\) fails to be linearly independent, it is called linearly dependent.
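For example, the set \(\Set{ (1,0), (1,1) }\) in \(\RNrSpc{2}\) is linearly independent: the equation

\[t_1 (1,0) + t_2 (1,1) = \Vect{0}\]

amounts to \(t_1 + t_2 = 0\) and \(t_2 = 0\), which forces \(t_1 = t_2 = 0\). By contrast, the set \(\Set{ (1,0), (1,1), (2,1) }\) is linearly dependent, since \(1\cdot (1,0) + 1\cdot (1,1) - 1\cdot (2,1) = \Vect{0}\) is a nontrivial solution.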

Here are some basic facts about linear independence:

Lemma: Facts about linear independence

The following hold

  1. Every subset of a linearly independent set of vectors is also linearly independent.

  2. If \(S\) is a linearly independent set of vectors then, for any vector \(\Vect{b}\), \(S\cup \Set{ \Vect{b} }\) is linearly independent if and only if \(\Vect{b}\) is not in \(\span(S)\) (see the example after this list).

  3. If \(\Vect{b}_1,\dots ,\Vect{b}_n\) are linearly independent vectors in \(V\DefEq \span\Set{ \Vect{a}_1,\dots ,\Vect{a}_m }\), then \(n\leq m\).
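To illustrate the second fact, take \(S = \Set{ (1,0,0), (0,1,0) }\) in \(\RNrSpc{3}\). The vector \(\Vect{b} = (0,0,1)\) does not lie in \(\span(S)\), so \(S\cup \Set{ \Vect{b} }\) is linearly independent; the vector \(\Vect{b} = (1,1,0)\), on the other hand, lies in \(\span(S)\), so adjoining it produces a linearly dependent set.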

The following result confirms our reason for introducing the concept of linear independence: If a collection of vectors \(S\) is linearly independent, every vector in \(\span(S)\) can be expressed in exactly one way as a linear combination of vectors in \(S\).

Proposition: Linear independence / linear combination

Let \(S\) be a linearly independent subset of \(\RNrSpc{n}\). If \(\Vect{x} \neq \Vect{0}\) belongs to \(\span(S)\), there are unique vectors \(\Vect{a}_1,\dots ,\Vect{a}_m\) in \(S\) and unique nonzero numbers \(t_1,\dots ,t_m\) such that

\[\Vect{x} = t_1 \Vect{a}_1 + \cdots + t_m \Vect{a}_m\]
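When \(S\) is linearly dependent, this uniqueness fails. For instance, with \(S = \Set{ (1,0), (0,1), (1,1) }\) the vector \((1,1)\) admits the two representations

\[(1,1) = 1\cdot (1,1) = 1\cdot (1,0) + 1\cdot (0,1)\]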

Now, if we are given some set of vectors, how can we tell if the vectors are linearly independent? Here are two methods which are often easy to apply:

Proposition: Linear independence test by rank

For an \((m,n)\)-matrix \(\Mtrx{A}\) the following hold

  1. The column vectors of \(\Mtrx{A}\) are linearly independent if and only if \(\Mtrx{A}\) has rank \(n\).

  2. The row vectors of \(\Mtrx{A}\) are linearly independent if and only if \(\Mtrx{A}^T\) has rank \(m\).
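To apply the rank test to the columns, consider for example the \((3,2)\)-matrix

\[\Mtrx{A} = \begin{pmatrix} 1 & 2\\ 0 & 1\\ 1 & 3 \end{pmatrix}\]

Subtracting the first row from the third, and then the second row from the third, yields the echelon form

\[\begin{pmatrix} 1 & 2\\ 0 & 1\\ 0 & 0 \end{pmatrix}\]

so \(\Mtrx{A}\) has rank \(2 = n\), and the column vectors \((1,0,1)\) and \((2,1,3)\) are linearly independent.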

Proposition: Linear independence test using determinants

For an \((m,n)\)-matrix \(\Mtrx{A}\) the following hold

  1. If \(n\leq m\), the column vectors of \(\Mtrx{A}\) are linearly independent if and only if there are \(n\) rows of \(\Mtrx{A}\) which form an \((n,n)\)-matrix whose determinant is not \(0\).

  2. If \(m\leq n\), the row vectors of \(\Mtrx{A}\) are linearly independent if and only if there are \(m\) columns of \(\Mtrx{A}\) which form an \((m,m)\)-matrix whose determinant is not \(0\).
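Using the same \((3,2)\)-matrix \(\Mtrx{A}\) as above, the first two rows form the matrix

\[\begin{pmatrix} 1 & 2\\ 0 & 1 \end{pmatrix}\]

whose determinant is \(1 \neq 0\); hence the column vectors of \(\Mtrx{A}\) are linearly independent, in agreement with the rank test.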

As we have seen, the task of determining whether a given set of vectors \(S\) is linearly independent or not can be quite laborious. However, this task simplifies considerably if the vectors in \(S\) are pairwise perpendicular. This leads to the following concept:

Definition: Orthogonal / Orthonormal Vectors

A set \(S\) of nonzero vectors is called orthogonal if

\[\DotPr{ \Vect{x} }{ \Vect{y} } = 0\quad \text{for all}\quad \Vect{x},\Vect{y}\in S\ \text{with}\ \Vect{x}\neq \Vect{y}\]

The set \(S\) is called orthonormal if it is orthogonal and, in addition, \(\Norm{ \Vect{x} } = 1\), for each \(\Vect{x}\) in \(S\).
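For example, the vectors \((1,1,0)\) and \((1,-1,0)\) in \(\RNrSpc{3}\) are orthogonal, since \(\DotPr{ (1,1,0) }{ (1,-1,0) } = 1 - 1 + 0 = 0\), but they are not orthonormal, since each has norm \(\sqrt{2}\). Dividing each vector by its norm gives the orthonormal set \(\Set{ \tfrac{1}{\sqrt{2}}(1,1,0),\ \tfrac{1}{\sqrt{2}}(1,-1,0) }\).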

The nice thing here is that an orthogonal set of vectors is automatically linearly independent.

Proposition: Orthogonal set is linearly independent

An orthogonal set of vectors is linearly independent.
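The reason is quick to see: if \(\Vect{a}_1,\dots ,\Vect{a}_r\) are pairwise distinct vectors from an orthogonal set and

\[t_1 \Vect{a}_1 + \cdots + t_r \Vect{a}_r = \Vect{0}\]

then taking the dot product of both sides with \(\Vect{a}_i\) eliminates all terms except the \(i\)-th and yields \(t_i \DotPr{ \Vect{a}_i }{ \Vect{a}_i } = 0\). Since \(\Vect{a}_i \neq \Vect{0}\), we have \(\DotPr{ \Vect{a}_i }{ \Vect{a}_i } > 0\), and hence \(t_i = 0\) for every \(i\).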

We conclude this section with a result which tells us how to generate a new set of linearly independent vectors from a given one.

Proposition: Linear independence / linear combination

Let \(\Vect{a}_1,\dots ,\Vect{a}_p\) be linearly independent vectors in \(\RNrSpc{k}\), and let \(\Vect{x}_1=(x_{11},\dots ,x_{1p}),\ \dots ,\ \Vect{x}_n=(x_{n1},\dots ,x_{np})\) be linearly independent vectors in \(\RNrSpc{p}\). Then

\[\begin{aligned}
\Vect{y}_1 &\DefEq x_{11}\Vect{a}_1 + \cdots + x_{1p} \Vect{a}_p\\
&\;\;\vdots\\
\Vect{y}_n &\DefEq x_{n1}\Vect{a}_1 + \cdots + x_{np} \Vect{a}_p
\end{aligned}\]

are linearly independent vectors in \(\span\Set{ \Vect{a}_1,\dots ,\Vect{a}_p }\).
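For instance, take \(p = 2\) and linearly independent vectors \(\Vect{a}_1, \Vect{a}_2\) in \(\RNrSpc{k}\), together with the linearly independent vectors \(\Vect{x}_1 = (1,0)\) and \(\Vect{x}_2 = (1,1)\) in \(\RNrSpc{2}\). The proposition says that

\[\Vect{y}_1 \DefEq \Vect{a}_1 \quad\text{and}\quad \Vect{y}_2 \DefEq \Vect{a}_1 + \Vect{a}_2\]

are again linearly independent vectors in \(\span\Set{ \Vect{a}_1, \Vect{a}_2 }\).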
