Every Linear Transformation is a Matrix Transformation
Introduction
Abstract: We show that every linear transformation \(\RNrSpc{n}\to \RNrSpc{m}\) is a matrix transformation, and we discuss examples.
Outline: We just learned that every matrix of size \((m,n)\) determines a linear map \(\RNrSpc{n}\to \RNrSpc{m}\), and so matrices provide a convenient source of linear transformations ... in a form suitable for direct computation. On the other hand, we have examples of linear transformations, such as projections onto a hyperspace, reflections about a hyperspace, and shear transformations (see below), on which we have a good conceptual grip. So, the question presents itself: can such transformations be represented by a matrix?
The answer to this question is very pleasing: every linear transformation can be represented by a matrix, and there is a systematic method for finding it. For example, we found earlier that the orthogonal projection of \(\RNrSpc{3}\) onto any hyperspace is a linear transformation. Therefore any such projection is represented by a matrix of size \((3,3)\), and we give an example of how to compute it.
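As a concrete numerical sketch (the vector \(\Vect{a}\) below is an assumed illustrative choice, not taken from the text), we can build the projection matrix column by column from the images of the standard basis vectors, and check it against the closed form \(I - \Vect{a}\Vect{a}^{T}\):

```python
import numpy as np

# Assumed unit normal vector of the hyperspace (illustrative choice)
a = np.array([1.0, 2.0, 2.0])
a /= np.linalg.norm(a)

def proj(x):
    """Orthogonal projection of x onto the hyperspace perpendicular to a."""
    return x - np.dot(a, x) * a

# Columns of the representing matrix are proj(e1), proj(e2), proj(e3).
P = np.column_stack([proj(e) for e in np.eye(3)])

# Sanity checks: P agrees with I - a a^T, and projecting twice changes nothing.
assert np.allclose(P, np.eye(3) - np.outer(a, a))
assert np.allclose(P @ P, P)
```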
Similarly, we found earlier that the orthogonal reflection of \(\RNrSpc{3}\) about any hyperspace is a linear transformation. Therefore any such reflection is represented by a matrix of size \((3,3)\), and we give an example of how to compute it.
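A parallel sketch for the reflection (again with an assumed unit normal vector, chosen only for illustration); the representing matrix is checked against the closed form \(I - 2\,\Vect{a}\Vect{a}^{T}\):

```python
import numpy as np

# Assumed unit normal vector of the hyperspace (illustrative choice)
a = np.array([0.0, 3.0, 4.0])
a /= np.linalg.norm(a)

def refl(x):
    """Orthogonal reflection of x about the hyperspace perpendicular to a."""
    return x - 2 * np.dot(a, x) * a

# Columns of the representing matrix are refl(e1), refl(e2), refl(e3).
R = np.column_stack([refl(e) for e in np.eye(3)])

assert np.allclose(R, np.eye(3) - 2 * np.outer(a, a))
assert np.allclose(R @ R, np.eye(3))  # reflecting twice is the identity
```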
As another example, we have general shear transformations: instead of
shearing \(\RNrSpc{2}\) parallel to the \(x\)-axis, we place ourselves in \(\RNrSpc{n}\), \(n\geq 2\), and
shear parallel to the hyperspace
which is perpendicular to some unit vector \(\Vect{a}\). We find that
such a transformation is linear. So, it is
represented by a unique matrix.
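For the shear, the same column-by-column recipe applies; since \(\FnctnOf{S}{\Vect{x}} = \Vect{x} + (\DotPr{ \Vect{a} }{ \Vect{x} })\cdot \Vect{s}\), the representing matrix is \(I + \Vect{s}\,\Vect{a}^{T}\). A sketch with assumed vectors \(\Vect{a}\) and \(\Vect{s}\) (illustrative choices, with \(\Vect{s}\bot\Vect{a}\)):

```python
import numpy as np

# Assumed unit normal a and shear vector s with s perpendicular to a
a = np.array([0.0, 0.0, 1.0])
s = np.array([2.0, 0.0, 0.0])
assert np.isclose(np.dot(a, s), 0.0)

def shear(x):
    """Shear of x parallel to the hyperspace perpendicular to a."""
    return x + np.dot(a, x) * s

# Representing matrix, built from the images of the standard basis vectors.
S = np.column_stack([shear(e) for e in np.eye(3)])

assert np.allclose(S, np.eye(3) + np.outer(s, a))
# Vectors in the hyperspace perpendicular to a are fixed by the shear.
assert np.allclose(S @ np.array([5.0, -7.0, 0.0]), [5.0, -7.0, 0.0])
```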
Theorem: Given Linear Map, Find Matrix
Given an arbitrary linear transformation \(L\from \RNrSpc{n}\to \RNrSpc{m}\), form the matrix
\[
\Mtrx{A}\ \DefEq \
\left[\begin{array}{cccc}
\uparrow & \uparrow & \cdots & \uparrow \\
L(\StdBssVec{1}) & L(\StdBssVec{2}) & \cdots & L(\StdBssVec{n}) \\
\downarrow & \downarrow & \cdots & \downarrow
\end{array}\right]
\]
Then \(L(\Vect{x}) = \Mtrx{A}\Vect{x}\), for all \(\Vect{x}\) in \(\RNrSpc{n}\). Moreover \(\Mtrx{A}\), so defined, is the only matrix with this property.
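The theorem can be checked numerically: given a linear map known only as a formula (the map below is an assumed example, not from the text), stacking the images \(L(\StdBssVec{j})\) as columns yields a matrix that reproduces the map on arbitrary inputs.

```python
import numpy as np

def L(x):
    """An assumed linear map R^2 -> R^3, given only as a formula (illustrative)."""
    return np.array([x[0] + 2 * x[1], 3 * x[0] - x[1], x[0]])

# Build A column by column from the images of the standard basis vectors.
A = np.column_stack([L(e) for e in np.eye(2)])

# A reproduces L on arbitrary inputs, as the theorem asserts.
rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(2)
    assert np.allclose(A @ x, L(x))
```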
In the context of the theorem above, we say that the matrix \(\Mtrx{A}\) represents \(L\).
Definition: Shear Transformation
The shear transformation of \(\RNrSpc{n}\) parallel to the hyperspace perpendicular to a nonzero vector \(\Vect{a}\), and with shear vector \(\Vect{s}\bot \Vect{a}\), is given by
\[\FnctnDAAT{S}{\RNrSpc{n}}{\RNrSpc{n}},\quad \FnctnOf{S}{\Vect{x}} = \Vect{x} + (\DotPr{ \Vect{a} }{ \Vect{x} })\cdot \Vect{s}\]
Earlier we considered shearing of \(\RNrSpc{2}\) parallel to the \(x\)-axis. In that situation the transformation was described by a matrix from the start, so we knew that it was linear. In general, a shear transformation parallel to a hyperspace is given by a formula, so we have no advance knowledge that it is linear; i.e., we need to check that it respects vector addition and scalar multiplication.
Proposition: Shear Transformation Parallel to a Hyperspace Is Linear
The shear transformation of \(\RNrSpc{n}\) parallel to the hyperspace perpendicular to a nonzero vector \(\Vect{a}\), with shear vector \(\Vect{s}\bot \Vect{a}\),
\[\FnctnDAAT{S}{\RNrSpc{n}}{\RNrSpc{n}},\quad \FnctnOf{S}{\Vect{x}} = \Vect{x} + (\DotPr{ \Vect{a} }{ \Vect{x} })\cdot \Vect{s},\]
is a linear transformation.
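Additivity can be verified directly from the formula, using the distributivity of the dot product over vector addition (homogeneity is checked the same way, using \(\DotPr{ \Vect{a} }{ c\Vect{x} } = c\,(\DotPr{ \Vect{a} }{ \Vect{x} })\)):
\[
\FnctnOf{S}{\Vect{x}+\Vect{y}}
= (\Vect{x}+\Vect{y}) + (\DotPr{ \Vect{a} }{ \Vect{x}+\Vect{y} })\cdot \Vect{s}
= \bigl(\Vect{x} + (\DotPr{ \Vect{a} }{ \Vect{x} })\cdot \Vect{s}\bigr) + \bigl(\Vect{y} + (\DotPr{ \Vect{a} }{ \Vect{y} })\cdot \Vect{s}\bigr)
= \FnctnOf{S}{\Vect{x}} + \FnctnOf{S}{\Vect{y}}
\]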