Video Transcript
Hey, everyone! I’ve got another quick footnote for
you between chapters today. When I’ve talked about linear
transformations so far, I’ve only really talked about transformations from 2D vectors
to other 2D vectors, represented with two-by-two matrices, or from 3D vectors to
other 3D vectors, represented with three-by-three matrices. But several commenters have asked
about nonsquare matrices. So I thought I’d take a moment to
just show what those mean geometrically. By now in the series, you actually
have most of the background you need to start pondering a question like this on your
own. But I’ll start talking through it,
just to give a little mental momentum.
It’s perfectly reasonable to talk
about transformations between dimensions, such as one that takes 2D vectors to 3D
vectors. Again, what makes one of these
linear is that grid lines remain parallel and evenly spaced and that the origin maps
to the origin. What I have pictured here is the
input space on the left, which is just 2D space, and the output of the
transformation shown on the right. The reason I’m not showing the
inputs move over to the outputs, like I usually do, is not just animation
laziness. It’s worth emphasizing that 2D
vector inputs are very different animals from these 3D vector outputs, living in a
completely separate unconnected space.
Encoding one of these
transformations with a matrix is really just the same thing as what we’ve done
before. You look at where each basis vector
lands and write the coordinates of the landing spots as the columns of a matrix. For example, what you’re looking at
here is an output of a transformation that takes 𝑖-hat to the coordinates two,
negative one, negative two and 𝑗-hat to the coordinates zero, one, one. Notice, this means the matrix
encoding our transformation has three rows and two columns, which, to use standard
terminology, makes it a three-by-two matrix.
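As a quick sketch of this in code (using NumPy; the 2D input vector here is my own choice for illustration, not from the video):

```python
import numpy as np

# Columns are the landing spots of the basis vectors:
# i-hat lands on (2, -1, -2), j-hat lands on (0, 1, 1).
A = np.array([[ 2, 0],
              [-1, 1],
              [-2, 1]])   # a three-by-two matrix

# Applying it to a 2D vector produces a 3D vector:
# a linear combination of the two columns.
v = np.array([1, 1])      # an arbitrary 2D input
print(A @ v)              # the 3D output [ 2  0 -1]
```

Notice the shapes: a (3, 2) matrix times a length-2 vector gives a length-3 vector, matching the geometric picture of 2D inputs landing in 3D space.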
In the language of last video, the
column space of this matrix, the place where all the vectors land, is a 2D plane
slicing through the origin of 3D space. But the matrix is still full rank,
since the number of dimensions in this column space is the same as the number of
dimensions of the input space. So, if you see a three-by-two
matrix out in the wild, you can know that it has the geometric interpretation of
mapping two dimensions to three dimensions, since the two columns indicate that the
input space has two basis vectors and the three rows indicate that the landing spots
for each of those basis vectors is described with three separate coordinates.
Likewise, if you see a two-by-three
matrix with two rows and three columns, what do you think that means? Well, the three columns indicate
that you’re starting in a space that has three basis vectors, so we’re starting in
three dimensions. And the two rows indicate that the
landing spot for each of those three basis vectors is described with only two
coordinates, so they must be landing in two dimensions. So it’s a transformation from 3D
space onto the 2D plane; a transformation that should feel very uncomfortable if you
imagine going through it. You could also have a
transformation from two dimensions to one dimension. One-dimensional space is really
just the number line, so a transformation like this takes in 2D vectors and spits out
numbers.
Thinking about grid lines remaining
parallel and evenly spaced is a little bit messy due to all of the squishification
happening here. So in this case, the visual
understanding for what linearity means is that if you have a line of evenly spaced
dots, they would remain evenly spaced once mapped onto the number line. One of these transformations is
encoded with a one-by-two matrix, each of whose two columns has just a single
entry. The two columns represent where the
basis vectors land. And each one of those columns
requires just one number, the number that that basis vector landed on.
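A minimal sketch of this kind of transformation (the entries of the 1×2 matrix here are my own choice for illustration):

```python
import numpy as np

# A 1x2 matrix encodes a transformation from 2D vectors to numbers.
# Its two columns say where each basis vector lands on the number line.
M = np.array([[1, 2]])

# Feed in a line of evenly spaced 2D points...
inputs = [np.array([t, t]) for t in range(4)]
outputs = [(M @ v)[0] for v in inputs]

# ...and the outputs stay evenly spaced on the number line.
print(outputs)  # [0, 3, 6, 9]
```

Each output is just 1·x + 2·y, which is why this type of transformation ends up so closely tied to the dot product.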
This is actually a surprisingly
meaningful type of transformation with close ties to the dot product, and I’ll be
talking about that next video. Until then, I encourage you to play
around with this idea on your own, contemplating the meanings of things like matrix
multiplication and linear systems of equations in the context of transformations
between different dimensions. Have fun!
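As one concrete starting point for that pondering, here’s a small sketch of composing transformations between dimensions (both matrices are hypothetical choices, just to show how the shapes line up):

```python
import numpy as np

A = np.array([[ 2, 0],
              [-1, 1],
              [-2, 1]])    # 3x2: takes 2D vectors to 3D

B = np.array([[1, 0, 2],
              [0, 1, 1]])  # 2x3: takes 3D vectors to 2D

# Composing: first apply A (2D -> 3D), then B (3D -> 2D).
# The product is a 2x2 matrix, a transformation from 2D back to 2D.
C = B @ A
print(C.shape)  # (2, 2)
```

The inner dimensions (2×**3** times **3**×2) have to match for the composition to make sense, which mirrors the geometric requirement that the output space of the first transformation be the input space of the second.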