In this explainer, we will learn how to identify the properties of matrix multiplication, including the transpose of the product of two matrices, and how they compare with the properties of number multiplication.
To begin the discussion about the properties of matrix multiplication, let us start by recalling the definition of the product of two general matrices.
Definition: Matrix Multiplication
Suppose that A is a matrix with order m × n and that B is a matrix with order n × p, such that

A = \begin{pmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{pmatrix}, \quad B = \begin{pmatrix} b_{11} & \cdots & b_{1p} \\ \vdots & \ddots & \vdots \\ b_{n1} & \cdots & b_{np} \end{pmatrix}.

Then, the matrix product AB is a matrix with order m × p, with the form AB = (c_{ij}), where each entry c_{ij} is the pairwise summation of entries from A and B given by

c_{ij} = a_{i1}b_{1j} + a_{i2}b_{2j} + \cdots + a_{in}b_{nj}.
It should already be apparent that matrix multiplication is an operation that is much more restrictive than its real number counterpart. For one, we know that the matrix product AB can only exist if A has order m × n and B has order n × p, meaning that the number of columns in A must be the same as the number of rows in B.
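The definition above translates directly into code. The following is a minimal sketch in pure Python, representing a matrix as a list of rows; the function name `matmul` and the sample matrices are illustrative choices, not part of the explainer:

```python
def matmul(A, B):
    """Product of an m x n matrix A and an n x p matrix B (lists of rows).

    Entry (i, j) of the result is the pairwise summation
    a_i1*b_1j + a_i2*b_2j + ... + a_in*b_nj from the definition above.
    """
    m, n, p = len(A), len(A[0]), len(B[0])
    if len(B) != n:
        raise ValueError("number of columns of A must equal number of rows of B")
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]

# A 2x3 matrix times a 3x2 matrix yields a 2x2 matrix.
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[7, 8],
     [9, 10],
     [11, 12]]
print(matmul(A, B))  # [[58, 64], [139, 154]]
```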
Another thing to consider is that many of the properties that apply to the multiplication of real numbers do not apply to matrices. For instance, for any two real numbers a and b, we have

ab = ba.

This is known as the commutative property.
If matrix multiplication were also commutative, it would mean that AB = BA for any two matrices A and B. Assuming that A has order m × n and B has order n × p, then calculating BA would mean attempting to combine a matrix with order n × p and a matrix with order m × n. This means that BA is only well defined if p = m. Immediately, this shows us that matrix multiplication cannot always be commutative for the simple reason that reversing the order may not always be possible.
Let us suppose that we did have a situation where BA is well defined; that is, A has order m × n and B has order n × m.

Since A is an m × n matrix and B is an n × m matrix, the product AB exists and is an m × m matrix: each of its entries is found by multiplying the entries of a row of A by the corresponding entries of a column of B and summing the results.

However, let us now consider the multiplication in the reversed direction (i.e., BA). Since B is an n × m matrix and A is an m × n matrix, the result will be an n × n matrix.

So, even though both AB and BA are well defined, the two matrices are of orders m × m and n × n, respectively, meaning that they cannot be equal whenever m ≠ n. In fact, the only situation in which the orders of AB and BA can be equal is when A and B are both square matrices of the same order (i.e., when A and B both have order n × n). However, even in that case, there is no guarantee that AB and BA will be equal. This is a general property of matrix multiplication, which we state below.
Property: Noncommutativity of Matrix Multiplication
If A and B are matrices of orders m × n and n × m, respectively, then generally,

AB ≠ BA.
In other words, matrix multiplication is noncommutative.
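We can check this numerically. Below is a short pure-Python sketch using two 2 × 2 matrices of my own choosing (not taken from the explainer's examples):

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

AB = matmul(A, B)
BA = matmul(B, A)
print(AB)  # [[2, 1], [4, 3]]
print(BA)  # [[3, 4], [1, 2]]
print(AB != BA)  # True: these two matrices do not commute
```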
We note that although it is possible that matrices can commute under certain conditions, this will generally not be the case. To see this, let us consider some examples in order to demonstrate the noncommutativity of matrix multiplication. In the first example, we will determine the product of two square matrices in both directions and compare their results.
Example 1: Calculating the Multiplication of Two Matrices in Both Directions
Given two square matrices A and B of the same order, find AB and BA.
In this example, we want to compute the product of two matrices in both directions.

Since both A and B have the same square order, their products in either direction will also have that order. To find the first entry of the matrix AB, we multiply the entries of the first row of A by the corresponding entries of the first column of B and sum the results. We continue doing this for every entry of AB, with the entry in row i and column j coming from row i of A and column j of B.

It remains to calculate BA, which we can do by swapping the order of the two matrices and repeating the same process.

Comparing the results, BA is not equal to AB, meaning that, in this case, the multiplication does not commute.
Let us consider another example where we check whether changing the order of multiplication of matrices gives the same result.
Example 2: Verifying Whether the Multiplication of Two Matrices Is Commutative
Consider two matrices A and B. Is AB = BA?
In this example, we want to compute the product of two matrices in both directions in order to check whether matrix multiplication is commutative.

We compute the product AB in one direction, and then the product BA in the other direction, using the usual row-by-column rule.

If we examine the corresponding entries of the two resulting matrices, we find at least one pair of entries that differ, meaning the two matrices are not equal. Therefore, AB ≠ BA.
Having seen two examples where the matrix multiplication is not commutative, we might wonder whether there are any matrices that do commute with each other. Let us recall a particular class of matrix for which this may be the case.
Definition: Diagonal Matrix
Suppose that A is a square matrix (i.e., a matrix of order n × n). Then, A is a diagonal matrix if all the entries outside the main diagonal are zero, or, in other words, if a_{ij} = 0 for i ≠ j. That is to say, matrices of this kind take the following form:

\begin{pmatrix} a_{11} & 0 & \cdots & 0 \\ 0 & a_{22} & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & a_{nn} \end{pmatrix}

In the 2 × 2 and 3 × 3 cases (which we will be predominantly considering in this explainer), diagonal matrices take the forms

\begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix}, \quad \begin{pmatrix} a & 0 & 0 \\ 0 & b & 0 \\ 0 & 0 & c \end{pmatrix}.
Now, in the next example, we will show that while matrix multiplication is noncommutative in general, it is, in fact, commutative for diagonal matrices. In particular, we will consider diagonal matrices.
Example 3: Verifying a Statement about Matrix Commutativity
True or False: If A and B are both square matrices of the same order, then AB is never the same as BA.
In this example, we want to determine whether a statement regarding the possibility of commutativity in matrix multiplication is true or false.
In order to prove the statement is false, we only have to find a single example where it does not hold. To do this, let us consider two diagonal matrices A and B (i.e., matrices that have all their off-diagonal entries equal to zero):

A = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}, \quad B = \begin{pmatrix} b_1 & 0 \\ 0 & b_2 \end{pmatrix}.

Computing AB, we find

AB = \begin{pmatrix} a_1 b_1 & 0 \\ 0 & a_2 b_2 \end{pmatrix}.

Next, if we compute BA, we find

BA = \begin{pmatrix} b_1 a_1 & 0 \\ 0 & b_2 a_2 \end{pmatrix} = \begin{pmatrix} a_1 b_1 & 0 \\ 0 & a_2 b_2 \end{pmatrix}.

Thus, since both matrices have the same order and all their entries are equal, we have AB = BA. This proves that the statement is false: AB can be the same as BA.
The phenomenon demonstrated above is not unique to the matrices A and B we used in the example, and we can actually generalize this result to make a statement about all diagonal matrices.
Property: Commutativity of Diagonal Matrices
If A and B are both diagonal matrices with order n × n, then the two matrices commute. In other words, AB = BA.
To prove this for the 2 × 2 case, let us consider two diagonal matrices A and B:

A = \begin{pmatrix} a_1 & 0 \\ 0 & a_2 \end{pmatrix}, \quad B = \begin{pmatrix} b_1 & 0 \\ 0 & b_2 \end{pmatrix}.

Then, their products in both directions are

AB = \begin{pmatrix} a_1 b_1 & 0 \\ 0 & a_2 b_2 \end{pmatrix}, \quad BA = \begin{pmatrix} b_1 a_1 & 0 \\ 0 & b_2 a_2 \end{pmatrix} = \begin{pmatrix} a_1 b_1 & 0 \\ 0 & a_2 b_2 \end{pmatrix}.

Thus, AB = BA for any two diagonal matrices. Note that the product of two diagonal matrices always results in a diagonal matrix where each diagonal entry is the product of the two corresponding diagonal entries from the original matrices. Thus, it is easy to imagine how this can be extended beyond the 2 × 2 case.
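As a sanity check, the commutativity of diagonal matrices can be verified numerically; the diagonal entries below are my own illustrative choices:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Two 2x2 diagonal matrices: only the main-diagonal entries are nonzero.
D1 = [[2, 0],
      [0, 3]]
D2 = [[5, 0],
      [0, 7]]

# Both products equal the diagonal matrix of products of diagonal entries.
print(matmul(D1, D2))  # [[10, 0], [0, 21]]
print(matmul(D2, D1))  # [[10, 0], [0, 21]]
```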
It is important to note that the property only holds when both matrices are diagonal. For example, consider two matrices A and B, where A is a diagonal matrix and B is not a diagonal matrix. Computing the products AB and BA in this instance, we find that, even though the diagonal entries end up being equal, the off-diagonal entries generally are not, so AB ≠ BA.
Although the commutative property may not hold when a diagonal matrix is paired with a nondiagonal matrix, there are, in fact, certain types of diagonal matrices that commute with any other matrix of the same order. Let us consider a special instance of this: the identity matrix.
Definition: Identity Matrix
An identity matrix (also known as a unit matrix) is a diagonal matrix where all of the diagonal entries are 1. In other words, identity matrices take the form

I_n = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix},

where I_n denotes the identity matrix of order n × n (if the size does not need to be specified, I is often used instead).

In the majority of cases that we will be considering, the identity matrices take the forms

I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.
A key property of identity matrices is that they commute with every matrix that is of the same order. However, they also have a more powerful property, which we will demonstrate in the next example.
Example 4: Calculating Matrix Products Involving the Identity Matrix
Given a square matrix A and the identity matrix I of the same order as A, find AI and II.
Recall that the identity matrix is a diagonal matrix where all the diagonal entries are 1. Given that I is of the same order as the square matrix A, it has 1s along its main diagonal and 0s everywhere else.

We have been asked to find AI and II, so let us find these using matrix multiplication. First, when we compute AI, each entry is the sum of the products of the entries of a row of A with the corresponding entries of a column of I; since each column of I contains a single 1, every entry of A is carried through unchanged, and we find AI = A.

Next, applying the same reasoning with I in place of A, we have II = I.

Thus, we have shown that AI = A and II = I.
The last example demonstrated that the product of an arbitrary matrix with the identity matrix resulted in that same matrix and that the product of the identity matrix with itself was also the identity matrix. In fact, had we computed IA, we would have similarly found that

IA = A.

So, AI = IA = A, meaning that not only do the matrices commute, but the product is also equal to A in both cases.
One might notice that this is similar to a property of the number 1 (sometimes called the multiplicative identity) for the real numbers: namely, for any real number a, we have

a · 1 = 1 · a = a.
This is, in fact, a property that works almost exactly the same for identity matrices.
Property: Multiplicative Identity for Matrices
The identity matrix is the multiplicative identity for matrix multiplication. That is, for any matrix A of order m × n, we have

I_m A = A I_n = A,

where I_m and I_n are the m × m and n × n identity matrices, respectively.

We note that the orders of the identity matrices used above are chosen purely so that the matrix multiplication is well defined. In the case that A is a square matrix, m = n, so I_n A = A I_n = A.
Let us prove this property for the 2 × 2 case by considering a general matrix

A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}.

If we calculate the product of this matrix with the identity matrix, we find that

AI = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} a \cdot 1 + b \cdot 0 & a \cdot 0 + b \cdot 1 \\ c \cdot 1 + d \cdot 0 & c \cdot 0 + d \cdot 1 \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} = A.

Reversing the order, we get

IA = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\begin{pmatrix} a & b \\ c & d \end{pmatrix} = \begin{pmatrix} a & b \\ c & d \end{pmatrix} = A.

Thus, it is indeed true that AI = IA = A for any 2 × 2 matrix A, and it is equally possible to show this for higher-order cases.
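The identity property AI = IA = A is easy to confirm in code; the helper names and the sample matrix below are my own:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def identity(n):
    # n x n matrix with 1s on the main diagonal and 0s elsewhere.
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[1, 2],
     [3, 4]]
I = identity(2)

print(matmul(A, I) == A)  # True
print(matmul(I, A) == A)  # True
print(matmul(I, I) == I)  # True
```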
So far, we have discovered that despite commutativity being a property of the multiplication of real numbers, it is not a property that carries over to matrix multiplication. However, even though this particular property does not hold, there do exist other properties of the multiplication of real numbers that we can apply to matrices. Let us consider them now.
Recall that for any real numbers a, b, and c, we have

a(bc) = (ab)c.

This is known as the associative property. The associative property means that in situations where we have to perform multiplication twice, we can choose what order to do it in; we can either find ab, then multiply that by c, or we can find bc and multiply it by a, and both answers will be the same.
To investigate whether this property also applies to matrix multiplication, let us consider an example involving the multiplication of three matrices.
Example 5: Investigating the Associative Property of Matrix Multiplication
Given three matrices A, B, and C for which the products are well defined, is it true that (AB)C = A(BC)?
In this example, we are being tasked with calculating the product of three matrices in two possible orders; either we can calculate AB and then multiply it on the right by C, or we can calculate BC and then multiply it on the left by A.

Let us begin by finding AB. Since the number of columns of A matches the number of rows of B, the product AB exists. Each entry of AB is found by multiplying the entries of a row of A by the corresponding entries of a column of B and summing the results; repeating this process for every entry gives us the matrix AB.

Next, to find (AB)C, we multiply this matrix on the right by C.

Now, we need to find A(BC), which means we must first calculate BC. Then, to find A(BC), we multiply this on the left by A.

In conclusion, we see that the matrices we calculated for (AB)C and A(BC) are equal. Therefore, the associative property holds in this case and the given statement is true.
As we saw in the previous example, matrix associativity appears to hold for three arbitrarily chosen matrices. As a matter of fact, this is a general property that holds for all possible matrices for which the multiplication is valid (although the full proof of this is rather cumbersome and not particularly enlightening, so we will not cover it here).
Property: Associativity of Matrix Multiplication
Let A be a matrix of order m × n, B be a matrix of order n × p, and C be a matrix of order p × q. Then, we have

(AB)C = A(BC).
That is to say, matrix multiplication is associative.
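We can also spot-check associativity with rectangular matrices whose orders chain together as m × n, n × p, and p × q; the particular matrices below are illustrative:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2, 3],   # 2 x 3
     [4, 5, 6]]
B = [[7, 8],      # 3 x 2
     [9, 10],
     [11, 12]]
C = [[1, 0],      # 2 x 2
     [0, 2]]

left = matmul(matmul(A, B), C)   # (AB)C
right = matmul(A, matmul(B, C))  # A(BC)
print(left)           # [[58, 128], [139, 308]]
print(left == right)  # True
```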
While we are in the business of examining properties of matrix multiplication and whether they are equivalent to those of real number multiplication, let us consider yet another useful property.
Recall that for any real numbers a, b, and c, we have

a(b + c) = ab + ac.
This is known as the distributive property, and it provides us with an easy way to expand the parentheses in expressions. As a matter of fact, we have already seen that this property holds for the scalar multiplication of matrices. Recall that the scalar multiplication of matrices can be defined as follows.
Definition: Scalar Multiplication
For a matrix A of order m × n with entries a_{ij}, the scalar multiple of A by a constant k is found by multiplying each entry of A by k, or, in other words,

kA = (k a_{ij}).

As we have seen, the property of distributivity holds for scalar multiplication in the same way as it does for real numbers: namely, given a scalar k and two matrices A and B of the same order, we have

k(A + B) = kA + kB.
As for full matrix multiplication, we can confirm that it is indeed the case that the distributive property still holds, leading to the following result.
Property: Distributivity of Matrix Multiplication
Let A be a matrix of order m × n and B and C be matrices of order n × p. Then, we have

A(B + C) = AB + AC.
In other words, matrix multiplication is distributive with respect to matrix addition.
It is important to be aware of the orders of the matrices given in the above property, since both the addition B + C and the multiplications AB, AC, and A(B + C) need to be well defined.
Note that much like the associative property, a concrete proof of this is more time consuming than it is interesting, since it is just a case of proving it entry by entry using the definitions of matrix multiplication and addition.
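Distributivity can be checked the same way; `madd`, a helper for entrywise matrix addition, and the sample matrices are my own illustrative choices:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def madd(A, B):
    # Entrywise sum of two matrices of the same order.
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = [[0, 1], [1, 0]]

left = matmul(A, madd(B, C))              # A(B + C)
right = madd(matmul(A, B), matmul(A, C))  # AB + AC
print(left)           # [[21, 23], [47, 53]]
print(left == right)  # True
```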
Let us consider an example where we can see the application of the distributive property of matrices.
Example 6: Investigating the Distributive Property of Matrix Multiplication over Addition
Suppose that A, B, and C are three square matrices of the same order and that h and k are two scalars.

- Find AB.
- Find AC.
- Find A(hB + kC).
- Express A(hB + kC) in terms of AB and AC.
To begin with, we have been asked to calculate AB, which we can do using matrix multiplication. A and B are square matrices of the same order, so their product will also be a matrix of that order. Each entry, for instance the bottom-left entry, is computed by multiplying the entries of the corresponding row of A by the corresponding entries of a column of B and summing the results. Repeating this for the remaining entries gives us the matrix AB.

We can calculate AC in much the same way as we did AB; AC will also be a matrix of the same order, since A and C are both square matrices of that order.

For the next part, we have been asked to find A(hB + kC). To calculate this directly, we must first find the scalar multiples of B and C, namely hB and kC. We do this by multiplying each entry of the matrices by the corresponding scalar.

The next step is to add the matrices hB and kC using matrix addition. We do this by adding the entries in the same positions together.

Finally, to find A(hB + kC), we multiply the resulting matrix on the left by A.
For the final part, we must express A(hB + kC) in terms of AB and AC. The easiest way to do this is to use the distributive property of matrix multiplication, together with the fact that scalar multiples can be factored out of a matrix product. That is, for matrices A, B, and C of the appropriate orders, we have

A(B + C) = AB + AC.

In this case, if we substitute hB and kC in place of B and C, we find that

A(hB + kC) = A(hB) + A(kC) = h(AB) + k(AC).

Thus, we have expressed A(hB + kC) in terms of AB and AC. Nevertheless, we may want to verify that our solution is correct and that the laws of distributivity hold. Since we have already calculated AB, AC, and A(hB + kC) in previous parts, it should be fairly easy to do this. h(AB) and k(AC) can be found using scalar multiplication of AB and AC, and we can then add these two matrices together using matrix addition.

Since the result corresponds to the matrix that we calculated for A(hB + kC) in the previous part, we can confirm that our solution is indeed correct: A(hB + kC) = h(AB) + k(AC).
For the final part of this explainer, we will consider how the matrix transpose interacts with matrix multiplication. Let us begin by recalling the definition.
Definition: The Transpose of a Matrix
Suppose that A is a matrix of order m × n. The transpose of matrix A is an operator that flips the matrix over its diagonal. In other words, it switches the row and column indices of the matrix. This operation produces another matrix of order n × m, denoted by A^T.

If a_{ij} are the entries of matrix A, with 1 ≤ i ≤ m and 1 ≤ j ≤ n, then a_{ij} appears in row j and column i of A^T, and it takes the form

A^T = \begin{pmatrix} a_{11} & a_{21} & \cdots & a_{m1} \\ a_{12} & a_{22} & \cdots & a_{m2} \\ \vdots & \vdots & \ddots & \vdots \\ a_{1n} & a_{2n} & \cdots & a_{mn} \end{pmatrix}.
For example, consider the matrix

A = \begin{pmatrix} a & b & c \\ d & e & f \end{pmatrix}.

The transpose of this matrix is the following 3 × 2 matrix:

A^T = \begin{pmatrix} a & d \\ b & e \\ c & f \end{pmatrix}.
As it turns out, matrix multiplication and matrix transposition have an interesting property when combined, which we will consider in the theorem below.
Property: Matrix Multiplication and the Transpose
Suppose that A is a matrix of order m × n and B is a matrix of order n × p, ensuring that the matrix product AB is well defined. The transposes of A and B are matrices A^T and B^T of orders n × m and p × n, respectively, so their product in the opposite direction, B^T A^T, is also well defined.

Matrix multiplication combined with the transpose satisfies the following property:

(AB)^T = B^T A^T.
Once again, we will not include the full proof of this since it just involves using the definitions of multiplication and transposition on an entry-by-entry basis.
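A quick numerical check of (AB)^T = B^T A^T is straightforward; the helper `transpose` and the sample matrices are illustrative:

```python
def matmul(A, B):
    # Entry (i, j) is the dot product of row i of A with column j of B.
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(M):
    # Swap rows and columns: entry (i, j) moves to (j, i).
    return [list(row) for row in zip(*M)]

A = [[1, 2, 3],   # 2 x 3
     [4, 5, 6]]
B = [[7, 8],      # 3 x 2
     [9, 10],
     [11, 12]]

lhs = transpose(matmul(A, B))             # (AB)^T
rhs = matmul(transpose(B), transpose(A))  # B^T A^T
print(lhs)          # [[58, 139], [64, 154]]
print(lhs == rhs)   # True
```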
In the final example, we will demonstrate this transpose property of matrix multiplication for a given product.
Example 7: The Properties of Multiplication and Transpose of a Matrix
Given the matrix product AB, what is B^T A^T?
In this example, we want to determine the product of the transposes of two matrices, given information about their product.

Recall that the transpose of an m × n matrix switches the rows and columns to produce another matrix of order n × m. Matrix multiplication combined with the transpose satisfies the property

(AB)^T = B^T A^T.

Therefore, in order to calculate the product B^T A^T, we simply need to take the transpose of AB by using this property. Hence, we have

B^T A^T = (AB)^T.
Let us finish by recapping the properties of matrix multiplication that we have learned over the course of this explainer.
- Matrix multiplication is in general not commutative; that is, AB ≠ BA.
- If A and B are both diagonal matrices of the same order, then AB = BA.
- An identity matrix I is a diagonal matrix with 1 for every diagonal entry. Identity matrices (up to order 4) take the forms shown below:

I_1 = \begin{pmatrix} 1 \end{pmatrix}, \quad I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \quad I_4 = \begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}.

- If I is an identity matrix and A is a square matrix of the same order, then

AI = IA = A.

- Matrix multiplication is associative; that is, for valid matrices A, B, and C, we have

(AB)C = A(BC).

- Matrix multiplication is distributive over addition, so for valid matrices A, B, and C, we have

A(B + C) = AB + AC.

- For any valid matrix product AB, the matrix transpose satisfies the following property:

(AB)^T = B^T A^T.