Lesson Explainer: Properties of Matrix Multiplication

In this explainer, we will learn how to identify the properties of matrix multiplication, including the transpose of the product of two matrices, and how they compare with the properties of number multiplication.

Suppose that we define the matrix
$$A=\begin{pmatrix}8&-2&10\\-1&3&4\\-3&-3&-1\\1&8&1\end{pmatrix}$$
and decide to "scale" it by the number 5. To complete this operation, we simply take every entry of $A$ and multiply it by 5,
$$5A=\begin{pmatrix}5\times 8&5\times(-2)&5\times 10\\5\times(-1)&5\times 3&5\times 4\\5\times(-3)&5\times(-3)&5\times(-1)\\5\times 1&5\times 8&5\times 1\end{pmatrix},$$
giving
$$5A=\begin{pmatrix}40&-10&50\\-5&15&20\\-15&-15&-5\\5&40&5\end{pmatrix}.$$

Two effects of this operation are easily observed:

  • The scaled matrix has the same order as the original matrix (meaning that it has the same number of rows and columns).
  • The same operation has been applied to every entry (in this case, multiplication by 5).

Scaling a matrix is a straightforward operation to understand, as well as being routinely useful when working in linear algebra. Without further exploration, it would be believable that this is the only type of matrix multiplication we can define. However, there is another form of matrix multiplication that is well defined and whose properties differ from both of the properties of scalar multiplication described above. Furthermore, this alternative type of matrix multiplication has a range of algebraic properties that can be studied in contrast to the properties of multiplication in conventional algebra. Before we begin studying some of these properties, we will first give the definition of matrix multiplication.

Definition: Matrix Multiplication

Suppose that $A$ is a matrix with order $m\times n$ and that $B$ is a matrix with order $n\times p$ such that
$$A=\begin{pmatrix}a_{11}&a_{12}&\cdots&a_{1n}\\a_{21}&a_{22}&\cdots&a_{2n}\\\vdots&\vdots&\ddots&\vdots\\a_{m1}&a_{m2}&\cdots&a_{mn}\end{pmatrix},\qquad B=\begin{pmatrix}b_{11}&b_{12}&\cdots&b_{1p}\\b_{21}&b_{22}&\cdots&b_{2p}\\\vdots&\vdots&\ddots&\vdots\\b_{n1}&b_{n2}&\cdots&b_{np}\end{pmatrix}.$$

Then, the matrix product $C=AB$ is a matrix with order $m\times p$, having the following form:
$$C=AB=\begin{pmatrix}c_{11}&c_{12}&\cdots&c_{1p}\\c_{21}&c_{22}&\cdots&c_{2p}\\\vdots&\vdots&\ddots&\vdots\\c_{m1}&c_{m2}&\cdots&c_{mp}\end{pmatrix}.$$

The entries $c_{ij}$ are calculated by the pairwise multiplication and summation of entries from $A$ and $B$ as shown:
$$c_{ij}=\sum_{k=1}^{n}a_{ik}b_{kj}=a_{i1}b_{1j}+\cdots+a_{in}b_{nj}.$$
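The summation formula above translates directly into code. The following is a minimal Python sketch (the function name `matmul` and the list-of-rows representation are our own choices for illustration, not part of the explainer):

```python
def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B.

    Matrices are represented as lists of rows; entry c_ij is the
    sum over k of a_ik * b_kj, exactly as in the definition above.
    """
    m, n, p = len(A), len(B), len(B[0])
    # The product is only defined when A has n columns and B has n rows.
    assert all(len(row) == n for row in A), "incompatible orders"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(m)]
```

For example, `matmul([[1, 2]], [[3], [4]])` returns `[[11]]`, since $1\times 3+2\times 4=11$.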

A common theme in linear algebra, especially for newer students, is that the definitions often appear more abstract and complicated than the concepts they represent, especially considering that most matrix calculations are fairly straightforward to perform. To demonstrate that the definition above is not as difficult as it might first seem, we give an example.

Suppose that we have the two matrices
$$A=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix},\qquad B=\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix}.$$

We see that $A$ is a matrix with order $2\times 3$ and $B$ is a matrix with order $3\times 2$. The matrix $AB$ must therefore have order $2\times 2$ and hence be of the form
$$AB=\begin{pmatrix}\ast&\ast\\\ast&\ast\end{pmatrix},$$
where the $\ast$ entries are values that are currently unknown.

To calculate the entry in the first row and first column of $AB$, we highlight the first row of $A$ and the first column of $B$:
$$\begin{pmatrix}\ast&\ast\\\ast&\ast\end{pmatrix}=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix}\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix}.$$

Then, we multiply together the highlighted entries in order of their appearance and add the results: $2\times(-6)+3\times 5+10\times 0=3$. Now we can insert the first entry into the matrix:
$$\begin{pmatrix}3&\ast\\\ast&\ast\end{pmatrix}=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix}\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix}.$$

Now we calculate the entry of $AB$ which appears in the first row and the second column, by highlighting the entries in the first row of $A$ and the second column of $B$:
$$\begin{pmatrix}3&35\\\ast&\ast\end{pmatrix}=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix}\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix},$$
where we have performed the calculation $2\times(-4)+3\times 1+10\times 4=35$.

Continuing, we calculate the entry in the second row and first column of $AB$, by highlighting the second row of $A$ and the first column of $B$:
$$\begin{pmatrix}3&35\\-48&\ast\end{pmatrix}=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix}\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix},$$
where we have calculated that $8\times(-6)+0\times 5+9\times 0=-48$.

Finally, we highlight the second row of $A$ and the second column of $B$:
$$\begin{pmatrix}3&35\\-48&4\end{pmatrix}=\begin{pmatrix}2&3&10\\8&0&9\end{pmatrix}\begin{pmatrix}-6&-4\\5&1\\0&4\end{pmatrix},$$
with the final calculation being $8\times(-4)+0\times 1+9\times 4=4$.
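As a quick sanity check, the four entries just computed can be reproduced in a few lines of Python (representing each matrix as a list of rows is a convention of ours, not the explainer's):

```python
A = [[2, 3, 10],
     [8, 0, 9]]
B = [[-6, -4],
     [5, 1],
     [0, 4]]

# Each entry of AB is a "row of A" times "column of B" sum.
AB = [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(2)]
      for i in range(2)]
print(AB)  # [[3, 35], [-48, 4]]
```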

The conditions on matrix multiplication are actually quite restrictive. We know that the matrix product $AB$ will only exist if $A$ has order $m\times n$ and $B$ has order $n\times p$, meaning that the number of columns in $A$ is the same as the number of rows in $B$. If we attempted to reverse the order of the multiplication and calculate $BA$, we would then be attempting to combine a matrix with order $n\times p$ and a matrix with order $m\times n$. This means that $BA$ is only well defined if $m=p$.

For the matrices $A$ and $B$ above, we can see that $AB$ is well defined. Although it would not generally be the case, in this instance we also find that $BA$ is well defined and has order $3\times 3$. We have
$$BA=\begin{pmatrix}-44&-18&-96\\18&15&59\\32&0&36\end{pmatrix}.$$

Even though, rarely, both $AB$ and $BA$ are well defined for the two matrices above, we immediately see that $AB\neq BA$, for the simple reason that $AB$ has order $2\times 2$ and $BA$ has order $3\times 3$. This is a general property of matrix multiplication, which is referred to as the "noncommutative" property.

Theorem: Matrix Multiplication Is Not Commutative

Suppose that $A$ is a matrix with order $m\times n$ and that $B$ is a matrix with order $n\times m$, meaning that $AB$ and $BA$ are both well defined. Then generally $AB\neq BA$, which means that matrix multiplication is not "commutative."

In the case where $m\neq n$, this result is obvious because $AB$ will have order $m\times m$ and $BA$ will have order $n\times n$, meaning that equality is impossible. However, if $m=n$, then $AB$ will be a square matrix with order $n\times n$ and $BA$ will also be a square matrix with order $n\times n$. This situation therefore allows the possibility that $AB$ and $BA$ are equal, since they have the same order. Although there are some categories of matrices which commute under certain conditions, this will generally not be the case.

Example 1: Noncommutativity of Matrix Multiplication

Given that
$$A=\begin{pmatrix}-4&2\\2&-4\end{pmatrix},\qquad B=\begin{pmatrix}-3&-3\\-1&1\end{pmatrix},$$
find $AB$ and $BA$.


We will illustrate the calculation of the matrix $AB$. Since $A$ has order $2\times 2$ and $B$ has order $2\times 2$, the resulting matrix $AB$ will also have order $2\times 2$.

We calculate the entry of $AB$ which appears in the first row and the first column:
$$\begin{pmatrix}10&\ast\\\ast&\ast\end{pmatrix}=\begin{pmatrix}-4&2\\2&-4\end{pmatrix}\begin{pmatrix}-3&-3\\-1&1\end{pmatrix},$$
where we have calculated that $(-4)\times(-3)+2\times(-1)=10$.

We continue this process for the entry of $AB$ which appears in the first row and second column:
$$\begin{pmatrix}10&14\\\ast&\ast\end{pmatrix}=\begin{pmatrix}-4&2\\2&-4\end{pmatrix}\begin{pmatrix}-3&-3\\-1&1\end{pmatrix},$$
after calculating that $(-4)\times(-3)+2\times 1=14$.

Next, we move onto the entry in the second row and first column of $AB$:
$$\begin{pmatrix}10&14\\-2&\ast\end{pmatrix}=\begin{pmatrix}-4&2\\2&-4\end{pmatrix}\begin{pmatrix}-3&-3\\-1&1\end{pmatrix},$$
which was calculated as $2\times(-3)+(-4)\times(-1)=-2$.

Finally, we determine the remaining entry:
$$\begin{pmatrix}10&14\\-2&-10\end{pmatrix}=\begin{pmatrix}-4&2\\2&-4\end{pmatrix}\begin{pmatrix}-3&-3\\-1&1\end{pmatrix},$$
since $2\times(-3)+(-4)\times 1=-10$.

Had we chosen to reverse the order of the matrix multiplication, we would have found that
$$BA=\begin{pmatrix}6&6\\6&-6\end{pmatrix}.$$

Clearly $AB\neq BA$, which gives one example of the fact that matrix multiplication is generally not commutative.
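The noncommutativity in this example is easy to confirm numerically. Below is a minimal Python sketch (the `matmul` helper and list-of-rows representation are our own, not part of the explainer):

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[-4, 2], [2, -4]]
B = [[-3, -3], [-1, 1]]

print(matmul(A, B))  # [[10, 14], [-2, -10]]
print(matmul(B, A))  # [[6, 6], [6, -6]]
```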

Example 2: Noncommutativity of Matrix Multiplication

Consider the $2\times 2$ matrices $A=\begin{pmatrix}1&1\\0&0\end{pmatrix}$ and $B=\begin{pmatrix}0&1\\0&1\end{pmatrix}$. Is $AB=BA$?


Completing both sets of multiplication gives
$$AB=\begin{pmatrix}1&1\\0&0\end{pmatrix}\begin{pmatrix}0&1\\0&1\end{pmatrix}=\begin{pmatrix}0&2\\0&0\end{pmatrix}$$
and
$$BA=\begin{pmatrix}0&1\\0&1\end{pmatrix}\begin{pmatrix}1&1\\0&0\end{pmatrix}=\begin{pmatrix}0&0\\0&0\end{pmatrix}.$$

As we can see in this example, $AB\neq BA$.

The two examples above show that generally $AB\neq BA$ for two matrices. However, this does not necessarily mean that matrix multiplication is always noncommutative. There are, in fact, categories of matrices which commute under matrix multiplication.

Example 3: Possibility of Commutativity in Matrix Multiplication

State whether the following statement is true or false: If $A$ and $B$ are both $2\times 2$ matrices, then $AB$ is never the same as $BA$.


Suppose that $A$ and $B$ are diagonal matrices, which means that every nondiagonal entry must have a value of zero:
$$A=\begin{pmatrix}a_{11}&0\\0&a_{22}\end{pmatrix},\qquad B=\begin{pmatrix}b_{11}&0\\0&b_{22}\end{pmatrix}.$$
We can check that
$$AB=\begin{pmatrix}a_{11}&0\\0&a_{22}\end{pmatrix}\begin{pmatrix}b_{11}&0\\0&b_{22}\end{pmatrix}=\begin{pmatrix}a_{11}b_{11}&0\\0&a_{22}b_{22}\end{pmatrix}$$
and that
$$BA=\begin{pmatrix}b_{11}&0\\0&b_{22}\end{pmatrix}\begin{pmatrix}a_{11}&0\\0&a_{22}\end{pmatrix}=\begin{pmatrix}a_{11}b_{11}&0\\0&a_{22}b_{22}\end{pmatrix}.$$

In this instance, we see that $AB=BA$, meaning that the two matrices above commute. The statement in the question is therefore false.

The phenomenon above is not unique to the two given matrices $A$ and $B$, and we can actually generalize this result to make a statement about all diagonal matrices.

Theorem: Diagonal Matrices Commute

If $A$ and $B$ are both diagonal matrices with order $n\times n$, then the two matrices commute. In other words, $AB=BA$.

We could illustrate this theorem by picking any two diagonal matrices with the same order, which we did for $2\times 2$ matrices in the previous example. It is important to note that the theorem above requires $A$ and $B$ to both be diagonal matrices. For example, consider the two matrices
$$A=\begin{pmatrix}7&0\\0&-3\end{pmatrix},\qquad B=\begin{pmatrix}-10&5\\8&3\end{pmatrix},$$
where $A$ is a diagonal matrix and $B$ is not. In this instance, we find that
$$AB=\begin{pmatrix}-70&35\\-24&-9\end{pmatrix},\qquad BA=\begin{pmatrix}-70&-15\\56&-9\end{pmatrix},$$
so clearly $AB\neq BA$.
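Both halves of this discussion can be probed numerically. The Python sketch below (the helper function and the values of the two diagonal matrices are our own illustration) checks that two diagonal matrices commute while a diagonal/nondiagonal pair from the text does not:

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

D1 = [[2, 0], [0, 5]]    # diagonal (illustrative values of our own)
D2 = [[-7, 0], [0, 3]]   # diagonal
assert matmul(D1, D2) == matmul(D2, D1)  # diagonal matrices commute

A = [[7, 0], [0, -3]]    # diagonal
B = [[-10, 5], [8, 3]]   # not diagonal
print(matmul(A, B))  # [[-70, 35], [-24, -9]]
print(matmul(B, A))  # [[-70, -15], [56, -9]]
```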

In addition to diagonal matrices, there are other categories of matrices which will always commute with each other, such as pairs of matrices that are simultaneously diagonalizable (this is a topic which should be explored separately!). There are also some very special matrices that commute with every other matrix of a compatible order.

Definition: The Identity Matrix

An "identity matrix," also known as a "unit matrix," is a diagonal matrix where all of the diagonal entries have a value of 1. The identity matrix of order $n\times n$ is normally denoted as $I_n$ or $1_n$.

For example, all of the following matrices are identity matrices:
$$I_2=\begin{pmatrix}1&0\\0&1\end{pmatrix},\qquad I_3=\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix},\qquad I_4=\begin{pmatrix}1&0&0&0\\0&1&0&0\\0&0&1&0\\0&0&0&1\end{pmatrix}.$$

A key property of the identity matrix $I_n$ is that it commutes with every matrix $A$ which is of order $n\times n$. For example, consider the identity matrix $I_3$ and the matrix
$$A=\begin{pmatrix}9&1&-4\\-5&6&-10\\-10&-4&-4\end{pmatrix}.$$

Then, we could calculate that
$$AI_3=\begin{pmatrix}9&1&-4\\-5&6&-10\\-10&-4&-4\end{pmatrix}\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}=\begin{pmatrix}9&1&-4\\-5&6&-10\\-10&-4&-4\end{pmatrix}=A.$$

Similarly, we could calculate that
$$I_3A=\begin{pmatrix}1&0&0\\0&1&0\\0&0&1\end{pmatrix}\begin{pmatrix}9&1&-4\\-5&6&-10\\-10&-4&-4\end{pmatrix}=\begin{pmatrix}9&1&-4\\-5&6&-10\\-10&-4&-4\end{pmatrix}=A.$$

We see that $AI_3=I_3A=A$, meaning that these two matrices commute. Interestingly, this property is actually the result of a much stronger condition that is true of the identity matrices.
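A short Python check illustrates this for the matrix above (the `matmul` and `identity` helpers are a sketch of ours, not part of the explainer):

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def identity(n):
    # n x n matrix with 1 on the diagonal and 0 elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

A = [[9, 1, -4], [-5, 6, -10], [-10, -4, -4]]
I3 = identity(3)
assert matmul(A, I3) == A
assert matmul(I3, A) == A
```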

Theorem: Effect of the Identity Matrix

Consider the identity matrix $I_n$ and a matrix $A$ of order $n\times n$. It is always the case that $AI_n=I_nA=A$.

A property of the identity matrices is that they leave any matrix unchanged after multiplication, meaning that they must commute with all matrices that have a compatible order. The identity matrix is also used to define another type of matrix that is commutative. We have already seen that two diagonal matrices commute if they are of the same order and that any square matrix commutes with the identity matrix of the same order. Additionally, for a general square matrix $A$, it may be the case that there is another unique matrix with which $A$ commutes.

Definition: Multiplicative Inverse of a Matrix

For a square matrix $A$ with order $n\times n$, the matrix "inverse" $A^{-1}$ is the matrix such that $AA^{-1}=A^{-1}A=I_n$.

The inverse $A^{-1}$ of a matrix $A$ is the matrix that returns the identity matrix $I_n$ when combined with $A$ under matrix multiplication. The idea of a matrix inverse is so central to linear algebra that some of the best mathematicians have created their own methods for calculating these matrices, including Newton, Gauss, Cayley, and Hamilton. Although we will not demonstrate any methods for calculating the matrix inverse in this explainer, we should note that the matrix inverse does not exist for all matrices. To understand which matrices have an inverse, we need to study the "determinant" of a square matrix, which is another topic entirely.

Another feature of the matrix inverse is that it is unique if it exists. Therefore, a matrix $A$ has only one inverse $A^{-1}$ such that $AA^{-1}=A^{-1}A=I_n$.

As we have already seen, the matrix $A$ commutes with its inverse $A^{-1}$ under matrix multiplication, whereas two general matrices $A$ and $B$ would not have this property unless they are both diagonal (or simultaneously diagonalizable, which is beyond the scope of this explainer).

Example 4: Inverse Matrices

Are $\begin{pmatrix}1&2\\3&7\end{pmatrix}$ and $\begin{pmatrix}7&-2\\-3&1\end{pmatrix}$ multiplicative inverses of each other?


If the two matrices are multiplicative inverses of each other, then multiplying them in either order should produce the $2\times 2$ identity matrix $I_2$. We can check that
$$\begin{pmatrix}1&2\\3&7\end{pmatrix}\begin{pmatrix}7&-2\\-3&1\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}$$
and also that
$$\begin{pmatrix}7&-2\\-3&1\end{pmatrix}\begin{pmatrix}1&2\\3&7\end{pmatrix}=\begin{pmatrix}1&0\\0&1\end{pmatrix}.$$

In both calculations, the output is $I_2$, and therefore the two given matrices are multiplicative inverses of each other.
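The same check can be scripted in a few lines of Python (the `matmul` helper and list-of-rows representation are our own sketch):

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

M = [[1, 2], [3, 7]]
N = [[7, -2], [-3, 1]]
I2 = [[1, 0], [0, 1]]

# Multiplicative inverses must give the identity in both orders.
assert matmul(M, N) == I2
assert matmul(N, M) == I2
```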

The algebraic properties of the identity matrix and matrix inverses are especially useful when looking to solve problems in linear algebra, often appearing in the solution methods for systems of linear equations. Even though matrices represent arrays of numbers of arbitrary dimensions, their operations can be treated with a deceptive algebraic simplicity.

Example 5: Properties Of Matrix Multiplication with Invertible Matrices

Suppose $AB=AC$ and $A$ is an invertible $n\times n$ matrix. Does it follow that $B=C$?


Since $A$ is an $n\times n$ matrix and is also invertible, there must exist a matrix $A^{-1}$ such that
$$AA^{-1}=A^{-1}A=I_n,$$
where $I_n$ is the $n\times n$ identity matrix. We now take the original equation $AB=AC$ and multiply both sides on the left by the matrix $A^{-1}$, giving
$$A^{-1}AB=A^{-1}AC.$$
We know that $A^{-1}A=I_n$, giving
$$I_nB=I_nC.$$

We also know that there is no change to the matrices $B$ and $C$ when combining with the identity matrix under matrix multiplication, meaning that $I_nB=B$ and that $I_nC=C$. In other words, we have indeed shown that $B=C$.
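The cancellation step can be illustrated concretely by reusing the inverse pair from Example 4. In the sketch below, the `matmul` helper and the choice of $B$ are our own illustration; left-multiplying $AB$ by $A^{-1}$ recovers $B$, exactly as in the argument above:

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A     = [[1, 2], [3, 7]]    # invertible (see Example 4)
A_inv = [[7, -2], [-3, 1]]  # its inverse
B     = [[5, 0], [1, -2]]   # an arbitrary illustrative matrix

AB = matmul(A, B)
# Multiplying on the left by A^-1 undoes multiplication by A.
assert matmul(A_inv, AB) == B
```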

Although we have explored some of the fundamental properties of matrix multiplication, there is another important result that we will cover in this explainer, which can simplify many calculations in linear algebra. We will cover this first by way of example and then we will state a general theorem.

Example 6: Distributive Property of Matrix Multiplication

Given that
$$A=\begin{pmatrix}1&2\\1&-3\end{pmatrix},\qquad B=\begin{pmatrix}-6&4\\6&4\end{pmatrix},\qquad C=\begin{pmatrix}0&4\\6&-5\end{pmatrix},$$
is it true that $A(B+C)=AB+AC$?


We first calculate
$$A(B+C)=\begin{pmatrix}1&2\\1&-3\end{pmatrix}\left(\begin{pmatrix}-6&4\\6&4\end{pmatrix}+\begin{pmatrix}0&4\\6&-5\end{pmatrix}\right)=\begin{pmatrix}1&2\\1&-3\end{pmatrix}\begin{pmatrix}-6&8\\12&-1\end{pmatrix}=\begin{pmatrix}18&6\\-42&11\end{pmatrix}.$$

Next, we calculate
$$AB+AC=\begin{pmatrix}1&2\\1&-3\end{pmatrix}\begin{pmatrix}-6&4\\6&4\end{pmatrix}+\begin{pmatrix}1&2\\1&-3\end{pmatrix}\begin{pmatrix}0&4\\6&-5\end{pmatrix}=\begin{pmatrix}6&12\\-24&-8\end{pmatrix}+\begin{pmatrix}12&-6\\-18&19\end{pmatrix}=\begin{pmatrix}18&6\\-42&11\end{pmatrix}.$$

As we might have expected, we do find that $A(B+C)=AB+AC$.
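The distributive check in this example takes only a few lines of Python (the `matmul` and `matadd` helpers are our own sketch, not part of the explainer):

```python
def matmul(A, B):
    # c_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matadd(A, B):
    # entrywise addition of two same-order matrices
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [1, -3]]
B = [[-6, 4], [6, 4]]
C = [[0, 4], [6, -5]]

left = matmul(A, matadd(B, C))              # A(B + C)
right = matadd(matmul(A, B), matmul(A, C))  # AB + AC
print(left)  # [[18, 6], [-42, 11]]
assert left == right
```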

The phenomenon that we have just witnessed is known as the โ€œdistributiveโ€ property of matrix multiplication with respect to matrix addition. This property is summarized by the following theorem.

Theorem: Matrix Multiplication Is Distributive

Suppose that $A$ is a matrix with order $m\times n$ and $B$ and $C$ are matrices with order $n\times p$. Then, matrix multiplication is distributive with respect to matrix addition; that is, $A(B+C)=AB+AC$.

In the statement of this theorem, it was necessary to specify the orders of the three matrices, to ensure that all operations were possible. If $B$ and $C$ had been of different orders, then it would have been impossible to define the matrix addition $B+C$. Also, if $A$ did not have the same number of columns as the number of rows in $B$ and $C$, then we could not have calculated $AB$ or $AC$. When working with matrices, it is very common for the orders to be specified in any definition or theorem because not doing so might mean that the involved operations cannot be completed.

Example 7: Distributive Property of Matrix Multiplication

State whether the following statement is true or false: If $A$ is a $2\times 3$ matrix and $B$ and $C$ are $3\times 2$ matrices, then $A(B+C)=AB+AC$.


The matrices $B$ and $C$ have the same order of $3\times 2$, which means that $B+C$ can be calculated, with the resulting matrix also having the order $3\times 2$. The multiplication $A(B+C)$ is therefore between a matrix with order $2\times 3$ and a matrix with order $3\times 2$, meaning that this is well defined. The resultant matrix has order $2\times 2$.

For the same reasons, the multiplication $AB$ produces a matrix with order $2\times 2$, as does the multiplication $AC$. Every matrix operation is therefore well defined, with the result being a matrix of order $2\times 2$. Since matrix multiplication is distributive with respect to matrix addition, it is indeed the case that $A(B+C)=AB+AC$, and the statement is true.

There are many more properties of matrix multiplication that we have not explored in this explainer, especially in regard to transposition and scalar multiplication. For square matrices in particular, matrix multiplication is a key area of consideration when studying crucial concepts such as the determinant. One of the most curious aspects of matrix multiplication is that the definition appears quite complicated, at least to those who are new to linear algebra. In contrast to the definition, the algebraic properties of matrix multiplication are fairly straightforward, being largely similar to multiplication in conventional algebra except for the key difference of commutativity. There are also many types of matrices which have special algebraic properties in regard to multiplication, and we have already covered diagonal matrices, the identity matrix, and inverse matrices. In particular, symmetric matrices and simultaneously diagonalizable matrices have enhanced algebraic properties which make them interesting to study.

Key Points

  • Matrix multiplication is generally not commutative; that is, $AB\neq BA$.
  • If $A$ and $B$ are both diagonal matrices with the same order, then $AB=BA$.
  • The identity matrix $I_n$ commutes with all matrices of the same order, leaving the original matrix unchanged; that is, $AI_n=I_nA=A$.
  • For a square matrix $A$ of order $n\times n$, there may exist a unique inverse matrix $A^{-1}$ such that $AA^{-1}=A^{-1}A=I_n$.
  • Matrix multiplication is distributive with respect to matrix addition; that is, $A(B+C)=AB+AC$.
