Matrix Computation Cheatsheet

A review of the most basic matrix operations

Travis Cooper
4 min read · Jul 18, 2022

Matrices appear throughout pure and applied mathematics, so it is important to understand the basic operations we can perform in a given computation or proof. Let’s review some of the most basic operations performed on matrices.

We will assume the following definitions for A and B throughout.
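For concreteness, take A and B to be matrices with entries a_{ij} and b_{ij}, where i indexes rows and j indexes columns. In the 2×2 case, which is enough for the worked formulas below, they look like

A = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}, \qquad B = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}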

Matrix Addition/Subtraction

The simplest operations are addition and subtraction. We add or subtract matrices in the following way:
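In index notation, assuming A and B have the same dimensions,

(A \pm B)_{ij} = a_{ij} \pm b_{ij}

For the 2×2 matrices above, for example,

A + B = \begin{bmatrix} a_{11}+b_{11} & a_{12}+b_{12} \\ a_{21}+b_{21} & a_{22}+b_{22} \end{bmatrix}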

Addition and subtraction are performed component-wise and produce a matrix of the same size. Matrix addition and subtraction have the following properties:
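For matrices of the same size, with 0 denoting the zero matrix of that size, the standard properties are:

A + B = B + A \quad (commutative)

(A + B) + C = A + (B + C) \quad (associative)

A + 0 = A \quad (additive identity)

A + (-A) = 0 \quad (additive inverse)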

Matrix Multiplication

Matrix multiplication is a bit confusing when you are first introduced to it. We define matrix multiplication in the following way:
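If A is an m×n matrix and B is an n×p matrix, the product AB is the m×p matrix with entries

(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}

In words, the (i, j) entry of AB is the dot product of row i of A with column j of B.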

It is not the component-wise multiplication you may have naturally gravitated towards. It is better to think of matrix multiplication as the composition of two linear transformations: applying one matrix and then the other. If we think of it that way, it is easier to wrap your head around what matrix multiplication is trying to accomplish.

An interesting note about matrix multiplication is that not all matrices can be multiplied together. A good check is to determine if the number of columns in A equals the number of rows in B. If not, the matrices cannot be multiplied. Here are a few examples to illustrate this point:
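As a quick illustration (the sizes here are arbitrary):

A 2×3 matrix times a 3×4 matrix is defined; the result is 2×4.

A 2×3 matrix times a 2×3 matrix is not defined, since 3 ≠ 2.

A 3×1 matrix times a 1×3 matrix is defined; the result is 3×3.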

Matrix multiplication has the following properties:
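Assuming the sizes are compatible and I denotes the identity matrix:

(AB)C = A(BC) \quad (associative)

A(B + C) = AB + AC \quad and \quad (A + B)C = AC + BC \quad (distributive)

AI = IA = A \quad (multiplicative identity)

AB \neq BA \text{ in general} \quad (not commutative)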

Matrix Transpose

The transpose of a matrix is obtained by swapping its rows and columns.
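In index notation, the transpose A^T satisfies

(A^T)_{ij} = a_{ji}

so an m×n matrix becomes an n×m matrix. For example,

\begin{bmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{bmatrix}^T = \begin{bmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{bmatrix}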

The transpose of a matrix has the following properties:
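For compatible sizes and any scalar c:

(A^T)^T = A

(A + B)^T = A^T + B^T

(AB)^T = B^T A^T \quad (note that the order reverses)

(cA)^T = cA^T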

Matrix Inversion

The matrix inverse is defined as follows:
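For a square matrix A, the inverse A^{-1} (when it exists) is the matrix satisfying

A A^{-1} = A^{-1} A = I

where I is the identity matrix of the same size.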

Inverses can be notoriously difficult to compute. In fact, you typically will not compute inverses by hand for anything bigger than a 2×2 matrix. For a 2×2 matrix, specifically, there is a formula to compute inverses:
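Writing the entries of A as a, b, c, d (this lettering is just for the formula), with det(A) = ad - bc:

A = \begin{bmatrix} a & b \\ c & d \end{bmatrix}, \qquad A^{-1} = \frac{1}{ad - bc} \begin{bmatrix} d & -b \\ -c & a \end{bmatrix}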

We swap the diagonal elements and negate the counterdiagonal elements of the original matrix A. Then we multiply the resulting matrix by one over the determinant of A.

To illustrate how inverses can be used throughout linear algebra, let’s assume we are attempting to solve:
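Take the standard setup, where A is a square coefficient matrix, x is an unknown vector, and b is a known vector (these symbols are the conventional choices):

Ax = b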

We can solve for x by multiplying both sides of the equation on the left by the inverse of A, assuming A is invertible.
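Step by step:

A^{-1} A x = A^{-1} b \;\implies\; I x = A^{-1} b \;\implies\; x = A^{-1} b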

Notice that when we multiply the matrix A by its inverse, the product is the identity matrix, so A effectively disappears from the left-hand side. While matrices do not have a notion of division, the matrix inverse can be used to accomplish a very similar operation.

However, not all matrices have an inverse. If the determinant of a matrix is zero, we consider the matrix to be singular, or non-invertible.
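For example, using the 2×2 determinant formula above,

A = \begin{bmatrix} 1 & 2 \\ 2 & 4 \end{bmatrix}, \qquad \det(A) = (1)(4) - (2)(2) = 0

so this matrix is singular and has no inverse.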

Properties of matrix inverses include:
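Assuming A and B are invertible square matrices of the same size:

(A^{-1})^{-1} = A

(AB)^{-1} = B^{-1} A^{-1} \quad (again, the order reverses)

(A^T)^{-1} = (A^{-1})^T

\det(A^{-1}) = \frac{1}{\det(A)}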

Matrix operations and their properties are extremely useful in linear algebra. Whether you are computing a value, solving a system of equations, or proving a theorem, you will need these operations to navigate the world of linear algebra.
