
Matrix operations

Matrices are really harmless creatures in mathematics.

An n × k matrix is simply a rectangular array of numbers, arranged in n rows and k columns.

We will use uppercase letters to refer to matrices. Look at this example:

This is a 2 × 3 matrix.
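The example matrix itself isn't reproduced here, but as an illustration (with made-up entries), a 2 × 3 matrix in NumPy might look like this:

```python
import numpy as np

# A hypothetical 2 x 3 matrix: 2 rows, 3 columns (entries made up for illustration)
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)   # (2, 3): 2 rows, 3 columns
print(A[0, 2])   # the element in row 1, column 3 (NumPy indexing starts at 0)
```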

We refer to the elements of a matrix by lowercase letters with subscripts indicating two indices. Each element has a row index,

and a column index.


So, an n × k matrix that has n rows and k columns,

looks something like this:

Matrices are very useful tools; that's why they take center stage in Linear Algebra.

However, before we can see how useful they are, we need to review some basic matrix operations.


A scalar is not a disease; it means a number, in most cases a real number.


An n × k matrix can only be added to another n × k matrix. The sum is formed element by element.
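A quick sketch of addition (and multiplication by a scalar) in NumPy, with made-up entries:

```python
import numpy as np

# Two matrices can be added only if their shapes match (entries made up)
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[10, 20],
              [30, 40]])

print(A + B)   # element-wise sum: [[11, 22], [33, 44]]
print(3 * A)   # scalar multiplication: [[3, 6], [9, 12]]
```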


Well, this is the most exciting operation.

An n × k matrix can only be multiplied by a k × m matrix.

The resulting product matrix will have as many rows as A had, and as many columns as B had. Each element is created by multiplying a row of A by a column of B, entry by entry, and summing the products.
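This rule can be sketched in NumPy (entries made up; the @ operator performs matrix multiplication):

```python
import numpy as np

# A 2 x 3 matrix times a 3 x 2 matrix gives a 2 x 2 matrix (entries made up)
A = np.array([[1, 2, 3],
              [4, 5, 6]])      # 2 x 3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])         # 3 x 2

C = A @ B                      # 2 x 2 product
# Each element is a row of A multiplied by a column of B, e.g.
# c_11 = 1*1 + 2*0 + 3*1 = 4
print(C)  # [[4, 5], [10, 11]]
```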

Here comes a trick; its scientific name is Falk's scheme. The point is that we arrange the matrices in a special way, with their "corners" touching, like this:

We got the product!

One important property of matrix multiplication is

that it is not commutative.

For example, if we try to do this multiplication in the opposite order,

we realize that it isn't even possible.
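Even when both orders of multiplication are defined, the two products usually differ. A small illustration with made-up entries:

```python
import numpy as np

# Matrix multiplication is not commutative: AB and BA usually differ
A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(A @ B)  # [[2, 1], [4, 3]] -- the columns of A swapped
print(B @ A)  # [[3, 4], [1, 2]] -- the rows of A swapped
```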

Let's see a few special types of matrices.

Special types of matrices


A square matrix is, as the name suggests, square-shaped: it has the same number of rows and columns.



The diagonal matrix is a square matrix where all elements outside the main diagonal are zero.


Therefore, in diagonal matrices, only the main diagonal matters, as all the other elements are zero.

That's why some people only indicate the main diagonal elements: the notation diag(a₁, a₂, …, aₙ) denotes the diagonal matrix with these elements on its main diagonal.
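NumPy's diag function mirrors this shorthand, building the full matrix from the diagonal elements alone (elements made up):

```python
import numpy as np

# Build a diagonal matrix from its main-diagonal elements
D = np.diag([2, 5, 7])
print(D)
# [[2 0 0]
#  [0 5 0]
#  [0 0 7]]
```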


The identity matrix (or unit matrix), denoted by I, is a matrix where for any matrix A, A · I = A and I · A = A.

The identity matrix is a diagonal matrix where all elements on the main diagonal are equal to one.
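A quick check of this property in NumPy (with a made-up matrix):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
I = np.eye(2, dtype=int)   # 2 x 2 identity matrix

# Multiplying by I leaves A unchanged, from either side
print(np.array_equal(A @ I, A))  # True
print(np.array_equal(I @ A, A))  # True
```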


The inverse matrix is denoted by A⁻¹, and this is a matrix that does this:

A · A⁻¹ = I  (right inverse)         A⁻¹ · A = I  (left inverse)

Later we will see that it isn't that easy to figure out the inverse of a matrix.

This inverse thing is a lot easier with real numbers, where the inverse of a number a is 1/a, because a · (1/a) = 1.
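A sketch in NumPy (made-up matrix; np.linalg.inv computes the inverse numerically):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = np.linalg.inv(A)

# A times its inverse gives the identity (up to rounding), from both sides
print(np.allclose(A @ A_inv, np.eye(2)))  # True
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```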


The transpose matrix is created by swapping the rows and the columns of the matrix. It is indicated by Aᵀ or A'.
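For example (entries made up), transposing a 2 × 3 matrix gives a 3 × 2 matrix:

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2 x 3

print(A.T)        # rows and columns swapped
print(A.T.shape)  # (3, 2)
```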




A square matrix whose transpose is equal to itself is called a symmetric matrix.

Here is an example of a symmetric matrix:
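The original example isn't reproduced here, so here is a made-up symmetric matrix for illustration:

```python
import numpy as np

# A symmetric matrix equals its own transpose (entries made up)
S = np.array([[1, 7, 3],
              [7, 4, 5],
              [3, 5, 6]])

print(np.array_equal(S, S.T))  # True
```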

None of this sounds too exciting right now, but soon will come the time when we will need them.

Now, let's take on vectors!

A few operations on matrices and vectors

Now we have a few matrices and vectors, and we need to do a few operations on them.

Well, let's do them one by one.

There is a little problem here: squaring a matrix element by element doesn't work.

Unfortunately there is no trick for exponentiation of matrices, so if we need the square of this matrix, we have to raise it to the second power by multiplying the matrix by itself.

If we needed to raise this matrix to the fourth power, that would take a long time.

But we are lucky, as we only need its square.
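A sketch of this in NumPy (made-up entries): the square of a matrix is the matrix multiplied by itself, not its elements squared one by one.

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# Squaring a matrix means multiplying it by itself
print(A @ A)                          # [[7, 10], [15, 22]]
print(np.linalg.matrix_power(A, 2))   # same result
print(A ** 2)                         # [[1, 4], [9, 16]] -- element-wise, NOT the matrix square
```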

We only have one matrix left. We just hit the jackpot with this one, as it is a diagonal matrix.

Diagonal matrices are easy to raise to powers, because all we have to do is take the elements of the main diagonal one by one, and raise them to the required power.

This method only works for diagonal matrices, but it does wonders there.
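A quick illustration with made-up diagonal elements:

```python
import numpy as np

D = np.diag([2, 3, 5])

# For a diagonal matrix, the k-th power just raises each diagonal element to the k-th power
print(np.linalg.matrix_power(D, 4))
print(np.diag([2**4, 3**4, 5**4]))  # same matrix, computed the easy way
```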

If we multiplied it by itself the required number of times in a sequence, we would get the same result,

except slower; if you want to verify this, try it yourself and see.