Contents of this Linear algebra episode:

Matrices, Row, Column, Matrix operations, Scalar multiplication, Addition, Multiplication, Commutativity, Associativity.

Matrices are really harmless creatures in mathematics.

An n × k matrix is simply a rectangular array of numbers, arranged in n rows and k columns.

We will use uppercase letters to refer to matrices. Look at this example:

This is a 2 × 3 matrix.
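The on-screen example isn't in the text, but a 2 × 3 matrix can be sketched in NumPy like this (the entries here are illustrative, not from the episode):

```python
import numpy as np

# A 2x3 matrix: 2 rows, 3 columns
A = np.array([[1, 2, 3],
              [4, 5, 6]])

print(A.shape)    # (2, 3): row count first, then column count
print(A[0, 2])    # the element in row 1, column 3 (NumPy indices start at 0)
```

Note that NumPy reports the shape as (rows, columns), matching the "n rows, k columns" convention.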

We refer to the elements of a matrix by lowercase letters with subscripts indicating two indices. Each element has a row index,

and a column index.

So, an n × k matrix that has n rows and k columns,

looks something like this:

Matrices are very useful tools; that's why they take center stage in Linear Algebra.

However, before we can see how useful they are, we need to review some basic matrix operations.

1. SCALAR MULTIPLICATION

A scalar is not a disease; it means a number, in most cases a real number.
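Multiplying a matrix by a scalar simply multiplies every element by that number. A small NumPy sketch (values are illustrative):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])

# Scalar multiplication: every element of A is multiplied by 3
B = 3 * A
print(B)  # [[ 3  6]
          #  [ 9 12]]
```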

2. ADDITION

An n × k matrix can only be added to another n × k matrix; the sum is computed element by element.
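In NumPy, addition of two same-shaped matrices works element by element (again, the entries are made up for illustration):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[10, 20, 30],
              [40, 50, 60]])

# Addition is elementwise and only defined when the shapes match
C = A + B
print(C)  # [[11 22 33]
          #  [44 55 66]]
```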

3. MULTIPLICATION

Well, this is the most exciting operation.

An n × k matrix can only be multiplied by a k × m matrix: the number of columns of the first must equal the number of rows of the second.

The resulting product matrix AB will have as many rows as A had, and as many columns as B had, so it is n × m. Each element is created by multiplying a row from A by a column from B and summing the products.
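This rule can be checked in NumPy with a 2 × 3 matrix times a 3 × 2 matrix (hypothetical entries, chosen only to show the shapes):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])    # 2x3
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])       # 3x2

# Inner dimensions (3 and 3) match, so the product is defined and is 2x2
C = A @ B
print(C.shape)  # (2, 2)
# The top-left element is row 1 of A times column 1 of B:
# 1*1 + 2*0 + 3*1 = 4
print(C[0, 0])  # 4
```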

Here comes a trick; its scientific name is Falk's scheme. The point is that we arrange the matrices in a special way, with their "corners" touching, like this:

We got the product!

One important property of matrix multiplication is

that it is not commutative.

For example, if we try to do this multiplication in the opposite order,

we realize that it isn't even possible.
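The shape mismatch is easy to demonstrate in NumPy: with a 2 × 3 matrix A and a 3 × 3 matrix B, the product AB is defined, but BA is not (the matrices here are illustrative, not the ones from the episode):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3
B = np.eye(3)               # 3x3 identity matrix

print((A @ B).shape)        # (2, 3) -- AB is defined
try:
    B @ A                   # 3x3 times 2x3: inner dimensions 3 and 2 differ
except ValueError as e:
    print("BA is not defined:", e)
```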

Let's see a few special types of matrices.
