Linear algebra | mathXplain
 
5 topics, 38 short and super clear episodes

This Linear algebra course includes 38 short and super clear episodes that take you through 5 topics and help you navigate the bumpy roads of Linear algebra. The casual style makes you feel like you are discussing something as simple as cooking scrambled eggs.

Table of contents: 

The course consists of 5 sections: Matrices and vectors; Dependent and independent vectors; Systems of linear equations; Determinants, eigenvectors and eigenvalues; Linear transformations.

MATRICES

  • Matrices - Matrices are really harmless creatures in mathematics. An nxk matrix is simply a rectangular array of numbers, arranged in n rows and k columns.
  • Matrix operations - Scalar multiplication, addition and multiplication.
  • Square and diagonal matrices - A square matrix has the same number of rows and columns. A diagonal matrix is a square matrix where all elements outside the main diagonal are zero.
  • Transpose - The transpose is created by swapping the rows and the columns of the matrix (see the sketch after this list).
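
To make these ideas concrete, here is a minimal NumPy sketch (the use of Python and NumPy is our illustration, not part of the course material; the matrices are made-up examples):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])        # a 2x3 matrix: 2 rows, 3 columns
B = np.array([[6, 5, 4],
              [3, 2, 1]])

print(2 * A)                     # scalar multiplication
print(A + B)                     # addition: works because the shapes match
print(A @ B.T)                   # matrix multiplication: (2x3) @ (3x2) gives 2x2
print(A.T)                       # transpose: rows and columns swapped, now 3x2
print(np.diag([1, 2, 3]))        # a 3x3 diagonal matrix
```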

VECTORS

  • Dot product - The dot product, or sometimes inner product, is an algebraic operation that takes two vectors and turns them into a single number.
  • Cross product - The cross product is an algebraic operation that takes two vectors and turns them into another vector.
  • Dyadic product - The dyadic product is an algebraic operation that takes two vectors and turns them into a matrix.
  • Angle between two vectors - To calculate the angle between two vectors, we write their dot product using two formulas (see the sketch after this list).
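
The three products and the angle formula can be tried out in a few lines; the two vectors below are made-up examples:

```python
import numpy as np

u = np.array([1.0, 2.0, 2.0])
v = np.array([3.0, 0.0, 4.0])

print(np.dot(u, v))              # dot product: a single number (11.0)
print(np.cross(u, v))            # cross product: another 3D vector
print(np.outer(u, v))            # dyadic (outer) product: a 3x3 matrix

# Angle between the two vectors, from cos(angle) = (u . v) / (|u| |v|):
cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
print(np.degrees(np.arccos(cos_angle)))   # about 42.8 degrees
```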

VECTOR SPACES

  • The axioms - The most prominent of these structures is the vector space, which is nothing more than a set of vectors fulfilling some special properties called the axioms.
  • Coordinates - Real vector spaces are usually denoted by R^n, where n stands for the number of coordinates the vectors have.
  • Linearly dependent vectors - At least one of them can be written as a linear combination of the other vectors.
  • Linearly independent vectors - None of them can be written as a linear combination of the other vectors (a quick rank check follows this list).
  • Spanning set - A set of vectors from which every vector of the vector space can be reached as a linear combination.
  • Basis - An independent spanning set is called a basis.
  • Subspaces - W is called a linear subspace of the vector space V if W is a subset of V and W itself is also a vector space with the same operations as V.
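
A quick way to check dependence and independence numerically is to stack the vectors into a matrix and compare its rank to the number of vectors (the NumPy route is our illustration; the course does this by hand):

```python
import numpy as np

# Three vectors of R^3 stacked as rows; the third is the sum of the first two.
vectors = np.array([[1, 0, 2],
                    [0, 1, 1],
                    [1, 1, 3]])
print(np.linalg.matrix_rank(vectors))   # 2 < 3, so the vectors are dependent

# Three independent vectors in R^3 form a basis of R^3:
basis = np.array([[1, 0, 0],
                  [1, 1, 0],
                  [1, 1, 1]])
print(np.linalg.matrix_rank(basis))     # 3, so this is an independent spanning set
```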

LINEAR SYSTEMS OF EQUATIONS

  • Coefficient matrix - The coefficient matrix of the linear system is a matrix consisting of the coefficients of the x variables.
  • Augmented matrix - The coefficient matrix extended with the constants on the right-hand side.
  • Echelon form - We turn the linear system into a much friendlier system where, in the end, each row contains only one unknown.
  • Gaussian elimination - This is an algorithm for solving systems of linear equations. It is usually understood as a sequence of operations performed on the coefficient matrix (a small sketch follows this list).
  • Pivoting - Comparing the tables of Gaussian elimination and pivoting shows that the steps are the same, but pivoting is more economical.
  • Degree of freedom - The remaining variables are the so-called free variables; the number of free variables is the degree of freedom.
  • Rank - The rank is the number of x variables that can be brought down in the course of pivoting.
  • General solution - The solution expressed in terms of the free variables.
  • Inverse of matrix nxn - We compute the inverse of nxn matrices.
  • Inverse of matrix nxk - We try finding the inverse of matrices whose size is not nxn.
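
Here is a small sketch of the elimination itself on a 2x2 system, followed by the library shortcuts for the quantities this section covers (the example system is made up, and the code assumes nonzero pivots, so no row swaps are needed):

```python
import numpy as np

# Augmented matrix of the system  x + 2y = 5,  3x + 4y = 11
M = np.array([[1.0, 2.0,  5.0],
              [3.0, 4.0, 11.0]])
n = M.shape[0]

# Forward elimination: reduce to echelon form.
for i in range(n):
    M[i] = M[i] / M[i, i]                 # scale the pivot to 1 (assumed nonzero)
    for j in range(i + 1, n):
        M[j] = M[j] - M[j, i] * M[i]      # eliminate unknown i from the rows below

# Back substitution on the echelon form.
x = np.zeros(n)
for i in range(n - 1, -1, -1):
    x[i] = M[i, -1] - M[i, i + 1:n] @ x[i + 1:n]
print(x)                                  # [1. 2.], i.e. x = 1, y = 2

# The same answer via the library, plus the rank and the inverse:
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])
print(np.linalg.solve(A, b))              # [1. 2.]
print(np.linalg.matrix_rank(A))           # 2: no free variables, degree of freedom 0
print(np.linalg.inv(A))                   # exists because A is square and det(A) != 0
```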

THE DETERMINANT

  • Definition of determinant - The determinant takes a matrix, and turns it into a single number. Let’s see how that happens.
  • Sarrus' Rule - There is a rule for calculating the determinant of a 3x3 matrix. It is known as Sarrus' rule.
  • The expansion rule - The essence of the expansion rule is that for an nxn matrix of any size, the rather nasty calculation of its determinant can be reduced to calculating the determinants of 2x2 matrices, which is fairly easy.
  • Singular and invertible matrices - The nxn matrices can be classified into two large groups: those whose determinant is zero, and those whose determinant is non-zero.
  • Eigenvector - An eigenvector of an nxn matrix A is a non-zero vector v for which there is a real number λ such that A·v = λ·v.
  • Eigenvalue - An eigenvalue of an nxn matrix A is a number λ for which there is a non-zero vector v such that A·v = λ·v.
  • Characteristic equation - The solutions to the characteristic equation will be the eigenvalues.
  • The diagonal form - If an nxn matrix has n independent eigenvectors, then the matrix has a diagonal form, where the main diagonal contains the eigenvalues, and all other elements are zero.
  • Definiteness of matrices - To figure out definiteness, we need the leading principal minors, or more precisely, their signs (see the sketch after this list).
  • Quadratic forms - A quadratic form is a homogeneous quadratic (second-degree) polynomial. This means each x variable is either raised to the second power, or raised to the first power and multiplied by another x variable, which also counts as quadratic.
  • Definiteness of quadratic forms - The matrix of the quadratic form helps us determine its definiteness.
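
The determinant, the eigen-pairs, the diagonal form, and the leading-principal-minor test can all be checked on a small symmetric example (the matrix is our choice, picked so the numbers come out clean):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

print(np.linalg.det(A))                   # 3.0: nonzero, so A is invertible

# Eigenvalues and eigenvectors: A @ v = lam * v
eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)                        # 3 and 1 (the order may vary)
v = eigenvectors[:, 0]
print(A @ v, eigenvalues[0] * v)          # the two sides agree

# Diagonal form: P^(-1) A P has the eigenvalues on the main diagonal.
P = eigenvectors
print(np.linalg.inv(P) @ A @ P)           # approximately diag(3, 1)

# Definiteness from the signs of the leading principal minors:
minors = [np.linalg.det(A[:k, :k]) for k in (1, 2)]
print(minors)                             # [2.0, 3.0]: all positive, so positive definite
```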

LINEAR TRANSFORMATIONS

  • Linear transformation - A linear transformation is a mapping from V1 to V2 between two vector spaces that preserves the operations of addition and scalar multiplication.
  • Image - The transformation takes vectors of V1 and assigns vectors of V2 to them. The part of V2 that is actually reached is called the image.
  • Kernel - There are vectors from which the transformation creates a zero vector. The part of V1 containing these vectors is called the kernel.
  • Dimension formula - The sum of the dimensions of the image and the kernel gives the dimension of V1.
  • Matrix of transformation - Each linear transformation can be represented by a matrix. As a matter of fact, each can be represented by an infinite number of matrices. These matrices are created by taking an arbitrary basis in V1, and then writing the images of the basis vectors next to each other.
  • Inverse transformation - The reverse of the transformation is also called the inverse of the transformation.
  • Eigenbasis - If a transformation has an eigenbasis, that means the matrix of the transformation has n independent eigenvectors, so the matrix is diagonalizable.
  • Homomorphism - A linear transformation from V1 to V2 is also called a homomorphism.
  • Isomorphism - An isomorphism is a one-to-one correspondence between the vectors of two vector spaces that preserves the operations.
  • Similar matrices - If A and B are matrices such that there exists an invertible matrix C for which A = C⁻¹BC, we say that the two matrices are similar.
  • Rotation about the origin - Let’s see the matrix in the usual basis.
  • Reflection about the x axis - Let’s see the matrix in the usual basis.
  • Projection onto the x axis - Let’s see the matrix in the usual basis (all three matrices appear in the sketch after this list).
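
For reference, here are the three matrices in the usual (standard) basis, applied to a sample vector, together with a rank check that illustrates the dimension formula (the 90-degree angle and the test vector are our choices):

```python
import numpy as np

theta = np.pi / 2                                        # rotation by 90 degrees
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])   # rotation about the origin
reflection = np.array([[1.0,  0.0],
                       [0.0, -1.0]])                     # reflection about the x axis
projection = np.array([[1.0, 0.0],
                       [0.0, 0.0]])                      # projection onto the x axis

v = np.array([1.0, 1.0])
print(rotation @ v)                              # [-1.  1.] up to rounding
print(reflection @ v)                            # [ 1. -1.]
print(projection @ v)                            # [1. 0.]

# Dimension formula for the projection: dim(image) + dim(kernel) = dim(V1).
# Its image is the x axis (dimension 1) and its kernel is the y axis (dimension 1).
print(np.linalg.matrix_rank(projection))         # 1, and 1 + 1 = 2 = dim(R^2)
```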