Robotics and Learning
The central theme of Linear Algebra is solving systems of linear equations. The canonical representation of such a system is $Ax = b$. There are two ways of interpreting this system: the row picture, where each equation is a line or plane and the solution is their intersection, and the column picture, where $b$ is a combination of the columns of $A$.
Again assume the system of linear equations given by $Ax=b$, where $A$ is the coefficient matrix, $x$ is the vector to be solved for and $b$ is the right-hand side vector. Just as we use elimination when solving systems of linear equations by hand, matrix elimination utilises pivot multipliers to reduce the system to an **upper triangular** one. The idea is to transform the coefficient matrix into an upper triangular matrix and then use **back substitution** to arrive at the solution to the system. This forms the base algorithmic approach to solving a system of linear equations.
The book describes 3 different failure cases, each triggered by a zero appearing in a pivot position. In the third case, you can exchange rows in the system to recover a triangular system and arrive at a solution. In the first two cases the breakdown is permanent: you get either no solution or infinitely many solutions. The third case is called non-singular and the first two cases are called singular.
The elimination procedure described here is called the Gaussian Elimination Method.
For the algorithm, please refer to section 2.2 of the book.
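As a companion to that section, here is a minimal sketch in Python/numpy, assuming the non-singular case where no row exchanges are needed (every pivot is non-zero); the example system is my own:

```python
import numpy as np

def gaussian_elimination(A, b):
    """Reduce Ax = b to an upper triangular system, then back-substitute.

    A sketch for the non-singular case: it assumes every pivot is
    non-zero, so no row exchanges are performed.
    """
    A = A.astype(float)
    b = b.astype(float)
    n = len(b)

    # Forward elimination: zero out the entries below each pivot.
    for col in range(n - 1):
        for row in range(col + 1, n):
            multiplier = A[row, col] / A[col, col]  # pivot multiplier
            A[row, col:] -= multiplier * A[col, col:]
            b[row] -= multiplier * b[col]

    # Back substitution on the upper triangular system.
    x = np.zeros(n)
    for row in range(n - 1, -1, -1):
        x[row] = (b[row] - A[row, row + 1:] @ x[row + 1:]) / A[row, row]
    return x

A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])
b = np.array([2.0, 8.0, 10.0])
print(gaussian_elimination(A, b))  # [-1. 2. 2.], matches np.linalg.solve(A, b)
```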
Each of the steps in the Gaussian Elimination Method can be implemented as a matrix operation. The idea is to devise an Elimination Matrix. This matrix, $E$, then multiplies the coefficient matrix to perform the necessary row eliminations. So the question arises: what is matrix multiplication? Funnily enough, this is the first time the book addresses matrix multiplication. It is defined as follows:
If $B$ has several columns $b_{1}, b_{2}, b_{3}$, then the columns of $AB$ are $Ab_{1}, Ab_{2}, Ab_{3}$.
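To make this concrete, a small numpy check (both matrices are my own examples): $E_{21}$ is the identity with the multiplier $-2$ placed in row 2, column 1, and the second part verifies the column-by-column definition of $AB$:

```python
import numpy as np

A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])

# E_21 subtracts 2 * (row 1) from (row 2): the identity matrix with
# the multiplier -2 placed in position (2, 1).
E21 = np.array([[1.0, 0.0, 0.0],
                [-2.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
print(E21 @ A)  # row 2 of A becomes [0, 1, 1]

# Column rule: column j of AB is A times column j of B.
B = np.random.rand(3, 3)
AB = A @ B
for j in range(3):
    assert np.allclose(AB[:, j], A @ B[:, j])
```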
The section also introduces the idea of an augmented matrix $[A \; b]$: the right-hand side $b$ is appended to $A$ as an extra column, so each elimination step acts on the whole system at once.
Matrix $P_{1, 2}$ is the row-interchange (permutation) matrix. When $A$ is multiplied by this matrix on the left, its first and second rows are exchanged.
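A quick illustration with toy matrices of my own:

```python
import numpy as np

# P_12 is the identity matrix with its first two rows exchanged.
P12 = np.array([[0, 1, 0],
                [1, 0, 0],
                [0, 0, 1]])
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
print(P12 @ A)  # rows 1 and 2 of A are swapped
print(A @ P12)  # multiplying on the right swaps columns instead
```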
Addition is obvious, so for now I am going to leave out how you add two matrices. Multiplication is interesting, and also fundamental to Robotics: basic transformation operations between frames are done using matrix multiplication (a toy example follows below). The textbook gives 4 ways to multiply 2 matrices.
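As an illustration of the robotics connection (the frame names, helper function and angles here are entirely my own invention): composing transformations between frames is just a matrix product.

```python
import numpy as np

def rotation_2d(theta):
    """Planar rotation matrix through angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s],
                     [s,  c]])

# A point expressed in frame B, rotated into frame A and then into
# the world frame: the composition of frames is a matrix product.
R_world_A = rotation_2d(np.pi / 2)  # frame A relative to the world
R_A_B = rotation_2d(np.pi / 4)      # frame B relative to frame A
p_B = np.array([1.0, 0.0])          # a point expressed in frame B

p_world = R_world_A @ R_A_B @ p_B
print(p_world)  # same as rotating p_B through 3*pi/4
```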
To multiply $AB$, $A$ must have $n$ columns and $B$ must have $n$ rows.
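If memory serves, the 4 ways are the dot-product view (row times column), the column view, the row view, and the outer-product view (columns times rows). A quick numpy check of each, with random matrices as stand-ins:

```python
import numpy as np

m, n, p = 2, 3, 2
A = np.random.rand(m, n)
B = np.random.rand(n, p)
AB = A @ B

# 1. Entry (i, j) is (row i of A) dot (column j of B).
assert np.allclose(AB[0, 1], A[0, :] @ B[:, 1])

# 2. Column j of AB is A times (column j of B).
assert np.allclose(AB[:, 1], A @ B[:, 1])

# 3. Row i of AB is (row i of A) times B.
assert np.allclose(AB[0, :], A[0, :] @ B)

# 4. AB is the sum of n outer products: (column k of A)(row k of B).
outer_sum = sum(np.outer(A[:, k], B[k, :]) for k in range(n))
assert np.allclose(AB, outer_sum)
```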
The Fundamental Law of Matrix Multiplication is associativity:

\[(AB)C = A(BC)\]

More generally, for matrix multiplication:

\[AB \neq BA \\ A(B + C) = AB + AC \\ (A + B)C = AC + BC \\ A(BC) = (AB)C\]

Now assume $A$ is a square matrix. Whatever the matrix $A$ does, $A^{-1}$ undoes. The product of a matrix and its inverse is the identity matrix:

\[A A^{-1} = I\]

The following are a few notes on inverses.
Think about the geometry of the multiplication $A A^{-1} = I$. At this point in the book, the geometry of matrices has not yet been discussed, so you can visualise it as a bunch of vectors: $A$ times each column of $A^{-1}$ has to produce the corresponding column of the identity.
The central question about inverting matrices is: how do we know whether a matrix has an inverse? This is asked particularly in light of formulating the solution of a system of linear equations $Ax = b$: the solution is $x = A^{-1}b$, but only if $A^{-1}$ exists.
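A small sketch of this in numpy (the matrices are my own examples; note that `np.linalg.solve` performs elimination rather than explicitly forming $A^{-1}$, which is what you want in practice):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x_inv = np.linalg.inv(A) @ b     # x = A^{-1} b, valid since A is invertible
x_solve = np.linalg.solve(A, b)  # elimination-based; preferred in practice
print(np.allclose(x_inv, x_solve))  # True

# A singular matrix has no inverse; np.linalg.inv(S) would raise LinAlgError.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])       # second row is twice the first
print(np.linalg.matrix_rank(S))  # 1, so S is singular
```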
A square matrix $A$ with a full set of pivots will always have a two-sided inverse. What is a two-sided inverse? An inverse that works from both sides: $A^{-1}A = I$ and $A A^{-1} = I$ (for square matrices, a left inverse is automatically a right inverse too).
And finally, a note on diagonally dominant matrices: if each diagonal entry of $A$ is larger than the sum of the absolute values of the other entries in its row, then $A$ is invertible.
Any matrix that does not have an inverse is a singular matrix.
Many key ideas in Linear Algebra are really factorizations of a matrix: the original matrix is described as a product of 2 or 3 special matrices. The first factorization comes from elimination.
\[A = LU\]

$L$ is the inverse of the product of the elimination matrices used to transform $A$ into $U$; it is lower triangular, with the pivot multipliers sitting below its diagonal. The final step is to divide $U$ by its pivots, collecting them in a diagonal matrix $D$, to attain the following formulation:

\[A = LDU\]

where $L$ and $U$ now both have ones on their diagonals. Symmetric matrices follow $A^{T} = A$; for a symmetric $A$, the upper factor is the transpose of the lower one, so the factorization becomes $A = LDL^{T}$.
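A minimal sketch of the factorization in numpy, assuming no row exchanges are needed (true for this example, where the pivots come out to 2, 1, 4); the matrix is symmetric on purpose so the $A = LDL^{T}$ special case can be checked as well:

```python
import numpy as np

A = np.array([[2.0, 4.0, -2.0],
              [4.0, 9.0, -3.0],
              [-2.0, -3.0, 7.0]])
n = A.shape[0]

# Elimination without row exchanges: store each pivot multiplier in L
# while reducing A to the upper triangular U.
L = np.eye(n)
U = A.copy()
for col in range(n - 1):
    for row in range(col + 1, n):
        L[row, col] = U[row, col] / U[col, col]
        U[row, col:] -= L[row, col] * U[col, col:]

assert np.allclose(A, L @ U)            # A = LU

# Factor the pivots out of U: A = L D U' with ones on both diagonals.
D = np.diag(np.diag(U))
U_unit = np.diag(1.0 / np.diag(U)) @ U
assert np.allclose(A, L @ D @ U_unit)   # A = LDU

# This A is symmetric, so the upper factor is L transposed: A = L D L^T.
assert np.allclose(U_unit, L.T)
```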