## Coursera Machine Learning Review: Linear Algebra Review

This is a series where I’m discussing what I’ve learned in Coursera’s machine learning course, taught by Andrew Ng of Stanford University.  Why?  See Machine Learning, Nanodegrees, and Bitcoin.  I’m definitely not going into depth here, just briefly summarizing from a 10,000-foot view.

This is a continuation of the review for week one.

## Why Review Linear Algebra?

I was a little curious about why we would be covering linear algebra in this course, but I think I figured it out.  I believe there are two reasons:

1. Linear algebra allows us to solve several equations all at once.  Remember how we have a dataset that can help us determine a hypothesis function?  Linear algebra allows us to use that entire dataset in one calculation rather than solving for each example separately.
2. Using these matrices goes well with Matlab/Octave/other programming languages, which can solve these problems in parallel.  A matrix operation can take advantage of your computer’s multiple cores.
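To make reason 1 concrete, here’s a minimal Python sketch (the dataset values and parameters are made up for illustration) of how a single matrix-vector product evaluates a linear hypothesis for every training example at once:

```python
# Sketch: evaluating a hypothesis h(x) = theta0 + theta1 * x for every
# training example in one matrix-vector product. Values are made up.
xs = [1.0, 2.0, 3.0]    # feature values from a hypothetical dataset
theta = [0.5, 2.0]      # [theta0, theta1]

# Build the "design matrix": one row per example, with a leading 1
# so the row-times-theta product includes the intercept term.
X = [[1.0, x] for x in xs]

# One matrix-vector multiplication gives every prediction at once.
predictions = [sum(a * b for a, b in zip(row, theta)) for row in X]
print(predictions)  # [2.5, 4.5, 6.5]
```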

## Matrix/Vector Definition and Operations

Just as a side note: the symbols I use for matrices may not be standard, because I did not want to take the time to learn WordPress math formatting.  The concepts are sound, even if the notation is a little non-standard.

#### Definitions

A matrix is a two-dimensional grid of numbers, usually written within square brackets (I’ve omitted them in my notation here).

A matrix with two rows and three columns is noted as a 2×3 matrix.

A vector is a one-dimensional matrix (so it is a subset of matrices).

When you refer to an element in a matrix, you specify the row and then the column.  So in our 2×3 matrix A above, the element in the first row, second column is A12.
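As a quick sketch in Python (the values here are made up, since the post’s original example isn’t reproduced), a matrix can be represented as a nested list:

```python
# A 2x3 matrix as a nested list: 2 rows, 3 columns (values made up).
A = [[1, 2, 3],
     [4, 5, 6]]

rows, cols = len(A), len(A[0])
print(rows, cols)   # 2 3

# Indexing is row first, then column: A[0][1] is the element in
# row 1, column 2 (A12 in the post's notation). Python counts from 0.
print(A[0][1])      # 2

# A vector is just a one-dimensional list.
v = [7, 8, 9]
```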

To add or subtract two matrices, just add or subtract each of their elements that are in the same position.

To multiply or divide by a scalar (a plain number, not a matrix/vector), multiply or divide every element in the matrix by that number.
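Both element-wise operations can be sketched in a few lines of Python (with made-up values), mirroring the definitions above:

```python
# Element-wise addition and scalar multiplication on 2x3 matrices,
# written out with plain lists to mirror the definitions above.
A = [[1, 2, 3],
     [4, 5, 6]]
B = [[10, 20, 30],
     [40, 50, 60]]

# Add matrices of the same size element by element.
A_plus_B = [[A[i][j] + B[i][j] for j in range(3)] for i in range(2)]

# Multiply by a scalar: scale every element.
two_A = [[2 * A[i][j] for j in range(3)] for i in range(2)]

print(A_plus_B)  # [[11, 22, 33], [44, 55, 66]]
print(two_A)     # [[2, 4, 6], [8, 10, 12]]
```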

To multiply a matrix by a vector, the number of columns in the multiplicand (the matrix to the left of the multiplication symbol) must equal the number of rows in the multiplier (the matrix or vector to the right).  The result is a matrix with the multiplicand’s number of rows and the multiplier’s number of columns.

For example, A45 * B51 will result in a C41 matrix.

If that condition holds, you take each row of the matrix, multiply it element by element against the vector, and add up the products; each row produces one entry of the result.
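Here’s that row-by-row process as a short Python sketch (values made up): A is 2×3 and v has 3 entries, so the result has 2 entries.

```python
# Matrix-times-vector: each entry of the result is one row of A
# multiplied element-wise against v and summed (a dot product).
A = [[1, 2, 3],
     [4, 5, 6]]
v = [7, 8, 9]

result = [sum(a * x for a, x in zip(row, v)) for row in A]
print(result)  # [50, 122]
```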

For matrix/matrix multiplication, you repeat that process column by column: multiply the first matrix by each column (vector) of the second matrix, and the results become the columns of the product.

Matrix multiplication is not commutative, meaning A * B is not equal to B * A.  However, it is associative, meaning D * (E * F) = (D * E) * F.
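A small Python sketch (with made-up 2×2 matrices) shows both the column-by-column view and the fact that swapping the order changes the answer:

```python
def matmul(A, B):
    """Multiply A (m x n) by B (n x p): column j of the result is A
    times column j of B, exactly the vector-by-vector view above."""
    n, p = len(B), len(B[0])
    assert len(A[0]) == n, "columns of A must equal rows of B"
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(p)]
            for i in range(len(A))]

A = [[1, 2],
     [3, 4]]
B = [[0, 1],
     [1, 0]]

print(matmul(A, B))  # [[2, 1], [4, 3]]
print(matmul(B, A))  # [[3, 4], [1, 2]]  -- different order, different result
```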

The identity matrix is the equivalent of multiplying by 1 for scalar values, so multiplying a matrix by the identity matrix just gives back that same matrix.  An identity matrix is always square (the number of rows and columns are the same), and it is just a diagonal of 1s with 0s everywhere else.  So, as an example, the 5×5 identity matrix has five 1s running down its diagonal.
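A quick Python sketch (matrix values made up) builds an identity matrix and checks that multiplying by it gives the original matrix back:

```python
# An n x n identity matrix: 1s on the diagonal, 0 everywhere else.
def identity(n):
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 7, 1],
     [8, 2, 8]]                       # a 2x3 matrix
print(identity(3))                    # [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(matmul(A, identity(3)) == A)    # True: A comes back unchanged
```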

## Inverse and Transpose

- The inverse of a matrix is whatever you can multiply that matrix by in order to get the identity matrix.
- Not all matrices have an inverse.
- Only square matrices can potentially have an inverse.
- The transpose of a matrix is defined by swapping the row and column of every element in the matrix.  This is kind of like rotating it and then flipping it backwards.  It is denoted as AT.
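Both ideas can be sketched in Python (the matrices are made up; the 2×2 inverse uses the standard textbook formula, which fails when the determinant is 0, i.e. when no inverse exists):

```python
# Inverse of a 2x2 matrix via the textbook formula -- only a sketch;
# it breaks when the determinant a*d - b*c is 0 (no inverse exists).
def inverse_2x2(M):
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

M = [[2.0, 1.0],
     [1.0, 1.0]]
M_inv = inverse_2x2(M)

# M times its inverse should give the 2x2 identity matrix.
product = [[sum(M[i][k] * M_inv[k][j] for k in range(2)) for j in range(2)]
           for i in range(2)]
print(product)  # [[1.0, 0.0], [0.0, 1.0]]

# Transpose: swap row and column for every element, so a 2x3 matrix
# becomes 3x2.
A = [[1, 2, 3],
     [4, 5, 6]]
A_T = [[A[i][j] for i in range(len(A))] for j in range(len(A[0]))]
print(A_T)  # [[1, 4], [2, 5], [3, 6]]
```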

## Conclusion

This part of Coursera also gave examples of Matlab/Octave commands that define matrices and perform operations on them.  I won’t show those here until we actually start programming in class.  My assumption is we won’t be doing much linear algebra by hand; instead, we will be using software to calculate it for us.  But rather than just use the software blindly, this course showed us how the software calculates those values.

Keep in mind, if this material interests you, you can join the course for free and follow along as I post these.