Deep Learning Notes. Chapter 2 - Linear Algebra
Note: These are my notes from reading Deep Learning by Ian Goodfellow.
2.1 Scalars, Vectors, Matrices and Tensors
Scalars: A scalar is just a single number. For example, s ∈ R could be the slope of a line.
Vectors: A vector is an array of numbers arranged in order. We write a vector as:
$$x = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$$
Matrices: A matrix is a 2-D array of numbers, so each element is identified by two indices instead of just one.
Tensors: A tensor is an array of numbers arranged on a regular grid with a variable number of axes. We identify the element of A at coordinates $(i, j, k)$ by writing $A_{i,j,k}$.
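As a quick NumPy sketch of these four objects (the specific values and shapes below are my own, chosen only for illustration):

```python
import numpy as np

# A scalar: a single number
s = 3.5

# A vector: a 1-D array of numbers arranged in order
x = np.array([1.0, 2.0, 3.0])

# A matrix: a 2-D array; each element has two indices
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])       # shape (3, 2)

# A tensor: an array with more than two axes
T = np.zeros((2, 3, 4))          # shape (2, 3, 4)

print(A[0, 1])                   # element of A at row 1, column 2 (0-indexed in NumPy)
print(T[1, 2, 3])                # element of T at coordinates (i, j, k)
```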
Transpose
The transpose of a matrix is the mirror image of the matrix across its main diagonal: $(A^\top)_{i,j} = A_{j,i}$.
Example:

$$A = \begin{bmatrix} A_{1,1} & A_{1,2} \\ A_{2,1} & A_{2,2} \\ A_{3,1} & A_{3,2} \end{bmatrix}
\quad \Rightarrow \quad
A^\top = \begin{bmatrix} A_{1,1} & A_{2,1} & A_{3,1} \\ A_{1,2} & A_{2,2} & A_{3,2} \end{bmatrix}$$
A vector can be written as a matrix with only one column; its transpose is then a matrix with only one row:

$$x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix}
\quad \Rightarrow \quad
x^\top = \begin{bmatrix} x_1 & x_2 & x_3 \end{bmatrix}$$
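A small NumPy sketch of the transpose (values are arbitrary, for illustration):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4],
              [5, 6]])           # shape (3, 2)

# Transpose mirrors across the main diagonal: (A^T)_{i,j} = A_{j,i}
print(A.T)                       # shape (2, 3)

# A vector written as a single-column matrix; its transpose is a single row
x = np.array([[1], [2], [3]])    # shape (3, 1)
print(x.T)                       # shape (1, 3)
```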
2.2 Multiplying Matrices and Vectors
The matrix product of matrices A and B is a third matrix C = AB. For the product to be defined, A must have the same number of columns as B has rows: if A is of shape m × n and B is of shape n × p, then C is of shape m × p. The product operation is defined by

$$C_{i,j} = \sum_k A_{i,k} B_{k,j}$$
Properties:

- Distributivity: $A(B + C) = AB + AC$
- Associativity: $A(BC) = (AB)C$
- Not commutative in general: $AB \neq BA$
- Transpose of a product: $(AB)^\top = B^\top A^\top$
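A quick NumPy check of these properties (random matrices with shapes I picked only so the products are defined):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((2, 3))
B = rng.standard_normal((3, 4))
C = rng.standard_normal((3, 4))

# C = AB is defined because A has as many columns (3) as B has rows (3)
print((A @ B).shape)                              # (2, 4)

# Distributivity: A(B + C) = AB + AC
print(np.allclose(A @ (B + C), A @ B + A @ C))    # True

# Transpose of a product: (AB)^T = B^T A^T
print(np.allclose((A @ B).T, B.T @ A.T))          # True

# Matrix multiplication is NOT commutative in general
M = rng.standard_normal((3, 3))
N = rng.standard_normal((3, 3))
print(np.allclose(M @ N, N @ M))                  # False for almost all random M, N
```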
2.3 Identity and Inverse Matrices
An identity matrix is a matrix that does not change any vector when we multiply that vector by that matrix. We denote the identity matrix that preserves n-dimensional vectors as I_n, so that I_n x = x for every x ∈ R^n.
The matrix inverse of A is denoted A⁻¹, and it is defined as the matrix such that

$$A^{-1}A = I_n$$

So if A⁻¹ exists, we can solve the system Ax = b:

$$x = A^{-1}b$$
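A minimal sketch of this in NumPy (the particular A and b are arbitrary; in practice np.linalg.solve is preferred over explicitly forming the inverse):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

# Identity: I x = x for every x
I = np.eye(2)
print(np.allclose(I @ b, b))               # True

# Inverse: A^{-1} A = I
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv @ A, np.eye(2)))   # True

# Solving Ax = b via x = A^{-1} b
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))               # True
```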
2.4 Linear Dependence and Span
2.7 Eigendecomposition
Just as integers can be decomposed into prime factors, matrices can be decomposed into a set of eigenvectors and eigenvalues.
An eigenvector is a special type of vector that remains in the same direction after a linear transformation is applied to it, although it may be scaled by a factor known as the eigenvalue. If A is a square matrix representing the linear transformation and v is an eigenvector, then the relationship can be expressed as:
$$Av = \lambda v$$

where λ is the eigenvalue corresponding to the eigenvector v.
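A small NumPy sketch of computing an eigendecomposition (the matrix here is my own example); it checks Av = λv for each pair and the reconstruction A = V diag(λ) V⁻¹:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Columns of V are the eigenvectors, w holds the corresponding eigenvalues
w, V = np.linalg.eig(A)

# Check Av = λv for each eigenpair
for i in range(len(w)):
    v = V[:, i]
    print(np.allclose(A @ v, w[i] * v))                     # True

# Reconstruct A = V diag(λ) V^{-1}
print(np.allclose(V @ np.diag(w) @ np.linalg.inv(V), A))    # True
```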