Linear algebra is the study of vectors, vector spaces, and linear transformations. It is fundamental to modern mathematics, physics, computer science, and machine learning. The subject provides the mathematical framework for understanding systems of linear equations, geometric transformations, and data analysis techniques.

Vectors

Vector Operations

Addition and Scalar Multiplication
Vectors can be added component-wise and multiplied by scalars: u + v = (u₁ + v₁, …, uₙ + vₙ) and cu = (cu₁, …, cuₙ).

Dot Product (Inner Product)
The dot product of two vectors measures their alignment: u · v = u₁v₁ + ⋯ + uₙvₙ = ‖u‖‖v‖ cos θ, where θ is the angle between them.

Cross Product (in ℝ³)
The cross product produces a vector perpendicular to both inputs: u × v = (u₂v₃ − u₃v₂, u₃v₁ − u₁v₃, u₁v₂ − u₂v₁).
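
As a quick numerical illustration, the sketch below computes these operations with NumPy (the vectors are arbitrary examples):

    import numpy as np

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([4.0, 5.0, 6.0])

    print(u + v)           # component-wise addition: [5. 7. 9.]
    print(2 * u)           # scalar multiplication:   [2. 4. 6.]
    print(np.dot(u, v))    # dot product: 1*4 + 2*5 + 3*6 = 32.0
    print(np.cross(u, v))  # cross product in R^3:    [-3.  6. -3.]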

Geometric Interpretation

  • Vectors represent direction and magnitude
  • Addition follows the parallelogram rule
  • Dot product relates to projection
  • The magnitude of the cross product gives the area of the parallelogram spanned by two vectors

Vector Spaces

Definition and Axioms
A vector space over a field is a set with two operations (addition and scalar multiplication) satisfying:

  1. Closure under addition and scalar multiplication
  2. Associativity and commutativity of addition
  3. Existence of zero vector and additive inverses
  4. Distributive and associative properties of scalar multiplication
  5. Multiplicative identity: 1·v = v for all v

Subspaces
A subset is a subspace if it is closed under addition and scalar multiplication, and contains the zero vector.

Span and Linear Combinations
The span of a set of vectors is the set of all linear combinations: span{v₁, …, vₙ} = {c₁v₁ + ⋯ + cₙvₙ : c₁, …, cₙ scalars}.

Linear Independence
Vectors v₁, …, vₙ are linearly independent if the only solution to c₁v₁ + ⋯ + cₙvₙ = 0 is c₁ = ⋯ = cₙ = 0.

Basis and Dimension
A basis is a linearly independent spanning set. The dimension of a vector space is the number of vectors in any basis.
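
One practical way to test linear independence numerically (a sketch, not something the text above prescribes) is to stack the vectors as columns and compare the matrix rank with the number of vectors:

    import numpy as np

    vectors = [np.array([1.0, 0.0, 1.0]),
               np.array([0.0, 1.0, 1.0]),
               np.array([1.0, 1.0, 2.0])]  # third = first + second

    A = np.column_stack(vectors)
    # independent iff the rank equals the number of vectors
    print(np.linalg.matrix_rank(A) == len(vectors))  # False: dependent set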

Matrices

Matrix Operations

Addition and Scalar Multiplication
Matrices of the same dimensions can be added entry-wise. Scalar multiplication multiplies each entry.

Matrix Multiplication
For matrices A (m×n) and B (n×p), the product AB is m×p with: (AB)ᵢⱼ = Aᵢ₁B₁ⱼ + ⋯ + AᵢₙBₙⱼ, i.e. the dot product of row i of A with column j of B.

Transpose
The transpose swaps rows and columns: (Aᵀ)ᵢⱼ = Aⱼᵢ.

Trace
The trace of a square matrix is the sum of its diagonal entries: tr(A) = A₁₁ + A₂₂ + ⋯ + Aₙₙ.
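
These operations map directly onto NumPy (a small sketch with arbitrary 2×2 matrices):

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])
    B = np.array([[5.0, 6.0], [7.0, 8.0]])

    print(A @ B)        # matrix product: [[19. 22.], [43. 50.]]
    print(A.T)          # transpose: swaps rows and columns
    print(np.trace(A))  # trace: 1 + 4 = 5.0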

Special Matrices

Type              Definition
Identity          Iᵢⱼ = 1 if i = j, else 0
Diagonal          Non-zero entries only on the main diagonal
Upper triangular  Zero entries below the main diagonal
Lower triangular  Zero entries above the main diagonal
Symmetric         Aᵀ = A
Skew-symmetric    Aᵀ = −A
Orthogonal        AᵀA = AAᵀ = I (equivalently, Aᵀ = A⁻¹)

Matrix Properties

Rank
The rank of a matrix is the dimension of its column space (or equivalently, row space).

Inverse
A square matrix A is invertible if there exists a matrix A⁻¹ such that AA⁻¹ = A⁻¹A = I. A matrix is invertible if and only if its determinant is non-zero.

Determinants
The determinant is a scalar value encoding properties of a matrix:

  • For a 2×2 matrix with rows (a, b) and (c, d): det = ad − bc
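
A short check of these facts in NumPy (the matrix is an arbitrary example):

    import numpy as np

    A = np.array([[1.0, 2.0], [3.0, 4.0]])

    d = np.linalg.det(A)         # ad - bc = 1*4 - 2*3 = -2
    print(d)
    if abs(d) > 1e-12:           # invertible iff the determinant is non-zero
        print(np.linalg.inv(A))  # [[-2.   1. ], [ 1.5 -0.5]]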

Systems of Linear Equations

Solving Systems

Gaussian Elimination
A systematic method for solving linear systems by applying three types of elementary row operations (a sketch follows the list):

  1. Swap two rows
  2. Multiply a row by a non-zero scalar
  3. Add a multiple of one row to another
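
A minimal NumPy sketch of the procedure, using partial pivoting and assuming a square, non-singular system (gaussian_solve is an illustrative name, not a library routine):

    import numpy as np

    def gaussian_solve(A, b):
        A = A.astype(float).copy()
        b = b.astype(float).copy()
        n = len(b)
        for k in range(n):
            # operation 1: swap rows so the pivot is as large as possible
            p = k + np.argmax(np.abs(A[k:, k]))
            A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
            # operation 3: subtract multiples of the pivot row below it
            for i in range(k + 1, n):
                m = A[i, k] / A[k, k]
                A[i, k:] -= m * A[k, k:]
                b[i] -= m * b[k]
        # back substitution on the resulting upper-triangular system
        x = np.zeros(n)
        for i in range(n - 1, -1, -1):
            x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
        return x

    A = np.array([[2.0, 1.0], [1.0, 3.0]])
    b = np.array([3.0, 5.0])
    print(gaussian_solve(A, b))  # [0.8 1.4], matching np.linalg.solve(A, b)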

Row Echelon Form (REF)
A matrix is in REF if:

  • All zero rows are at the bottom
  • The leading entry of each non-zero row is to the right of the leading entry above it

Reduced Row Echelon Form (RREF)
RREF additionally requires:

  • Leading entries are all 1
  • Leading entries are the only non-zero entry in their column

LU Decomposition
Express a matrix as A = LU, where L is lower triangular and U is upper triangular. Useful for solving multiple systems with the same coefficient matrix.
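
In practice a library routine is used; for instance, SciPy's lu returns a row-pivoted factorisation A = PLU (assuming SciPy is installed; the matrix is an arbitrary example):

    import numpy as np
    from scipy.linalg import lu

    A = np.array([[4.0, 3.0], [6.0, 3.0]])
    P, L, U = lu(A)                   # row-pivoted: A = P @ L @ U
    print(np.allclose(A, P @ L @ U))  # True
    print(L)                          # unit lower triangular
    print(U)                          # upper triangular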

Solution Types

For a system Ax = b with n unknowns, compare the rank of A with the rank of the augmented matrix [A | b]:

  • Unique solution: rank(A) = rank([A | b]) = n (number of unknowns)
  • Infinite solutions: rank(A) = rank([A | b]) < n
  • No solution: rank(A) < rank([A | b])

Geometric Interpretation

  • In ℝ², each equation represents a line; solutions are intersections
  • In ℝ³, each equation represents a plane; solutions are intersections

Linear Transformations

Definitions

A function T: V → W between vector spaces is a linear transformation if: T(u + v) = T(u) + T(v) and T(cv) = cT(v) for all u, v ∈ V and scalars c.

Kernel (Null Space)
The set of vectors mapped to zero: ker(T) = {v ∈ V : T(v) = 0}, a subspace of V.

Image (Range)
The set of all outputs: im(T) = {T(v) : v ∈ V}, a subspace of W.

Rank-Nullity Theorem
For T: V → W with V finite-dimensional: dim ker(T) + dim im(T) = dim V.

Matrix Representation

Every linear transformation between finite-dimensional vector spaces can be represented by a matrix. The matrix depends on the choice of bases for the domain and codomain.

Change of Basis
If P is the change of basis matrix between bases B and B′, then the matrix representation transforms as: A′ = P⁻¹AP.
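
A small numerical sanity check of this identity (the matrices are arbitrary examples):

    import numpy as np

    A = np.array([[2.0, 1.0], [0.0, 3.0]])  # the map in the old basis
    P = np.array([[1.0, 1.0], [0.0, 1.0]])  # columns: new basis in old coordinates
    A_new = np.linalg.inv(P) @ A @ P        # the same map in the new basis

    # Both matrices represent the same transformation: applying A_new in
    # new coordinates agrees with applying A in old coordinates.
    x_new = np.array([1.0, 2.0])
    print(np.allclose(P @ (A_new @ x_new), A @ (P @ x_new)))  # True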

Eigenvalues and Eigenvectors

Definitions

A non-zero vector v is an eigenvector of a matrix A if it satisfies:

Av = λv

where λ is the corresponding eigenvalue.

Characteristic Polynomial
Eigenvalues are roots of the characteristic polynomial: det(A − λI) = 0.

Finding Eigenvalues and Eigenvectors

  1. Solve det(A − λI) = 0 for the eigenvalues
  2. For each eigenvalue λ, solve (A − λI)v = 0 for the eigenvectors (in practice a library routine does both steps, as sketched below)
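
Numerically, both steps are handled by a routine such as NumPy's eig (the matrix is an arbitrary example):

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])
    eigenvalues, eigenvectors = np.linalg.eig(A)   # eigenvectors sit in the columns
    print(eigenvalues)                             # [3. 1.] (order not guaranteed)
    v = eigenvectors[:, 0]
    print(np.allclose(A @ v, eigenvalues[0] * v))  # True: A v = lambda v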

Diagonalisation

A matrix A is diagonalisable if A = PDP⁻¹, where D is diagonal (its entries are the eigenvalues) and P contains the corresponding eigenvectors as columns.

When is a Matrix Diagonalisable?

  • An n×n matrix is diagonalisable if and only if it has n linearly independent eigenvectors
  • Matrices with n distinct eigenvalues are always diagonalisable

Spectral Theorem
Every real symmetric matrix is orthogonally diagonalisable: A = QDQᵀ, where Q is orthogonal and D is diagonal.
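
For a real symmetric matrix, NumPy's eigh returns the eigenvalues and an orthogonal Q directly (a sketch with an arbitrary symmetric example):

    import numpy as np

    A = np.array([[2.0, 1.0], [1.0, 2.0]])  # real symmetric
    eigenvalues, Q = np.linalg.eigh(A)      # columns of Q are orthonormal eigenvectors
    D = np.diag(eigenvalues)
    print(np.allclose(A, Q @ D @ Q.T))      # True: A = Q D Q^T
    print(np.allclose(Q.T @ Q, np.eye(2)))  # True: Q is orthogonal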

Applications

  • Principal Component Analysis (PCA): Dimensionality reduction using eigenvectors of covariance matrices
  • Differential Equations: Solving systems via diagonalisation
  • Markov Chains: Steady-state distributions from eigenvectors
  • Google PageRank: Web page importance via eigenvector of link matrix

Inner Product Spaces

Inner Products
An inner product on a vector space satisfies:

  • Linearity in the first argument
  • Conjugate symmetry: ⟨u, v⟩ = ⟨v, u⟩* (the complex conjugate; plain symmetry over the reals)
  • Positive definiteness: ⟨v, v⟩ ≥ 0, with equality iff v = 0

Norms and Distances
The norm induced by an inner product: ‖v‖ = √⟨v, v⟩
Distance: d(u, v) = ‖u − v‖

Orthogonality
Vectors u and v are orthogonal if ⟨u, v⟩ = 0.

Gram-Schmidt Orthogonalisation
An algorithm to convert a basis v₁, …, vₙ into an orthonormal basis e₁, …, eₙ: subtract from each vₖ its projections onto the previously constructed e₁, …, eₖ₋₁, then normalise the result:

uₖ = vₖ − ⟨vₖ, e₁⟩e₁ − ⋯ − ⟨vₖ, eₖ₋₁⟩eₖ₋₁,   eₖ = uₖ / ‖uₖ‖
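
A compact sketch of classical Gram-Schmidt on the columns of a matrix (gram_schmidt is an illustrative name, not a library routine; the columns are assumed linearly independent):

    import numpy as np

    def gram_schmidt(V):
        Q = np.zeros_like(V, dtype=float)
        for k in range(V.shape[1]):
            u = V[:, k].astype(float)
            for j in range(k):
                u = u - (V[:, k] @ Q[:, j]) * Q[:, j]  # remove projection onto e_j
            Q[:, k] = u / np.linalg.norm(u)            # normalise
        return Q

    V = np.array([[1.0, 1.0], [0.0, 1.0]])
    Q = gram_schmidt(V)
    print(np.allclose(Q.T @ Q, np.eye(2)))  # True: columns are orthonormal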

QR Decomposition
Express A = QR, where Q is orthogonal and R is upper triangular. Obtained via Gram-Schmidt on the columns of A.
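
NumPy computes this directly; by default np.linalg.qr returns the reduced factorisation (the matrix below is an arbitrary example):

    import numpy as np

    A = np.array([[1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
    Q, R = np.linalg.qr(A)             # reduced QR: Q is 3x2 with orthonormal columns
    print(np.allclose(A, Q @ R))       # True
    print(np.allclose(np.triu(R), R))  # True: R is upper triangular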

Singular Value Decomposition (SVD)

Every matrix A (m×n) can be decomposed as:

A = UΣVᵀ

where:

  • U (m×m) is orthogonal (its columns are the left singular vectors)
  • Σ (m×n) is rectangular diagonal (its diagonal entries are the singular values)
  • V (n×n) is orthogonal (its columns are the right singular vectors)
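
The decomposition is available as np.linalg.svd, which returns the singular values as a vector and V already transposed (the matrix is an arbitrary example):

    import numpy as np

    A = np.array([[3.0, 1.0, 1.0], [-1.0, 3.0, 1.0]])  # 2x3
    U, s, Vt = np.linalg.svd(A)            # s: singular values, Vt = V^T
    Sigma = np.zeros(A.shape)              # rebuild the rectangular-diagonal Sigma
    Sigma[:len(s), :len(s)] = np.diag(s)
    print(np.allclose(A, U @ Sigma @ Vt))  # True: A = U Sigma V^T
    print(s)                               # singular values, in decreasing order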

Applications

  • Data compression: Low-rank approximations
  • Recommender systems: Matrix factorisation for collaborative filtering
  • Image processing: Noise reduction and compression
  • Pseudoinverse: Computing A⁺ = VΣ⁺Uᵀ for least squares problems (see the sketch below)
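
For the pseudoinverse application, NumPy provides pinv; the sketch below checks it against the built-in least-squares solver (the data are arbitrary):

    import numpy as np

    A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])  # overdetermined system
    b = np.array([1.0, 2.0, 2.0])
    x = np.linalg.pinv(A) @ b  # least-squares solution via the pseudoinverse
    print(np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))  # True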

Glossary

Term            Definition
Span            Set of all linear combinations of a set of vectors
Basis           Linearly independent spanning set
Rank            Dimension of the column space
Nullity         Dimension of the null space
Kernel          Solution set of Ax = 0
Eigenvalue      Scalar λ with Av = λv for some non-zero v
Eigenvector     Non-zero vector v with Av = λv
Orthogonal      Perpendicular; inner product equals zero
Singular value  Square root of an eigenvalue of AᵀA

Learning Resources

Books

  • Introduction to Linear Algebra by Gilbert Strang — Classic textbook with geometric intuition
  • Linear Algebra Done Right by Sheldon Axler — Proof-focused, avoids determinants early
  • Linear Algebra and Its Applications by David Lay — Applications-oriented approach
  • Matrix Analysis by Horn and Johnson — Advanced reference

See Also

  • Calculus — Multivariable calculus uses linear algebra extensively
  • Abstract Algebra — Generalises vector space concepts to modules and rings
  • Machine Learning — Linear algebra is the computational backbone of ML algorithms