Matrices and Systems of Linear Equations
Introduction to Matrices
Matrices are rectangular arrays of numbers that provide a powerful framework for solving systems of linear equations, representing linear transformations, and modeling physical systems.
Matrix Definition
An m × n matrix A has m rows and n columns:
\[
A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}
\]
The entry in the i-th row and j-th column is denoted by \(a_{ij}\).
Special Types of Matrices
- Square Matrix: Number of rows equals number of columns (m = n)
- Identity Matrix (I): Square matrix with 1's on the diagonal and 0's elsewhere
- Diagonal Matrix: Square matrix with non-zero entries only on the diagonal
- Symmetric Matrix: Square matrix where \(a_{ij} = a_{ji}\)
- Hermitian Matrix: Square matrix where \(a_{ij} = a_{ji}^*\) (complex conjugate)
- Orthogonal Matrix: Square matrix A where \(A^T A = I\)
- Unitary Matrix: Square matrix A where \(A^\dagger A = I\)
Matrix Operations
Matrix Addition and Subtraction
For matrices A and B of the same dimensions:
\[
(A \pm B)_{ij} = a_{ij} \pm b_{ij}
\]
Addition and subtraction are performed element by element.
Scalar Multiplication
For a scalar c and matrix A:
\[
(cA)_{ij} = c\,a_{ij}
\]
Each element of the matrix is multiplied by the scalar.
Matrix Multiplication
For an m × n matrix A and an n × p matrix B:
\[
(AB)_{ij} = \sum_{k=1}^{n} a_{ik} b_{kj}
\]
The resulting matrix C = AB has dimensions m × p.
Important properties:
- Matrix multiplication is associative: (AB)C = A(BC)
- Matrix multiplication is distributive: A(B + C) = AB + AC
- Matrix multiplication is generally not commutative: AB ≠ BA
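As a quick numerical check of these rules, here is a minimal NumPy sketch (the matrices are arbitrary illustrative values, not taken from the text) showing addition, scalar multiplication, the matrix product, and the failure of commutativity:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 1],
              [1, 0]])

print(A + B)        # element-wise addition
print(3 * A)        # scalar multiplication
print(A @ B)        # matrix product: (2x2)(2x2) -> 2x2
print(B @ A)        # generally different from A @ B
print(np.allclose(A @ B, B @ A))  # False: multiplication is not commutative
```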
Matrix Transpose
The transpose of an m × n matrix A is an n × m matrix \(A^T\) whose entries are given by:
\[
(A^T)_{ij} = a_{ji}
\]
Properties of transpose:
- \((A + B)^T = A^T + B^T\)
- \((AB)^T = B^T A^T\)
- \((A^T)^T = A\)
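A brief sanity check of the transpose rules, again with arbitrary example matrices:

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])

print(np.allclose((A + B).T, A.T + B.T))   # (A + B)^T = A^T + B^T
print(np.allclose((A @ B).T, B.T @ A.T))   # (AB)^T = B^T A^T (note the reversed order)
print(np.allclose(A.T.T, A))               # (A^T)^T = A
```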
Determinants and Matrix Inverses
Determinant
The determinant of a square matrix A is a scalar value that provides important information about the matrix:
For a 2 × 2 matrix:
\[
\det\begin{pmatrix} a & b \\ c & d \end{pmatrix} = ad - bc
\]
For a 3 × 3 matrix (cofactor expansion along the first row):
\[
\det\begin{pmatrix} a & b & c \\ d & e & f \\ g & h & i \end{pmatrix} = a(ei - fh) - b(di - fg) + c(dh - eg)
\]
Properties of determinants:
- \(\det(AB) = \det(A)\,\det(B)\)
- \(\det(A^T) = \det(A)\)
- If A has a row or column of zeros, then det(A) = 0
- If A has two identical rows or columns, then det(A) = 0
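These determinant properties can be verified numerically; the sketch below uses np.linalg.det on small example matrices chosen arbitrarily for illustration:

```python
import numpy as np

A = np.array([[2, 1], [1, 3]])
B = np.array([[0, 1], [4, 2]])

print(np.linalg.det(A))                                   # 2*3 - 1*1 = 5
print(np.allclose(np.linalg.det(A @ B),
                  np.linalg.det(A) * np.linalg.det(B)))   # det(AB) = det(A) det(B)
print(np.allclose(np.linalg.det(A.T), np.linalg.det(A)))  # det(A^T) = det(A)

# A matrix with two identical rows has determinant 0
C = np.array([[1, 2], [1, 2]])
print(np.linalg.det(C))  # 0.0
```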
Matrix Inverse
The inverse of a square matrix A is denoted \(A^{-1}\) and satisfies:
\[
AA^{-1} = A^{-1}A = I
\]
A matrix is invertible if and only if its determinant is non-zero.
For a 2 × 2 matrix:
\[
\begin{pmatrix} a & b \\ c & d \end{pmatrix}^{-1} = \frac{1}{ad - bc}\begin{pmatrix} d & -b \\ -c & a \end{pmatrix}, \qquad ad - bc \neq 0
\]
Properties of inverses:
- \((AB)^{-1} = B^{-1}A^{-1}\)
- \((A^T)^{-1} = (A^{-1})^T\)
- \(\det(A^{-1}) = 1/\det(A)\)
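As an illustration (not part of the original text), these inverse properties can be checked with np.linalg.inv on arbitrary invertible matrices:

```python
import numpy as np

A = np.array([[4.0, 7.0],
              [2.0, 6.0]])
B = np.array([[1.0, 2.0],
              [3.0, 5.0]])

A_inv = np.linalg.inv(A)
print(np.allclose(A @ A_inv, np.eye(2)))                        # A A^{-1} = I
print(np.allclose(np.linalg.det(A_inv), 1 / np.linalg.det(A)))  # det(A^{-1}) = 1/det(A)
print(np.allclose(np.linalg.inv(A @ B),
                  np.linalg.inv(B) @ np.linalg.inv(A)))         # (AB)^{-1} = B^{-1} A^{-1}
```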
Systems of Linear Equations
A system of m linear equations with n unknowns can be written as:
\[
\begin{aligned}
a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n &= b_1 \\
a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n &= b_2 \\
&\;\;\vdots \\
a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n &= b_m
\end{aligned}
\]
This system can be represented in matrix form as:
\[
Ax = b
\]
where A is the m × n coefficient matrix, x is the vector of unknowns, and b is the vector of constants.
Solution Methods
- Gaussian Elimination: Systematically transform the augmented matrix [A|b] to row echelon form
- Matrix Inverse Method: If A is square and invertible, the solution is \(x = A^{-1}b\)
- Cramer's Rule: For an n × n system with det(A) ≠ 0, each variable can be found using determinants
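A minimal NumPy sketch of the direct-solve and inverse methods (np.linalg.solve performs an LU-based elimination internally); the 3 × 3 system here is an arbitrary illustrative example, not one from the text:

```python
import numpy as np

# Coefficient matrix and right-hand side (illustrative values)
A = np.array([[3.0, 2.0, -1.0],
              [2.0, -2.0, 4.0],
              [-1.0, 0.5, -1.0]])
b = np.array([1.0, -2.0, 0.0])

# Direct solve (Gaussian elimination / LU factorization under the hood)
x = np.linalg.solve(A, b)
print(x)  # [ 1. -2. -2.]

# Equivalent (but less efficient and less stable) inverse method: x = A^{-1} b
x_inv = np.linalg.inv(A) @ b
print(np.allclose(x, x_inv))  # True
```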
Solution Possibilities
- Unique Solution: When A is invertible (det(A) ≠ 0)
- No Solution: When the system is inconsistent
- Infinitely Many Solutions: When the system is underdetermined
The rank of a matrix is the maximum number of linearly independent rows or columns. For an m × n matrix A:
- If rank(A) = rank([A|b]) = n, the system has a unique solution
- If rank(A) < rank([A|b]), the system has no solution
- If rank(A) = rank([A|b]) < n, the system has infinitely many solutions
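These rank conditions can be checked numerically with np.linalg.matrix_rank; the classify helper and the small systems below are hypothetical examples constructed to hit each case:

```python
import numpy as np

def classify(A, b):
    """Classify a linear system Ax = b by comparing ranks (illustrative helper)."""
    aug = np.hstack([A, b.reshape(-1, 1)])           # augmented matrix [A|b]
    rA = np.linalg.matrix_rank(A)
    rAug = np.linalg.matrix_rank(aug)
    n = A.shape[1]
    if rA < rAug:
        return "no solution"
    return "unique solution" if rA == n else "infinitely many solutions"

A = np.array([[1.0, 2.0], [3.0, 4.0]])
print(classify(A, np.array([5.0, 6.0])))   # unique solution
B = np.array([[1.0, 2.0], [2.0, 4.0]])
print(classify(B, np.array([3.0, 6.0])))   # infinitely many solutions
print(classify(B, np.array([3.0, 7.0])))   # no solution
```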
Eigenvalues and Eigenvectors
For a square matrix A, an eigenvector v is a non-zero vector that, when multiplied by A, yields a scalar multiple of itself:
\[
Av = \lambda v
\]
The scalar λ is called an eigenvalue of A.
Finding Eigenvalues
Eigenvalues are found by solving the characteristic equation:
\[
\det(A - \lambda I) = 0
\]
This is a polynomial equation of degree n for an n × n matrix.
Finding Eigenvectors
For each eigenvalue λ, the corresponding eigenvectors v satisfy:
\[
(A - \lambda I)v = 0
\]
This is a homogeneous system of linear equations.
Properties and Applications
- The sum of the eigenvalues equals the trace of the matrix (sum of diagonal elements)
- The product of the eigenvalues equals the determinant of the matrix
- If A is symmetric, all eigenvalues are real
- If A is orthogonal, all eigenvalues have magnitude 1
- Eigenvalues and eigenvectors are used to diagonalize matrices: \(A = PDP^{-1}\), where D is diagonal
- They are crucial in solving systems of differential equations, analyzing vibrations, and in quantum mechanics, among many other physics applications
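As a concrete illustration of these properties (the matrix below is an arbitrary example, not from the text), np.linalg.eig returns eigenvalues and eigenvectors, and the trace, determinant, and diagonalization identities can be verified directly:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])        # symmetric, so the eigenvalues are real

eigvals, P = np.linalg.eig(A)     # columns of P are the eigenvectors
print(eigvals)                    # eigenvalues 3 and 1 (order may vary)

print(np.allclose(eigvals.sum(), np.trace(A)))        # sum of eigenvalues = trace
print(np.allclose(eigvals.prod(), np.linalg.det(A)))  # product of eigenvalues = determinant

# Diagonalization: A = P D P^{-1}
D = np.diag(eigvals)
print(np.allclose(A, P @ D @ np.linalg.inv(P)))       # True
```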
Applications in Physics
Coordinate Transformations
Matrices represent rotations, reflections, and other coordinate transformations:
\[
R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}
\]
This matrix rotates vectors counterclockwise by angle θ in the xy-plane.
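A small sketch applying the rotation matrix to a vector (purely illustrative):

```python
import numpy as np

def rotation(theta):
    """2D counterclockwise rotation matrix R(theta)."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

v = np.array([1.0, 0.0])
print(rotation(np.pi / 2) @ v)   # ~[0., 1.]: the x-axis rotated 90° onto the y-axis

# Rotation matrices are orthogonal with unit determinant: R^T R = I, det R = 1
R = rotation(0.3)
print(np.allclose(R.T @ R, np.eye(2)), np.isclose(np.linalg.det(R), 1.0))
```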
Coupled Oscillators
Systems of coupled oscillators are described by matrix equations of motion:
\[
M\ddot{x} = -Kx
\]
where M is the mass matrix and K is the stiffness matrix. The eigenvalues of \(M^{-1}K\) are the squared normal-mode frequencies \(\omega^2\), and the corresponding eigenvectors give the normal mode shapes.
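A minimal sketch of extracting normal-mode frequencies numerically, assuming a hypothetical two-mass chain with illustrative mass and spring values (not the system from the practice problems):

```python
import numpy as np

# Hypothetical two-mass chain: mass and stiffness matrices (illustrative values)
M = np.diag([1.0, 3.0])          # masses m1 = 1, m2 = 3
K = np.array([[3.0, -2.0],
              [-2.0, 2.0]])      # springs k1 = 1, k2 = 2

# M x'' = -K x  =>  omega^2 are the eigenvalues of M^{-1} K
omega_sq, modes = np.linalg.eig(np.linalg.inv(M) @ K)
omega = np.sqrt(omega_sq)
print(omega)     # normal-mode angular frequencies
print(modes)     # columns are the normal-mode shapes
```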
Quantum Mechanics
Matrices represent quantum mechanical operators. The time-independent Schrödinger equation,
\[
\hat{H}\psi = E\psi,
\]
is an eigenvalue problem where E (energy) is the eigenvalue of the Hamiltonian operator \(\hat{H}\) and the wavefunction \(\psi\) is the corresponding eigenvector (eigenstate).
Circuit Analysis
Kirchhoff's laws for complex circuits lead to systems of linear equations:
\[
GV = I
\]
where G is the conductance matrix, V is the vector of node voltages, and I is the vector of current sources.
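A sketch of nodal analysis for a hypothetical two-node resistive circuit (the component values are made up for illustration):

```python
import numpy as np

# Hypothetical two-node circuit: conductances in siemens
g1, g2, g3 = 0.5, 0.2, 1.0   # node1-to-ground, node1-to-node2, node2-to-ground

# Conductance matrix from Kirchhoff's current law at each node
G = np.array([[g1 + g2, -g2],
              [-g2, g2 + g3]])
I = np.array([2.0, 0.0])     # 2 A current source injected into node 1

V = np.linalg.solve(G, I)    # node voltages from G V = I
print(V)
```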
Key Insight:
Matrices provide a powerful framework for analyzing systems with multiple variables and constraints. In physics, they allow us to represent and solve complex problems involving multiple dimensions, coupled systems, and transformations. The eigenvalue problem, in particular, appears throughout physics whenever we seek the natural modes or stable states of a system.
Practice Problems
Test your understanding of matrices and linear systems with these practice problems:
- Find the eigenvalues and eigenvectors of the matrix \(A = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}\).
- Solve the system of equations: \(2x + y - z = 8\), \(x - y + z = 2\), \(x + y + z = 6\).
- Show that the matrix \(A = \begin{pmatrix} 0 & 1 & 0 \\ 0 & 0 & 1 \\ -6 & -11 & -6 \end{pmatrix}\) has characteristic polynomial \(p(\lambda) = -\lambda^3 - 6\lambda^2 - 11\lambda - 6\).
- Prove that similar matrices \(A\) and \(B = P^{-1}AP\) have the same eigenvalues.
- A mass-spring system consists of two masses connected by springs. The equations of motion are \(m_1\ddot{x}_1 = -k_1 x_1 + k_2(x_2 - x_1)\) and \(m_2\ddot{x}_2 = -k_2(x_2 - x_1)\). Write this in matrix form and find the normal mode frequencies if \(m_1 = m_2 = m\) and \(k_1 = k_2 = k\).