# Review, Chapters 6 and 7

• Norms, inner products, orthogonality, orthogonal complements as subspaces.

• Given a subspace W of a vector space: any vector can be decomposed uniquely as the sum of a vector in W and a vector in the orthogonal complement (Orthogonal Decomposition Theorem). Orthogonal sets, orthogonal bases, orthogonal matrices (with orthonormal columns!). Projections.
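A small numpy sketch of the Orthogonal Decomposition Theorem (the vectors here are illustrative choices, not from the notes):

```python
import numpy as np

# W = span of two orthogonal vectors u1, u2 in R^3 (illustrative example)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, -1.0, 0.0])
b = np.array([2.0, 3.0, 5.0])

# Projection of b onto W via the orthogonal-basis formula:
# b_hat = (b.u1 / u1.u1) u1 + (b.u2 / u2.u2) u2
b_hat = (b @ u1) / (u1 @ u1) * u1 + (b @ u2) / (u2 @ u2) * u2
z = b - b_hat  # the component of b in the orthogonal complement of W

# The unique decomposition: b = b_hat + z, with z orthogonal to W
print(b_hat)           # [2. 3. 0.]
print(z)               # [0. 0. 5.]
print(z @ u1, z @ u2)  # both (numerically) zero
```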

• Least-squares solution: the right-hand side of Ax=b is not in the column space of A, which means the system is inconsistent. We find the best approximate solution x̂, which gives us the vector in the column space closest to b.

The least-squares problem is solved by turning the rectangular system into a square one (which we hope is invertible) and solving that. This leads to a symmetric matrix system, the normal equations: ATA x̂ = ATb.
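The step above can be sketched in numpy (the matrix and right-hand side are illustrative; `np.linalg.lstsq` would do the same job more robustly):

```python
import numpy as np

# A tall system: 3 equations, 2 unknowns, b not in Col(A), so Ax = b is inconsistent
A = np.array([[1.0, 0.0], [1.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0, 3.0])

# Normal equations: A^T A x_hat = A^T b  (A^T A is square and symmetric)
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

# A @ x_hat is the point of Col(A) closest to b;
# the residual b - A @ x_hat is orthogonal to every column of A
residual = b - A @ x_hat
print(x_hat)
print(A.T @ residual)  # (numerically) the zero vector
```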

• Symmetric matrices:
• real eigenvalues
• no "missing" eigenspaces: for each eigenvalue, the geometric multiplicity equals the algebraic multiplicity
• eigenspaces mutually orthogonal (meaning the unit eigenvectors form a basis like the standard one, only rotated/reflected)
• "orthogonally diagonalizable" (say it three times fast): A = PDPT with P orthogonal
• Have a nice spectral decomposition, which writes A as a sum of projection operators scaled by the eigenvalues.
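The symmetric-matrix facts above, checked numerically on a small illustrative example:

```python
import numpy as np

# A small symmetric matrix (illustrative choice)
A = np.array([[2.0, 1.0], [1.0, 2.0]])

# eigh is the symmetric/Hermitian eigensolver: real eigenvalues,
# orthonormal eigenvectors, eigenvalues returned in ascending order
lam, P = np.linalg.eigh(A)

# Orthogonally diagonalizable: A = P D P^T with P orthogonal
assert np.allclose(P @ np.diag(lam) @ P.T, A)
assert np.allclose(P.T @ P, np.eye(2))  # orthonormal columns

# Spectral decomposition: A = sum_i lam_i * (u_i u_i^T), a sum of projections
A_rebuilt = sum(lam[i] * np.outer(P[:, i], P[:, i]) for i in range(2))
print(np.allclose(A_rebuilt, A))  # True
```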

• Quadratic forms are built on symmetric matrices
• Coordinates can be changed so that the equation becomes simple, "diagonal" — no cross-product terms (Principal Axes Theorem).
• positive definite: all eigenvalues strictly positive
• positive semi-definite: all eigenvalues greater than or equal to zero
• negative definite: all eigenvalues strictly negative
• negative semi-definite: all eigenvalues less than or equal to zero
• indefinite: both positive and negative eigenvalues
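The five-way classification above can be sketched as a small helper (the function name `classify` is my own, not standard terminology):

```python
import numpy as np

def classify(A):
    """Classify the quadratic form x^T A x by the eigenvalues of symmetric A."""
    lam = np.linalg.eigvalsh(A)  # real eigenvalues, since A is symmetric
    if np.all(lam > 0):
        return "positive definite"
    if np.all(lam >= 0):
        return "positive semi-definite"
    if np.all(lam < 0):
        return "negative definite"
    if np.all(lam <= 0):
        return "negative semi-definite"
    return "indefinite"

print(classify(np.array([[2.0, 0.0], [0.0, 3.0]])))   # positive definite
print(classify(np.array([[1.0, 0.0], [0.0, -2.0]])))  # indefinite
```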

• Constrained optimization: maximizing (and minimizing) a quadratic form xTAx subject to x lying on the unit sphere, ||x|| = 1.
• The extrema fall on the eigenvectors: the maximum is the largest eigenvalue, attained at its unit eigenvector(s); the minimum is the smallest eigenvalue, attained at its unit eigenvector(s).
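A quick numerical check of the constrained-optimization fact (the matrix is an illustrative choice):

```python
import numpy as np

# Q(x) = x^T A x subject to ||x|| = 1, for a symmetric A
A = np.array([[3.0, 1.0], [1.0, 3.0]])
lam, P = np.linalg.eigh(A)  # eigenvalues ascending, orthonormal eigenvectors

u_min, u_max = P[:, 0], P[:, -1]  # unit eigenvectors for smallest/largest eigenvalue

# Q at the extreme eigenvectors equals the extreme eigenvalues
print(u_max @ A @ u_max, lam[-1])  # both ~4: the constrained maximum
print(u_min @ A @ u_min, lam[0])   # both ~2: the constrained minimum
```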

• The Singular Value Decomposition
• The Fundamental Theorem of Linear Algebra (Gil Strang)
• That any matrix A with real components can be decomposed into the matrix product A = UΣVT, where the columns of U and V are eigenvectors of AAT and ATA, respectively, and Σ is (roughly speaking) "diagonal", holding the singular values: the square roots of the (shared nonzero) eigenvalues of AAT and ATA.
• This can also be considered a sum of rank-one outer products, A = σ1u1v1T + σ2u2v2T + ... + σrurvrT, which is the better way to think of the SVD if you're working with images (or statistical analyses such as Principal Components Analysis, Factor Analysis, etc.)
• The SVD pulls together and demonstrates many concepts of the course:
• matrix dimensions
• rank
• transpose
• matrix product
• norm
• inner and outer products
• unit vectors
• diagonal matrix
• singular matrix
• condition number
• orthogonal matrices
• basis
• row and column spaces
• null space
• consistent and inconsistent systems of equations
• subspace
• symmetric matrices
• partitioned matrices (for )
• eigenvalues and eigenvectors
• orthogonal diagonalization
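Many of the SVD facts above can be verified on a small illustrative matrix:

```python
import numpy as np

# A rectangular real matrix (illustrative choice)
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])

# Reduced SVD: A = U S V^T with orthonormal columns in U and V
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Singular values are the square roots of the eigenvalues of A A^T (and A^T A)
lam = np.linalg.eigvalsh(A @ A.T)
assert np.allclose(np.sort(s**2), np.sort(lam))

# Sum form: A = sum_i s_i * (u_i v_i^T). Truncating this sum gives the best
# low-rank approximation -- the idea behind image compression and PCA.
A_sum = sum(s[i] * np.outer(U[:, i], Vt[i, :]) for i in range(len(s)))
print(np.allclose(A_sum, A))  # True
```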