Symmetric Matrices, SVD, and PCA
Session Material
Lay: Sections 7.1, 7.4–7.5
Session Description
This session dives into some advanced and incredibly useful matrix factorizations. We start with a special class of matrices – the "symmetric matrices" (those satisfying A = Aᵀ) – and discover that they have remarkable properties; in particular, they can always be "orthogonally diagonalized" (meaning we can find an orthonormal basis of eigenvectors). This fundamental result is captured by the "Spectral Theorem."
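To see orthogonal diagonalization concretely, here is a minimal NumPy sketch (the matrix A is an illustrative example, not taken from the session materials):

```python
import numpy as np

# An illustrative symmetric matrix (A equals its transpose).
A = np.array([[6.0, 2.0],
              [2.0, 3.0]])

# eigh is specialized for symmetric/Hermitian matrices: it returns real
# eigenvalues and an orthonormal set of eigenvectors (the columns of P).
eigenvalues, P = np.linalg.eigh(A)
D = np.diag(eigenvalues)

print(np.allclose(P.T @ P, np.eye(2)))  # True: P is orthogonal
print(np.allclose(A, P @ D @ P.T))      # True: A = P D Pᵀ
```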
Then, we move to a factorization that applies to any matrix, not just square or symmetric ones: the "Singular Value Decomposition" (SVD). We'll learn about "singular values" and "singular vectors" and how they provide a powerful way to understand the structure and geometric action of any linear transformation. Finally, we'll explore some of the many real-world "applications of SVD," seeing how this decomposition is essential in areas like data analysis (notably Principal Component Analysis, PCA), image processing (for compression), and even in finding best least-squares solutions to linear systems using the "pseudoinverse."
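As a hedged sketch of these ideas (the matrix and right-hand side below are illustrative, not from the session materials), the following NumPy snippet computes the SVD of a rectangular matrix, forms a rank-1 approximation of the kind used in image compression, and solves a least-squares problem with the pseudoinverse:

```python
import numpy as np

# SVD applies to any m x n matrix; this one is 2 x 3.
M = np.array([[ 3.0, 1.0, 1.0],
              [-1.0, 3.0, 1.0]])

U, s, Vt = np.linalg.svd(M)  # M = U @ Sigma @ Vt
print(s)                     # singular values, largest first

# Rank-1 approximation: keep only the largest singular value.
# This is the idea behind SVD-based image compression.
M1 = s[0] * np.outer(U[:, 0], Vt[0, :])
print(np.linalg.norm(M - M1, 2))  # spectral-norm error = dropped value s[1]

# Pseudoinverse: the minimum-norm least-squares solution of M x = b.
b = np.array([1.0, 2.0])
x = np.linalg.pinv(M) @ b
print(np.allclose(M @ x, b))  # True here because M has full row rank
```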
Key Concepts
- Symmetric Matrices
- Orthogonal Diagonalization
- Spectral Theorem
- Singular Value Decomposition (SVD)
- Applications of SVD
- Pseudoinverse
Learning Objectives
- Identify and analyze properties of symmetric matrices and their diagonalization.
- Apply the Spectral Theorem to orthogonally diagonalize matrices.
- Perform and interpret Singular Value Decomposition (SVD) for general matrices.
- Relate singular values and vectors to the structure of linear transformations.
- Apply SVD and the pseudoinverse to solve problems in data analysis and engineering.
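As a taste of the data-analysis objective above (and of the PCA in the session title), here is a hedged PCA-via-SVD sketch; the randomly generated data is purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))  # illustrative data: 100 samples, 3 features

# PCA via SVD: center the columns, then decompose.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; squared singular values
# (scaled by n - 1) give the variance explained by each direction.
explained_variance = s**2 / (len(X) - 1)
print(explained_variance)

# Dimensionality reduction: project onto the top two principal components.
X2 = Xc @ Vt[:2].T  # shape (100, 2)
```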
Exercises
All SVD problems from the Exam Cases (2018–s2020).
If you do not finish in class, work on them at home.