Matrix factorization
A matrix factorization/decomposition is an exact multiplicative representation of a matrix $A$ as a product of matrices, in contrast to (matrix) composition, where we multiply matrices to get a single matrix. Both forms express the same transformation:

$$A = A_1 A_2 \cdots A_k$$

where $A_1, \dots, A_k$ are matrices which usually have special structure (e.g. triangular, orthogonal, diagonal) that makes them easier/more efficient to work with, simplifies computation, or reveals properties of the original matrix $A$.
Examples:
- singular value decomposition (most general, works for any matrix)
- eigendecomposition (for square matrices with a full set of linearly independent eigenvectors)
- Cholesky decomposition (for symmetric positive definite matrices)
- LU decomposition (for square matrices)
- …
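To make this concrete, here is a minimal sketch (NumPy/SciPy, a small random matrix, and assert-based checks, all chosen purely for illustration) that computes each of the decompositions above and verifies that multiplying the factors back together reconstructs the original matrix:

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# SVD: A = U @ diag(s) @ Vt (works for any matrix, square or not)
U, s, Vt = np.linalg.svd(A)
assert np.allclose(A, U @ np.diag(s) @ Vt)

# Eigendecomposition: A = V @ diag(w) @ V^{-1} (square, diagonalizable)
w, V = np.linalg.eig(A)
assert np.allclose(A, (V @ np.diag(w) @ np.linalg.inv(V)).real)

# Cholesky: S = L @ L.T (symmetric positive definite)
S = A @ A.T + 4 * np.eye(4)   # build an SPD matrix for the example
L = np.linalg.cholesky(S)
assert np.allclose(S, L @ L.T)

# LU: A = P @ L @ U (square; P is a permutation matrix)
P, Lo, Up = lu(A)
assert np.allclose(A, P @ Lo @ Up)
```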
A low-rank approximation of a matrix via learned matrix factorization.
Given $A \in \mathbb{R}^{m \times n}$, choose a rank $k \ll \min(m, n)$ and learn $W \in \mathbb{R}^{m \times k}$, $H \in \mathbb{R}^{k \times n}$ such that $A \approx WH$.
Typically fit by minimizing a regularized loss over the observed entries $\Omega$:

$$\min_{W, H} \sum_{(i, j) \in \Omega} \left(A_{ij} - w_i^\top h_j\right)^2 + \lambda \left(\lVert W \rVert_F^2 + \lVert H \rVert_F^2\right)$$

where $w_i$ is the $i$-th row of $W$ and $h_j$ is the $j$-th column of $H$.
Unlike plain SVD, MF handles sparsity/missing data and adds regularization.
This makes it well suited to recommender systems, where the user-item ratings matrix is mostly unobserved.
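A minimal sketch of the learned version, assuming a toy ratings matrix with missing entries and fitting $W$ and $H$ by SGD on the regularized squared loss above; the rank, regularization strength, learning rate, and epoch count are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy user-item ratings; np.nan marks unobserved entries.
A = np.array([
    [5.0, 3.0, np.nan, 1.0],
    [4.0, np.nan, np.nan, 1.0],
    [1.0, 1.0, np.nan, 5.0],
    [np.nan, 1.0, 5.0, 4.0],
])
m, n = A.shape
k, lam, lr, epochs = 2, 0.05, 0.02, 2000   # rank, regularization, step size, passes

W = 0.1 * rng.standard_normal((m, k))
H = 0.1 * rng.standard_normal((k, n))
observed = [(i, j) for i in range(m) for j in range(n) if not np.isnan(A[i, j])]

for _ in range(epochs):
    for i, j in observed:
        err = A[i, j] - W[i] @ H[:, j]
        # Gradient step on (A_ij - w_i^T h_j)^2 + lam * (||w_i||^2 + ||h_j||^2)
        W[i] += lr * (err * H[:, j] - lam * W[i])
        H[:, j] += lr * (err * W[i] - lam * H[:, j])

print(np.round(W @ H, 2))   # approximate reconstruction, including the missing cells
```

Alternating least squares is a common alternative to SGD for the same loss; either way, only the observed entries enter the objective, which is what makes this workable on large sparse ratings matrices.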