The Data Science Lab

Matrix Inverse from Scratch Using SVD Decomposition with C#

The Helper Functions
The purpose of most of the 11 helper functions should be clear from their names or by examining the source code. The Hypot(x, y) function computes sqrt(x^2 + y^2) in a clever way that avoids arithmetic overflow and underflow errors. The VecNorm(v) function computes the norm of a vector, which is the square root of the sum of the squared elements of vector v. The VecDot(v1, v2) function computes the dot product of vectors v1 and v2, which is the sum of the products of corresponding elements of v1 and v2.
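These three helpers are short. A minimal sketch, assuming vectors are stored as C# double[] arrays and a using System; directive is in place (the exact signatures in the demo program may differ), is:

static double Hypot(double x, double y)
{
  // sqrt(x^2 + y^2) without intermediate overflow or underflow
  double xa = Math.Abs(x);
  double ya = Math.Abs(y);
  if (xa > ya) {
    double r = ya / xa;
    return xa * Math.Sqrt(1.0 + r * r);
  }
  else if (ya != 0.0) {
    double r = xa / ya;
    return ya * Math.Sqrt(1.0 + r * r);
  }
  return 0.0;  // both x and y are zero
}

static double VecNorm(double[] v)
{
  // square root of the sum of the squared elements
  double sum = 0.0;
  for (int i = 0; i < v.Length; ++i)
    sum += v[i] * v[i];
  return Math.Sqrt(sum);
}

static double VecDot(double[] v1, double[] v2)
{
  // sum of the products of corresponding elements
  double result = 0.0;
  for (int i = 0; i < v1.Length; ++i)
    result += v1[i] * v2[i];
  return result;
}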

The special MatInvFromVec(s) function converts a vector s into a diagonal matrix S and then returns the inverse of S. For example, if s = (4.00, 0.20, 1.00) then S =

 4.00  0.00  0.00
 0.00  0.20  0.00
 0.00  0.00  1.00

and then the inverse of S =

 0.25  0.00  0.00
 0.00  5.00  0.00
 0.00  0.00  1.00

An alternative design is to define a separate MatFromVec() function and a MatInvDiag() function.
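A minimal sketch of MatInvFromVec(), assuming matrices are stored as array-of-arrays style double[][] and that no element of s is zero (the actual demo code may be organized differently), is:

static double[][] MatInvFromVec(double[] s)
{
  // inverse of the diagonal matrix whose diagonal entries are s
  int n = s.Length;
  double[][] result = new double[n][];
  for (int i = 0; i < n; ++i)
    result[i] = new double[n];    // off-diagonal cells stay 0.0
  for (int i = 0; i < n; ++i)
    result[i][i] = 1.0 / s[i];    // reciprocal of each diagonal element
  return result;
}

For example, calling MatInvFromVec(new double[] { 4.00, 0.20, 1.00 }) produces the 3-by-3 matrix shown above with diagonal (0.25, 5.00, 1.00).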

Wrapping Up
Inverting a matrix using SVD decomposition is a good, general-purpose approach. For special types of matrices, there are different algorithms that are more efficient. If the source matrix is 3-by-3 or smaller, then it's possible to compute the inverse using brute force. If you need the determinant of the source matrix, then using LUP decomposition (Crout or Doolittle algorithm) is useful. If the source matrix is a covariance matrix, then using Cholesky decomposition is more efficient.
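For instance, a 2-by-2 inverse can be written directly from the closed-form cofactor formula; the 3-by-3 case is longer but follows the same pattern. A hypothetical sketch (not part of the demo program):

static double[][] MatInverse2x2(double[][] m)
{
  // for [[a, b], [c, d]] the inverse is (1 / (ad - bc)) * [[d, -b], [-c, a]]
  double det = m[0][0] * m[1][1] - m[0][1] * m[1][0];
  // a production version should check that det is not (nearly) zero
  double[][] result = new double[2][];
  result[0] = new double[] {  m[1][1] / det, -m[0][1] / det };
  result[1] = new double[] { -m[1][0] / det,  m[0][0] / det };
  return result;
}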

The implementation of SVD-Jacobi matrix inverse presented in this article emphasizes simplicity and ease-of-modification over robustness and performance. Computing the inverse of a matrix, regardless of the algorithm used, can fail in many ways, so in a production system you must add extensive error checking.

It's not feasible to list all the machine learning algorithms that use matrix inverses -- there are just too many. That said, examples of ML algorithms that use a matrix inverse include linear regression, logistic regression, Gaussian mixture model clustering, Gaussian process regression and kernel ridge regression.
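For example, ordinary least squares linear regression can compute its coefficient vector in closed form using the normal equations:

 b = inv(Xt * X) * Xt * y

where X is the matrix of predictor values (with a leading column of 1s for the bias term), Xt is the transpose of X and y is the vector of target values.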



About the Author

Dr. James McCaffrey works for Microsoft Research in Redmond, Wash. He has worked on several Microsoft products including Azure and Bing. James can be reached at [email protected].
