![[left pseudo matrix.mp3]]
The left pseudo-inverse exists for matrices with full column rank (so $A$ must be tall or square).
We denote it by $A^\dagger$ ("A dagger"). It satisfies $A^\dagger A = I$.
One way to compute it is $A^\dagger = (A^TA)^{-1}A^T$ (for full-column-rank $A$, this is the Moore-Penrose pseudo-inverse).
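A minimal sketch of the normal-equations formula in NumPy, using a small tall matrix chosen here as an example:

```python
import numpy as np

# A is 3x2 with full column rank (tall), so the left pseudo-inverse exists.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Normal-equations formula: A_dagger = (A^T A)^{-1} A^T, shape (2, 3).
A_dagger = np.linalg.inv(A.T @ A) @ A.T

# Left-inverse property: A_dagger @ A = I.
print(np.allclose(A_dagger @ A, np.eye(2)))   # True
```

Note it is only a *left* inverse: `A @ A_dagger` is a projection onto the column space of $A$, not the identity.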
Another way to compute it is through the SVD (singular value decomposition), which is more numerically stable:
![[Pasted image 20221105111004.png]]
$$A = U\Sigma V^T$$
where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a (rectangular) diagonal matrix of singular values. The pseudo-inverse is then given by:
$$A^\dagger = V\Sigma^{+}U^T$$
where $\Sigma^{+}$ is obtained by taking the reciprocal of each non-zero singular value in $\Sigma$ and transposing the resulting matrix. (Since $A$ has full column rank, all of its singular values are non-zero.)
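A sketch of the SVD route. Using the *reduced* SVD (`full_matrices=False`) makes $\Sigma$ a square invertible diagonal matrix, so no transposing step is needed; the example matrix is an assumption for illustration:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Reduced SVD: U is 3x2, s holds the 2 singular values, Vt is 2x2.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# A_dagger = V @ diag(1/s) @ U^T; all s are nonzero for full column rank.
A_dagger = Vt.T @ np.diag(1.0 / s) @ U.T

# Agrees with NumPy's built-in (also SVD-based) pseudo-inverse.
print(np.allclose(A_dagger, np.linalg.pinv(A)))   # True
```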
The least-squares problem the pseudo-inverse solves can be formulated as:
$$\min_x \|Ax - b\|_2^2$$
The solution to this minimization problem, often called the least squares solution, can be found using the normal equations:
$$A^TAx = A^Tb$$
The matrix $A^TA$ is guaranteed to be invertible if $A$ has full column rank, and the least squares solution is given by:
$$x = (A^TA)^{-1}A^Tb$$
This solution is equivalent to multiplying both sides of the equation $Ax = b$ by the left pseudo-inverse $A^\dagger$ of $A$, which yields:
$$A^\dagger Ax = A^\dagger b \Rightarrow x = A^\dagger b$$ ⚠️ This can be a bit confusing: the original equation $Ax=b$ may have no solution at all. In that case, the solution of the new equation $A^\dagger Ax = A^\dagger b$ is NOT a solution of the original equation; it is only the minimizer of $\|Ax - b\|_2^2$.
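A small sketch of exactly this situation, cross-checked against `np.linalg.lstsq` (the matrix and right-hand side are made-up examples):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 1.0, 0.0])   # overdetermined and inconsistent: no exact solution

# Least-squares solution via the pseudo-inverse.
x_pinv = np.linalg.pinv(A) @ b

# Same answer from NumPy's dedicated least-squares solver.
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_pinv, x_lstsq))   # True
print(np.allclose(A @ x_pinv, b))     # False: x minimizes the residual, Ax != b
```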
The computation of $(A^TA)^{-1}A^T$ can be numerically unstable if $A^TA$ is ill-conditioned. Since $\kappa(A^TA) = \kappa(A)^2$, this happens whenever the columns of $A$ are close to linearly dependent (the Hilbert matrix is a classic example). In such cases, the SVD-based method for computing the pseudo-inverse provides a more numerically stable solution.
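The squaring of the condition number can be seen directly on a tall Hilbert-style matrix (the size is an arbitrary choice for illustration):

```python
import numpy as np

# Tall Hilbert-style matrix A[i, j] = 1 / (i + j + 1), shape 10x6:
# its columns are nearly linearly dependent, so it is badly conditioned.
A = 1.0 / (np.arange(10)[:, None] + np.arange(6)[None, :] + 1.0)

c_A   = np.linalg.cond(A)
c_AtA = np.linalg.cond(A.T @ A)

print(c_A)     # large
print(c_AtA)   # roughly c_A**2: the normal equations are far worse
```

This is why forming $A^TA$ explicitly loses roughly twice as many digits of accuracy as working with $A$ itself through the SVD.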