I have read that the singular values of any matrix $A$ are non-negative (e.g. wikipedia). Is there a reason why?

The first step toward the SVD of a matrix $A$ is typically to compute $A^{T}A$; the singular values are then the square roots of the eigenvalues of $A^{T}A$. The matrix $A^{T}A$ is certainly symmetric, and the eigenvalues of a symmetric matrix are always real. But why are these eigenvalues (and hence the singular values) always non-negative as well?
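(For the real case, this relationship is easy to verify numerically. The NumPy snippet below is just an illustration with a random matrix, not part of any particular SVD algorithm.)

```python
import numpy as np

# Illustration: the singular values of a real matrix A coincide with the
# square roots of the eigenvalues of A^T A. A is just a random example.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

sigma = np.linalg.svd(A, compute_uv=False)   # descending order
w = np.linalg.eigvalsh(A.T @ A)              # real, ascending order
sqrt_w = np.sqrt(np.maximum(w, 0))[::-1]     # clip round-off, make descending

print(np.allclose(sigma, sqrt_w))  # True
```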


4 Answers


I'm assuming that the matrix $A$ has real entries, or else you should be considering $A^*A$ instead.

If $A$ has real entries then $A^TA$ is positive semidefinite, since $$ \langle A^TAv,v\rangle=\langle Av,Av\rangle\geq 0$$ for all $v$. Therefore the eigenvalues of $A^TA$ are non-negative.
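Numerically, the identity $\langle A^TAv,v\rangle = \|Av\|^2$ can be checked directly (a small NumPy sketch; the matrix and vector are arbitrary random examples):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4))  # arbitrary real matrix

# <A^T A v, v> equals ||Av||^2, hence is non-negative, for any v
v = rng.standard_normal(4)
quad = v @ (A.T @ A) @ v
assert np.isclose(quad, (A @ v) @ (A @ v))  # same quantity, computed as ||Av||^2
assert quad >= 0

# Consequently every eigenvalue of A^T A is non-negative (up to round-off)
w = np.linalg.eigvalsh(A.T @ A)
print((w >= -1e-12).all())  # True
```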

  • This shows that the eigenvalues of $A^TA$ are non-negative, but what about the singular values of $A$? Do you know if the fact that the singular values of $A$ are the (non-negative) square roots of the eigenvalues of $A^HA$ holds for $A$ with complex entries? – Learn_and_Share Oct 25 '17 at 08:42
  • Yes, the singular values of $A$ are the square roots of the eigenvalues of $A^*A$. – carmichael561 Oct 25 '17 at 15:11
  • @MedNait Why are they chosen to be the positive square roots? – Undertherainbow Feb 19 '18 at 15:25
  • @Undertherainbow I think I read somewhere that it's a convention, since technically you could also choose the singular values of $A$ to be the negative square roots of the eigenvalues of $A^HA$. – Learn_and_Share Feb 19 '18 at 20:50

Assume that $A$ is real for simplicity. The set of (orthogonal, diagonal, orthogonal) triples $(U, \Sigma, V)$ such that $A = U \Sigma V^T$ is not unique. Indeed, if $A = U \Sigma V^T$, then also

$$ A = (-U)(-\Sigma)V^T = U (-\Sigma)(-V^T) = (UD_1)(D_1 \Sigma D_2)(V D_2)^T$$ for any diagonal matrices $D_1$ and $D_2$ with only $1$ or $-1$ on the diagonal. Therefore, the positivity of the singular values is purely conventional.
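A quick numerical confirmation of this sign-flip construction (NumPy, random $A$; the particular $\pm 1$ patterns in $D_1$ and $D_2$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)  # A = U @ diag(s) @ Vt
Sigma = np.diag(s)

# Diagonal matrices with arbitrary +/-1 entries
D1 = np.diag([1.0, -1.0, 1.0])
D2 = np.diag([-1.0, 1.0, -1.0])

# (U D1)(D1 Sigma D2)(V D2)^T still equals A, even though
# the middle factor D1 @ Sigma @ D2 has negative diagonal entries
V2 = Vt.T @ D2
A2 = (U @ D1) @ (D1 @ Sigma @ D2) @ V2.T

print(np.allclose(A, A2))                        # True
print((np.diag(D1 @ Sigma @ D2) < 0).any())      # True
```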

Roberto Rastapopoulos

Suppose $T \in \mathcal{L}(V)$, i.e., $T$ is a linear operator on the vector space $V$. Then the singular values of $T$ are the eigenvalues of the positive operator $\sqrt{T^* \; T}$. The eigenvalues of a positive operator are non-negative.

  • Why is $\sqrt{T^* \; T}$ a positive operator? Consider $S = T^* \; T$. Then $S^* = (T^* \; T)^* = T^*\;(T^*)^* = T^*\;T=S$, and hence $S$ is self-adjoint. Also, $\langle Sv, v \rangle = \langle T^*\,T v, v \rangle = \langle Tv, Tv \rangle \geq 0$ for every $v \in V$. Hence $S$ is positive. Now every positive operator has a unique positive square root, which, for $S$, I am denoting with $\sqrt{T^* \; T}$.
  • Why are the eigenvalues of a positive operator non-negative? If $S$ is a positive operator, then $ 0 \leq \langle Sv, v \rangle = \langle \lambda v, v \rangle = \lambda \langle v, v \rangle$, and thus $\lambda$ is non-negative.
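For matrices this can be made concrete: the unique positive square root of $T^*T$ can be built from its eigendecomposition, and its eigenvalues are exactly the singular values. (A NumPy sketch with a random example standing in for $T$:)

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))  # plays the role of T on R^4

# Unique positive square root of T^* T, built from its eigendecomposition:
# if T^* T = V diag(w) V^T with w >= 0, then sqrt(T^* T) = V diag(sqrt(w)) V^T
w, V = np.linalg.eigh(A.T @ A)
sqrt_S = V @ np.diag(np.sqrt(np.maximum(w, 0))) @ V.T

# Its eigenvalues are the singular values of A
eigs = np.sort(np.linalg.eigvalsh(sqrt_S))[::-1]
sigma = np.linalg.svd(A, compute_uv=False)

print(np.allclose(sqrt_S @ sqrt_S, A.T @ A))  # True: it really is a square root
print(np.allclose(eigs, sigma))               # True
```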

I think your question is very interesting. Take some nonzero singular value $\sigma_i$. We can reverse its sign: $-\sigma_i = -\sqrt{\lambda_i^2} = -\lambda_i$, where $\lambda_i^2$ is an eigenvalue of $A^T A$ corresponding to an eigenvector $v_i$, i.e., $A^T A v_i = \lambda_i^2 v_i$. What stops us from writing $A^T A (-v_i) = \lambda_i^2 (-v_i)$ instead? This means we can reverse the sign of a singular value, provided we also go to the matrix $V$ and reverse the sign of the corresponding eigenvector column.

Hence, there is not a unique way to write $A = U \Sigma V^T$. But if we require all $\sigma_i$ to be non-negative and sorted from largest to smallest, then $\Sigma$ is uniquely determined. (Without the sorting there would be many possibilities: permuting any two singular values, together with the corresponding columns of $U$ and $V$, gives another valid decomposition.)
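The column-flipping argument is easy to check numerically (a NumPy sketch on a random matrix, flipping $\sigma_1$ together with the column $v_1$ of $V$):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((3, 3))
U, s, Vt = np.linalg.svd(A)  # A = U @ diag(s) @ Vt

# Flip the sign of sigma_1 and, simultaneously, of the column v_1 of V
# (the rows of Vt are the columns of V)
s2 = s.copy()
s2[0] = -s2[0]
Vt2 = Vt.copy()
Vt2[0, :] = -Vt2[0, :]

# The product is unchanged: (-sigma_1) u_1 (-v_1)^T = sigma_1 u_1 v_1^T
print(np.allclose(A, U @ np.diag(s2) @ Vt2))  # True
```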

Herman Jaramillo