
I am reading the paper "Understanding dimensional collapse in contrastive self-supervised learning". The authors identify a dimensional collapse phenomenon, i.e. some dimensions of the embedding vectors collapse to zero. They show this by collecting the embedding vectors on the validation set. Each embedding vector has size $d = 128$; they compute the covariance matrix $C \in \mathbb{R}^{d \times d}$ of these vectors and then apply the singular value decomposition to the covariance matrix. They state that a number of singular values collapse to zero, and that these represent collapsed dimensions.
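
To make the setup concrete, here is a minimal sketch of the diagnostic as I understand it, using NumPy with random low-rank data standing in for the real embeddings (the sample size, the simulated rank, and the zero threshold are all hypothetical values of my own, not from the paper):

```python
import numpy as np

n, d = 10000, 128   # hypothetical number of validation embeddings, embedding size
rank = 80           # simulate collapse: embeddings confined to an 80-dim subspace
Z = np.random.randn(n, rank) @ np.random.randn(rank, d)

# Covariance matrix C in R^{d x d} of the (centered) embeddings.
Z_centered = Z - Z.mean(axis=0)
C = Z_centered.T @ Z_centered / n

# SVD of the covariance matrix; since C is symmetric positive semidefinite,
# its singular values equal its eigenvalues (variances along principal axes).
singular_values = np.linalg.svd(C, compute_uv=False)

# Trailing singular values are numerically zero: the collapsed dimensions.
print(np.sum(singular_values > 1e-10))   # prints 80 here, i.e. 48 collapsed dims
```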

Thus my questions are:

  1. What does the singular value decomposition of a covariance matrix represent?
  2. Why does a number of singular values of the covariance matrix collapsing to zero indicate that those dimensions of the embedding have collapsed?
  • Does this answer your question? [What is the intuitive relationship between SVD and PCA?](https://math.stackexchange.com/questions/3869/what-is-the-intuitive-relationship-between-svd-and-pca) – Kurt G. Apr 20 '22 at 17:51
  • Nope, the paper uses the SVD of the covariance matrix. – Noel Apr 22 '22 at 00:52
  • "The paper use the SVD of covariance". The title of your question is "What does the singular value decomposition of covariance matrix represent?" . I was under the impression that SVD is the abbreviation of singular value decomposition. Please tell me now why exactly your question is not a duplicate? – Kurt G. Apr 22 '22 at 04:19
  • The answer there says that the eigenvalue decomposition of the covariance matrix is equivalent to the SVD of the original matrix. But I am still confused about applying the SVD to the covariance matrix rather than to the original matrix. Thanks again. – Noel Apr 22 '22 at 08:10
  • Sure, the SVD is normally applied to the data matrix. Regarding your question: isn't PCA (diagonalization of the covariance matrix) just a special case of an SVD of it? (A numeric check of this equivalence is sketched below.) It is probably time to post a link to the paper you are reading. The images tell me nothing. – Kurt G. Apr 22 '22 at 08:39
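
As a side note on the equivalence discussed in the comments, here is a small numeric check (my own sketch, not from the paper or the linked answer; the sizes are arbitrary) that the SVD of the covariance matrix and the SVD of the centered data matrix carry the same information:

```python
import numpy as np

n, d = 500, 16                  # hypothetical sample count and dimension
X = np.random.randn(n, d)
X = X - X.mean(axis=0)          # center the data matrix

s_data = np.linalg.svd(X, compute_uv=False)            # SVD of the data matrix
s_cov = np.linalg.svd(X.T @ X / n, compute_uv=False)   # SVD of the covariance

# For centered X with covariance C = X^T X / n, the singular values of C
# are exactly the squared singular values of X divided by n.
print(np.allclose(s_cov, s_data**2 / n))   # True
```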

0 Answers