I am reading the paper "Understanding dimensional collapse in contrastive self-supervised learning". The authors identify a dimensional collapse phenomenon, i.e. some dimensions of the embedding vectors collapse to zero. They show this by collecting the embedding vectors on the validation set (each embedding vector has size $d=128$), computing the covariance matrix $C\in\mathbb{R}^{d\times d}$, and then applying the **singular value decomposition** to the **covariance matrix**. They observe that a number of singular values collapse to zero, which they interpret as collapsed dimensions.
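To make sure I understand the procedure correctly, here is a minimal NumPy sketch of what I believe they compute (the embeddings here are random placeholder data, not the paper's actual model outputs; with random full-rank data no singular value should collapse):

```python
import numpy as np

# Placeholder for embeddings collected on the validation set:
# N vectors of dimension d = 128 (random data, just for illustration).
rng = np.random.default_rng(0)
N, d = 10000, 128
Z = rng.standard_normal((N, d))

# Covariance matrix C in R^{d x d} of the (centered) embeddings.
Z_centered = Z - Z.mean(axis=0)
C = Z_centered.T @ Z_centered / N

# SVD of the covariance matrix. Since C is symmetric positive
# semi-definite, its singular values coincide with its eigenvalues.
U, S, Vt = np.linalg.svd(C)

# S is sorted in descending order; a "collapsed" dimension would
# show up as a singular value (near) zero at the tail of S.
print(S[:5], S[-5:])
```

Is this the computation the paper describes?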

Thus my questions are:

- What does the singular value decomposition of the covariance matrix represent?
- Why does a number of singular values of the covariance matrix collapsing to zero indicate that the corresponding embedding dimensions have collapsed?