I am interested in finding/understanding a "good" $L^2$-orthogonal (i.e. uncorrelated) decomposition of a matrix-valued quadratic form. The setup is as follows:

Given a random matrix $X$ (tall, of size $n\times N$) whose columns $X_i$ have mean zero, consider $$Q(X) = X'AX.$$ The covariance structure of $X$ can be encoded in a block matrix $\Sigma>0$ given as $$ \Sigma = \begin{bmatrix} \Sigma_{11} & \Sigma_{12} & \dots& \Sigma_{1N}\\ \Sigma_{12}' & \Sigma_{22} & \dots& \\ \vdots & & \ddots & \end{bmatrix}. $$ The blocks are of size $n\times n$, corresponding to the length of the vectors $X_i$.
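To make the setup concrete, here is a small numerical sketch (with hypothetical sizes $n=2$, $N=3$): it builds a positive-definite $\Sigma$ of size $nN\times nN$, samples $\mathrm{vec}(X)\sim N(0,\Sigma)$, and forms $Q(X)=X'AX$.

```python
import numpy as np

# Hypothetical small sizes: N = 3 vectors X_i, each of length n = 2.
rng = np.random.default_rng(0)
n, N = 2, 3

# Random positive-definite covariance Sigma of size nN x nN;
# its n x n blocks Sigma_{ij} encode Cov(X_i, X_j).
G = rng.standard_normal((n * N, n * N))
Sigma = G @ G.T + n * N * np.eye(n * N)  # strictly positive definite

# Sample vec(X) ~ N(0, Sigma) via the symmetric square root of Sigma.
vals, vecs = np.linalg.eigh(Sigma)
sqrt_Sigma = vecs @ np.diag(np.sqrt(vals)) @ vecs.T
x = sqrt_Sigma @ rng.standard_normal(n * N)

# Column-major reshape: the columns of X are the vectors X_1, ..., X_N.
X = x.reshape((n, N), order="F")

# The matrix-valued quadratic form Q(X) = X'AX, for a symmetric A.
B = rng.standard_normal((n, n))
A = (B + B.T) / 2
Q = X.T @ A @ X
```

Note that $Q$ is $N\times N$ and symmetric, with entries $Q_{jk} = X_j'AX_k$.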

Ideally, I would like to represent $Q$ as $$ Q(X) = X'AX = \sum_i U_i'\Lambda_i U_i, $$ where the covariance structure of the $U_i$ is at least block-diagonal: $$ \Lambda = \begin{bmatrix} \Lambda_1 & 0 & 0&\dots\\ 0 & \Lambda_2 & 0 &\dots\\ 0 & 0 & \ddots & \end{bmatrix}. $$ Is this possible? If so, I am also interested in the relationship between the singular values of the blocks of $\Sigma$ and those of $\Lambda$.

I suspect something like this is achievable, given that we can do it for ordinary (scalar-valued) quadratic forms: sum of squares of dependent Gaussian random variables.
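For reference, here is the scalar-valued argument in code form, as a sketch: for $x\sim N(0,\Sigma)$ and symmetric $A$, write $x = \Sigma^{1/2}z$ with $z$ standard normal, eigendecompose $\Sigma^{1/2}A\Sigma^{1/2} = V\,\mathrm{diag}(\lambda)\,V'$, and set $w = V'z$ (still i.i.d. standard normal), so that $x'Ax = \sum_i \lambda_i w_i^2$ holds pathwise.

```python
import numpy as np

# Scalar-valued case: x ~ N(0, Sigma) in R^m, with x'Ax decomposed as a
# weighted sum of squares of *independent* standard normals.
rng = np.random.default_rng(1)
m = 4

G = rng.standard_normal((m, m))
Sigma = G @ G.T + m * np.eye(m)  # positive definite
B = rng.standard_normal((m, m))
A = (B + B.T) / 2                # symmetric

# Symmetric square root of Sigma.
vals, vecs = np.linalg.eigh(Sigma)
S_half = vecs @ np.diag(np.sqrt(vals)) @ vecs.T

# Eigendecompose S_half A S_half = V diag(lam) V'.
lam, V = np.linalg.eigh(S_half @ A @ S_half)

# z ~ N(0, I); then x = S_half z ~ N(0, Sigma), and w = V'z is again
# standard normal with independent coordinates (V is orthogonal).
z = rng.standard_normal(m)
x = S_half @ z
w = V.T @ z

# Pathwise identity: x'Ax = sum_i lam_i * w_i^2.
assert np.allclose(x @ A @ x, np.sum(lam * w**2))
```

The eigenvalues $\lambda_i$ here play the role that the blocks $\Lambda_i$ would play in the matrix-valued version.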

I guess one could use a vectorized version of the argument presented there, but I am not entirely sure how to proceed.
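One possible starting point for the vectorized version, as a sketch: each entry of $Q(X)$ is itself a quadratic form in $\mathrm{vec}(X)$, namely $Q_{jk} = X_j'AX_k = \mathrm{vec}(X)'(E_{jk}\otimes A)\,\mathrm{vec}(X)$ with $E_{jk}=e_je_k'$ (column-major stacking). Substituting $\mathrm{vec}(X) = \Sigma^{1/2}u$ with $u\sim N(0,I)$ expresses every entry as a quadratic form in a single white vector $u$, after which one could try to simultaneously (block-)diagonalize the resulting matrices $\Sigma^{1/2}(E_{jk}\otimes A)\Sigma^{1/2}$.

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 2, 3

G = rng.standard_normal((n * N, n * N))
Sigma = G @ G.T + n * N * np.eye(n * N)  # positive definite
B = rng.standard_normal((n, n))
A = (B + B.T) / 2                        # symmetric

# Symmetric square root of Sigma.
vals, vecs = np.linalg.eigh(Sigma)
S_half = vecs @ np.diag(np.sqrt(vals)) @ vecs.T

u = rng.standard_normal(n * N)      # white noise, Cov(u) = I
x = S_half @ u                      # vec(X) ~ N(0, Sigma)
X = x.reshape((n, N), order="F")    # columns are X_1, ..., X_N
Q = X.T @ A @ X

# Check the entrywise identity
#   Q_{jk} = u' S_half (E_{jk} kron A) S_half u.
for j in range(N):
    for k in range(N):
        E = np.zeros((N, N))
        E[j, k] = 1.0
        M = S_half @ np.kron(E, A) @ S_half
        assert np.isclose(Q[j, k], u @ M @ u)
```

This reduces the question to a simultaneous block-diagonalization problem for the family $\{\Sigma^{1/2}(E_{jk}\otimes A)\Sigma^{1/2}\}$, which is where I get stuck.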