The curse of isotropy: from principal components to principal subspaces
Abstract
This paper raises an important issue about the interpretation of principal component analysis. The curse of isotropy states that a covariance matrix with repeated eigenvalues yields eigenvectors that are only defined up to rotation. In other words, principal components associated with equal eigenvalues show large intersample variability and are arbitrary combinations of potentially more interpretable components. However, empirical eigenvalues are never exactly equal in practice due to sampling errors, so most users overlook the problem. In this paper, we propose to identify datasets that are likely to suffer from the curse of isotropy by introducing a generative Gaussian model with repeated eigenvalues and comparing it to traditional models via the principle of parsimony. This yields an explicit criterion to detect the curse of isotropy in practice. Notably, we argue that in a dataset with 1000 samples, all the eigenvalue pairs with a relative eigengap lower than 21% should be assumed equal. This demonstrates that the curse of isotropy cannot be overlooked. In this context, we propose to transition from fuzzy principal components to much more interpretable principal subspaces. The final methodology (principal subspace analysis) is extremely simple and shows promising results on a variety of datasets from different fields.
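To make the criterion concrete, here is a minimal sketch of how consecutive eigenvalues could be merged into principal subspaces using a relative-eigengap cutoff. This is not the paper's full methodology (which compares generative models via a parsimony principle); the definition of the relative eigengap used here, (l_i - l_{i+1}) / l_i, and the function name `principal_subspaces` are assumptions for illustration. The 0.21 default is the value the abstract quotes for n = 1000 samples.

```python
import numpy as np

def principal_subspaces(X, rel_gap_threshold=0.21):
    """Group principal components whose eigenvalues are statistically indistinguishable.

    rel_gap_threshold: relative-eigengap cutoff below which two consecutive
    eigenvalues are assumed equal (0.21 is the value quoted in the abstract
    for n = 1000 samples; the eigengap definition below is an assumption).
    """
    X = X - X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    # eigh returns ascending eigenvalues; reverse to the usual PCA ordering
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

    # Merge consecutive components whose relative eigengap falls below the cutoff
    blocks, current = [], [0]
    for i in range(len(eigvals) - 1):
        rel_gap = (eigvals[i] - eigvals[i + 1]) / eigvals[i]
        if rel_gap < rel_gap_threshold:
            current.append(i + 1)
        else:
            blocks.append(current)
            current = [i + 1]
    blocks.append(current)

    # Each block spans a principal subspace: its basis is only defined up to rotation
    return [(eigvals[b], eigvecs[:, b]) for b in blocks]

# Usage on synthetic data: components 2 and 3 have nearly equal variances,
# so they should be reported as one 2-dimensional principal subspace.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5)) * np.sqrt([4.0, 2.0, 1.9, 1.0, 0.5])
for vals, basis in principal_subspaces(X):
    print(np.round(vals, 2), basis.shape)
```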
Origin | Files produced by the author(s)
---|---
Licence |