Poster Session 4 · Thursday, December 4, 2025 4:30 PM → 7:30 PM
#2804

Rethinking PCA Through Duality

NeurIPS OpenReview

Abstract

Motivated by the recently established connection between self-attention and (kernel) principal component analysis (PCA), we revisit the fundamentals of PCA. Using the difference-of-convex (DC) framework, we present several novel formulations and provide new theoretical insights. In particular, we show the kernelizability and out-of-sample applicability of a PCA-like family of problems.
Moreover, we uncover that simultaneous iteration, which is connected to the classical QR algorithm, is an instance of the difference-of-convex algorithm (DCA), offering an optimization perspective on this longstanding method.
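As background for the claim above: simultaneous iteration (also called orthogonal or subspace iteration) repeatedly applies the covariance matrix to a block of vectors and re-orthogonalizes, converging to the leading eigenvectors. The sketch below is not the paper's code, only a minimal NumPy illustration of the classical method on a toy covariance matrix; all names and parameters are illustrative.

```python
# Hypothetical sketch: simultaneous iteration recovering the top-k
# principal components of a sample covariance matrix (classical method,
# not the paper's DCA formulation).
import numpy as np

def simultaneous_iteration(C, k, iters=200, seed=0):
    """Orthogonal/simultaneous iteration on a symmetric PSD matrix C."""
    rng = np.random.default_rng(seed)
    Q, _ = np.linalg.qr(rng.standard_normal((C.shape[0], k)))
    for _ in range(iters):
        Q, _ = np.linalg.qr(C @ Q)  # power step, then re-orthogonalize
    return Q

# Toy check against a direct eigendecomposition.
rng = np.random.default_rng(1)
X = rng.standard_normal((500, 5))
X -= X.mean(axis=0)               # center the data
C = X.T @ X / len(X)              # sample covariance
Q = simultaneous_iteration(C, k=2)
w, V = np.linalg.eigh(C)
top2 = V[:, ::-1][:, :2]          # two leading eigenvectors
# The spanned subspaces should agree up to sign/rotation.
subspace_err = np.linalg.norm(Q @ Q.T - top2 @ top2.T)
print(subspace_err)
```

The comparison uses projector matrices (`Q @ Q.T`) because individual eigenvectors are only determined up to sign, while the spanned subspace is well defined.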
Further, we describe new algorithms for PCA and empirically compare them with state-of-the-art methods.
Lastly, we introduce a kernelizable dual formulation for a robust variant of PCA that minimizes the -deviation of the reconstruction errors.