
Differentiable PCA?

Cross Validated · Asked by Yaoshiang on November 26, 2021

Is there a differentiable method for dimensionality reduction that is either based on PCA or has the properties of:

  • Mathematically or algorithmically defined, i.e. not trained like an ML model or t-SNE. EDIT: Another way to state this might be: guaranteed to converge to the optimal solution within a fixed number of steps. (H/T @Sycorax.)
  • Differentiable, so I can place it in the middle of a NN and get gradients at the input based on error at the output (see the sketch after this list).
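
To make these two requirements concrete, here is a minimal sketch of what I mean, assuming PyTorch (torch.linalg.svd supports autograd; the name differentiable_pca is just illustrative). PCA computed via SVD inside the forward pass is closed-form, so there is no training loop, and gradients flow from the projection back to the input:

    import torch

    def differentiable_pca(x: torch.Tensor, k: int) -> torch.Tensor:
        """Project a batch x of shape (n, d) onto its top-k principal components."""
        x_centered = x - x.mean(dim=0, keepdim=True)  # center each feature
        # Rows of Vh are the right singular vectors, i.e. the principal directions.
        _, _, Vh = torch.linalg.svd(x_centered, full_matrices=False)
        return x_centered @ Vh[:k].T                  # shape (n, k)

    x = torch.randn(128, 20, requires_grad=True)
    z = differentiable_pca(x, k=5)
    z.sum().backward()   # gradients flow through the SVD back to x
    print(x.grad.shape)  # torch.Size([128, 20])

One caveat: the SVD gradient is undefined at repeated singular values and ill-conditioned near them, so this can be numerically fragile.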

Of course the trivial answer is yes: max pooling and convolutions are simple forms of dimensionality reduction. Even taking the moments/cumulants of a distribution can be viewed as dimensionality reduction (see the sketch after this paragraph). But I'm looking for something with the power of PCA, something more in line with the spirit of finding linear approximations of the manifold structure of the data.
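
On the moments point, a minimal sketch, again assuming PyTorch (moment_features is a hypothetical helper): a batch of n samples collapses to a few differentiable summary statistics per feature.

    import torch

    def moment_features(x: torch.Tensor) -> torch.Tensor:
        """Reduce samples x of shape (n, d) to per-feature (mean, variance, skewness), shape (3, d)."""
        mu = x.mean(dim=0)
        centered = x - mu
        var = centered.pow(2).mean(dim=0)
        skew = centered.pow(3).mean(dim=0) / var.clamp_min(1e-8).pow(1.5)
        return torch.stack([mu, var, skew])  # every operation here is differentiable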

Autoencoders do perform dimensionality reduction and are differentiable, but they are ML-based and therefore require training. I'm trying to cut down on training in my NN, so AEs are not a candidate.
