
Error rate implying regularity

Asked on MathOverflow by user69642 on November 24, 2021

My question is a bit general/vague.

It is well known that the regularity of certain functions can be measured through the rate of decay of an error quantity arising from an approximation procedure (see, e.g., the famous result of DeVore, Jawerth and Popov characterizing Besov regularity in terms of the rate of decay of the best $n$-term approximation error).
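
For concreteness, one common formulation of their result reads roughly as follows (following the conventions of DeVore's survey on nonlinear approximation): if $\sigma_n(f)_p$ denotes the error of best $n$-term wavelet approximation of $f$ in $L^p(\mathbb{R}^d)$ and $1/\tau = \alpha/d + 1/p$, then

$$ f \in B^{\alpha}_{\tau}\big(L^{\tau}(\mathbb{R}^d)\big) \quad\Longleftrightarrow\quad \sum_{n \ge 1} \frac{\big[\, n^{\alpha/d}\, \sigma_n(f)_p \,\big]^{\tau}}{n} < \infty, $$

so a decay rate $\sigma_n(f)_p = O(n^{-\alpha/d})$ is essentially equivalent to Besov smoothness of order $\alpha$ (more precisely, it implies the corresponding Besov regularity of every order $\beta < \alpha$).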

I would like to know whether there is a standard, well-developed theory that allows one to obtain regularity results from rates of decay of approximation errors.

Are there any general references in this direction?

Many thanks in advance.
