
Universal approximation theorem on limited precision arithmetic

Cross Validated, asked by MrMartin on November 24, 2021

The universal approximation theorem for a single-hidden-layer neural network holds for all activation functions that are bounded and nonconstant, so it is clear that a limited-precision activation function can be used.
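For reference, one common form of the guarantee being referenced (e.g., Hornik, 1991, with a continuous, bounded, nonconstant activation $\sigma$) is that for any continuous $f$ on a compact set $K \subset \mathbb{R}^d$ and any $\varepsilon > 0$ there exist $N$ and weights $w_i, b_i, c_i$ such that

$$\sup_{x \in K}\left|\, f(x) - \sum_{i=1}^{N} c_i\,\sigma\!\left(w_i^\top x + b_i\right) \right| < \varepsilon.$$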

However, does the theorem still hold when the multiplication and summation operations are performed not over the reals, but in limited-precision arithmetic? That is, when the whole thing runs on a physical computer?
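To make "limited precision arithmetic" concrete, here is a minimal sketch in Python (not part of the original question): every multiply, add, and activation output is rounded to a hypothetical fixed-point grid. The step size, the sigmoid activation, and the random weights are illustrative assumptions, not anything specified above.

    # Minimal sketch: a one-hidden-layer network evaluated once with real
    # (float64) arithmetic and once with every intermediate result rounded
    # to a fixed-point grid, to illustrate "limited precision arithmetic".
    import numpy as np

    def quantize(x, step=2.0**-8):
        """Round to the nearest multiple of `step` (hypothetical fixed-point grid)."""
        return np.round(x / step) * step

    def forward_exact(x, W1, b1, W2, b2):
        """One-hidden-layer net with a bounded activation (sigmoid), real arithmetic."""
        h = 1.0 / (1.0 + np.exp(-(W1 @ x + b1)))
        return W2 @ h + b2

    def forward_quantized(x, W1, b1, W2, b2, step=2.0**-8):
        """Same net, but every intermediate result is rounded to the grid."""
        pre = quantize(W1 @ quantize(x, step) + b1, step)
        h = quantize(1.0 / (1.0 + np.exp(-pre)), step)
        return quantize(W2 @ h + b2, step)

    rng = np.random.default_rng(0)
    W1, b1 = rng.normal(size=(16, 1)), rng.normal(size=16)
    W2, b2 = rng.normal(size=(1, 16)), rng.normal(size=1)

    xs = np.linspace(-3, 3, 1000)
    err = max(abs(forward_exact(np.array([x]), W1, b1, W2, b2)[0]
                  - forward_quantized(np.array([x]), W1, b1, W2, b2)[0]) for x in xs)
    print(f"max |exact - quantized| over the grid: {err:.4f}")

The printed value is simply the worst-case deviation that the rounding introduces for this particular random network over a bounded input range; the question is whether the approximation guarantee above survives when all arithmetic is of this rounded kind.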

This has practical implications, such as whether we can expect a physical implementation of a neural net to retain the same universality as its theoretical counterpart.
