When would bias regularisation and activation regularisation be necessary?

Artificial Intelligence Asked on August 24, 2021

For Keras on TensorFlow, a layer class constructor comes with these:

  • kernel_regularizer=…
  • bias_regularizer=…
  • activity_regularizer=…

For example, Dense layer:
https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dense#arguments_1

The first one, kernel_regularizer, is easy to understand: it regularises the weights, keeping them small so the model does not overfit the training data.

Is kernel_regularizer enough? When should I use bias_regularizer and activity_regularizer too?
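For context, here is a minimal sketch (not from the original post) of where the three arguments go on a Keras Dense layer; the L2 factors of 1e-4 are arbitrary choices for illustration:

```python
import tensorflow as tf

# Each regularizer adds its own penalty term to the model's loss.
layer = tf.keras.layers.Dense(
    64,
    activation="relu",
    kernel_regularizer=tf.keras.regularizers.L2(1e-4),    # penalises large weights
    bias_regularizer=tf.keras.regularizers.L2(1e-4),      # penalises large biases
    activity_regularizer=tf.keras.regularizers.L2(1e-4),  # penalises large outputs
)
model = tf.keras.Sequential([tf.keras.Input(shape=(10,)), layer])

# After a forward pass, the penalties appear as extra terms in model.losses,
# one per regularizer, and are added to the training loss automatically.
_ = model(tf.ones((2, 10)))
print(len(model.losses))
```

The penalties are summed into the objective during training, so each one pulls its respective quantity (weights, biases, or activations) toward zero.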

One Answer

Regularizers are used as a means to combat overfitting. They essentially add a penalty to the cost function that discourages quantities from becoming too large. I have primarily used kernel regularizers. First I try to control overfitting with dropout layers; if that does not do the job, or leads to poor training accuracy, I try the kernel regularizer, and I usually stop at that point. Activity regularization would be my next option, to prevent the layer outputs from becoming too large, but I suspect weight regularization can achieve pretty much the same result.
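To make the distinction concrete, here is a toy illustration (my own, not from the answer) of what each L2 penalty computes for a single dense layer, using plain Python; the weights, biases, inputs, and the 0.01 penalty factor are all made-up numbers:

```python
# Hypothetical layer parameters and input (2 inputs, 2 units).
weights = [[0.5, -1.0], [2.0, 0.1]]
biases = [0.3, -0.2]
inputs = [1.0, 2.0]

def l2(values, factor=0.01):
    # L2 penalty: factor * sum of squared values.
    return factor * sum(v * v for v in values)

# Layer output: out_j = sum_i inputs[i] * weights[i][j] + biases[j]
outputs = [
    sum(x * w for x, w in zip(inputs, col)) + b
    for col, b in zip(zip(*weights), biases)
]

flat_weights = [w for row in weights for w in row]
kernel_penalty = l2(flat_weights)  # what kernel_regularizer penalises
bias_penalty = l2(biases)          # what bias_regularizer penalises
activity_penalty = l2(outputs)     # what activity_regularizer penalises

# All active penalties are summed into the training loss.
total_penalty = kernel_penalty + bias_penalty + activity_penalty
```

Note that the activity penalty depends on the inputs as well as the parameters, which is why it constrains the layer's outputs rather than the weights directly.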

Correct answer by Gerry P on August 24, 2021
