Input Normalization for Transfer Learning

Data Science Asked by user75228 on December 3, 2020

I am training a deep neural net with input features that are physical in nature (e.g. temperature, precipitation, etc.), and I want to perform some kind of transfer learning where I train on multiple instances and then see how the model performs on an entirely different set of inputs. How do I make sure that the inputs being normalized in each instance don't conflict with one another?

For example, a mean temperature of 0 degrees with a standard deviation of 10 will look identical to a mean temperature of 80 degrees with a standard deviation of 10 after each is normalized to a mean of 0 and a standard deviation of 1.

One Answer

You should select a single preprocessing scheme and keep it constant for all experiments!

In your case, you want to scale your input features to $0$ mean and a standard deviation equal to $1$:

$$x_{scaled} = \frac{x - \mu}{\sigma}$$

During the initial training, compute the mean $\mu$ and standard deviation $\sigma$ of each feature and store them. Then use the same values to scale the data in all subsequent experiments.
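A minimal sketch of this idea in Python, using scikit-learn's StandardScaler (the feature values and variable names below are only illustrative): fit the scaler once on the source-domain training data, then reuse the fitted parameters for every other dataset instead of re-fitting.

import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical source-domain data: rows are samples, columns are
# physical features (e.g. temperature in degrees, precipitation in mm).
X_source = np.array([
    [80.0, 2.1],
    [75.0, 0.0],
    [90.0, 5.4],
])

# Hypothetical target-domain data from a much colder climate.
X_target = np.array([
    [0.0, 10.0],
    [5.0, 12.3],
])

# Fit the scaler ONCE on the source-domain training data and keep it.
scaler = StandardScaler().fit(X_source)

# Reuse the same fitted parameters (scaler.mean_, scaler.scale_) for
# every subsequent experiment; do NOT call fit() again on the new data.
X_source_scaled = scaler.transform(X_source)
X_target_scaled = scaler.transform(X_target)

Because $\mu$ and $\sigma$ come from the source domain only, a 0-degree mean temperature no longer maps to the same scaled values as an 80-degree mean temperature, so the two sets of inputs remain distinguishable to the network.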

Answered by Djib2011 on December 3, 2020
