
What is the difference between the Standard Normal Distribution and Mean Normalization approaches to feature scaling?

Asked on Data Science, December 4, 2020

The tag feature-scaling seems to convey that one of the scaling methods is Standard Normal Distribution. Further, I read an Answer on this site saying that Mean Normalization is a form of feature scaling.

What is the difference between these two approaches to scaling?

Note: I think the statistics and mathematics behind the two forms of normalization do differ.

One Answer

The terms standardization and normalization are often used interchangeably. However, strictly speaking, they refer to distinct feature transformations.

Normalization

Normalization, also called feature scaling, usually means rescaling the data to the range between 0 and 1. There are many approaches that can be used to achieve this. One common way is min-max scaling:

$x' = \frac{x - x_{\min}}{x_{\max} - x_{\min}}$
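
As a minimal sketch of this formula (the function name min_max_scale and the sample values are just for illustration; scikit-learn's MinMaxScaler does the same thing):

```python
import numpy as np

def min_max_scale(x):
    """Rescale a 1-D feature to the [0, 1] range using min-max scaling."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())

feature = np.array([3.0, 7.0, 10.0, 15.0])
print(min_max_scale(feature))  # [0.         0.33333333 0.58333333 1.        ]
```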

Standardization

Standardization transforms the feature to have a mean of 0 and a standard deviation of 1. This is also called z-scoring and can be achieved by

$x_i' = \frac{x_i - \bar{x}}{s}$

where $\bar{x}$ is the mean of the feature and $s$ is the standard deviation of the feature.
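
A minimal sketch of standardization, assuming the sample standard deviation (ddof=1); the function name standardize and the sample values are only illustrative, and scikit-learn's StandardScaler offers the same transformation:

```python
import numpy as np

def standardize(x):
    """Transform a 1-D feature to zero mean and unit standard deviation (z-score)."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std(ddof=1)  # ddof=1 -> sample standard deviation

feature = np.array([3.0, 7.0, 10.0, 15.0])
z = standardize(feature)
print(z.mean().round(6), z.std(ddof=1))  # 0.0 and 1.0
```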

Answered by JahKnows on December 4, 2020
