
Term for the error in machine learning as a direct result of incorrectly labelled data?

Asked on Cross Validated, November 9, 2021

Is there a term for the inaccuracy that results from an ML model being trained on imperfectly labelled data? For example, if humans label a training set, they will occasionally make mistakes. In that case, even a theoretically perfect model trained on the dataset would still show some inaccuracy, purely because of those labelling errors. Note that human error is not the only possible cause of inaccurate labels; it is just a simple example.

Example use of the term: “the data set was labelled by humans, so we should anticipate some observations being incorrectly labelled, and hence a model that performs with some degree of error”
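For illustration only (not part of the original question): the sketch below simulates the situation the question describes, where a fraction of labels has been flipped by the labellers. The dataset, the 10% flip rate, and the "perfect" model that always predicts the true class are made-up assumptions, chosen just to show that even such a model cannot score better than the share of correctly labelled observations.

```python
# Minimal sketch: label noise caps the accuracy a model can appear to achieve.
# Assumptions: synthetic binary labels, a 10% labelling-error rate, and a
# hypothetical "perfect" model that always predicts the true class.
import numpy as np

rng = np.random.default_rng(0)

n = 100_000
flip_rate = 0.10  # assumed fraction of human labelling mistakes

true_labels = rng.integers(0, 2, size=n)            # ground truth
flips = rng.random(n) < flip_rate                   # which observations were mislabelled
observed_labels = np.where(flips, 1 - true_labels, true_labels)

# A theoretically perfect model: it always predicts the true class.
predictions = true_labels

accuracy_vs_observed = np.mean(predictions == observed_labels)
print(f"Accuracy measured against the noisy labels: {accuracy_vs_observed:.3f}")
# Roughly 0.90: scored on mislabelled data, accuracy cannot exceed 1 - flip_rate.
```

Measured against the noisy labels, the best achievable accuracy is roughly 1 − flip_rate, which is the kind of irreducible error the question is asking a name for.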
