
Correcting conditional and marginal distributions in transfer learning

Data Science · Asked on June 27, 2021

I understand that in transfer learning, the target and the source data can come from different domain distributions. In such cases, the authors of many papers suggest bringing the marginal and conditional distributions of the target and the source closer, i.e., minimizing the difference between them. Can someone give an intuitive explanation of this? I am unable to understand what exactly an author means by "bringing the distributions closer". Explanations using visual representations would be helpful.

One Answer

Bringing the distributions closer means modifying the source data, usually by assigning weights to its instances or to its features (or to both, in hybrid algorithms), so that the weighted source data looks more similar to the target data. Aligning the marginals means making the source's P(X) resemble the target's P(X); aligning the conditionals means making the source's P(Y|X) resemble the target's P(Y|X). If we achieve that, we can train a model that also uses the source data, which is usually larger or better annotated than the target data. The sketch below illustrates the instance-weighting idea.
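As a concrete (hypothetical) illustration of instance weighting for aligning the marginal distributions, here is one standard approach: train a domain classifier to distinguish source from target samples, and convert its probabilities into importance weights that up-weight the source instances that look most "target-like". All variable names and the toy data are my own assumptions, not something from a specific paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: the source and target marginals P(X) differ (shifted means).
X_source = rng.normal(loc=0.0, scale=1.0, size=(500, 2))
y_source = (X_source[:, 0] + X_source[:, 1] > 0).astype(int)
X_target = rng.normal(loc=1.0, scale=1.0, size=(300, 2))  # unlabeled

# 1. Train a domain classifier: label 0 = source, 1 = target.
X_domain = np.vstack([X_source, X_target])
d_domain = np.concatenate([np.zeros(len(X_source)), np.ones(len(X_target))])
domain_clf = LogisticRegression().fit(X_domain, d_domain)

# 2. Importance weights: P(target | x) / P(source | x) approximates the
#    density ratio p_target(x) / p_source(x), up to a constant that
#    corrects for the source/target sample-size imbalance.
p_target = np.clip(domain_clf.predict_proba(X_source)[:, 1], 1e-6, 1 - 1e-6)
weights = p_target / (1.0 - p_target)
weights *= len(X_source) / len(X_target)

# 3. Train the task model on the labeled source data, weighted so that
#    the effective source distribution resembles the target distribution.
task_clf = LogisticRegression().fit(X_source, y_source, sample_weight=weights)
```

After reweighting, source points that fall in regions where the target data is dense get large weights and dominate training, which is one way of "bringing the marginal distributions closer". Aligning the conditional distributions typically needs (pseudo-)labels on the target side and is not shown here.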

Answered by Christos Karatsalos on June 27, 2021
