
Choosing the size of the network for Neural Collaborative Filtering (NCF)?

Data Science Asked by bkaankuguoglu on May 11, 2021

I've been working on Neural Collaborative Filtering (NCF) recently to build a recommender system. After some hyperparameter tuning over various embedding and dense layer sizes, from 16 all the way up to 2048, I realized that less complex models tend to work better and usually do a better job of avoiding overfitting. That in itself isn't so surprising, I suppose. For concreteness, the kind of search space I mean looks roughly like the sketch below (the exact values and names are just illustrative):
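
```python
# Illustrative search space only -- the parameter names and values are
# hypothetical, not the exact grid I ran.
search_space = {
    "embedding_dim": [16, 32, 64, 128, 256, 512, 1024, 2048],
    "mlp_layer_sizes": [
        [16, 8],
        [64, 32, 16],
        [256, 128, 64],
        [2048, 1024, 512],
    ],
}
```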

I know it's generally very contextual and data-dependent: the less complex the relationships in the data, the higher the chance of overfitting when training high-capacity networks. But I can't help wondering whether, beyond halving the size of each successive dense layer, there is any rule of thumb for sizing such networks. How do you usually go about choosing the network size for NCF models? To make the "halving" pattern concrete, here is a minimal sketch of the MLP tower I have in mind, assuming PyTorch; the class name, embedding size, and layer count are just for illustration.
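
```python
# Minimal sketch of the MLP part of an NCF-style model (PyTorch assumed).
# The "halving" rule of thumb: each hidden layer is half the width of the
# previous one, e.g. 128 -> 64 -> 32 -> 16.
import torch
import torch.nn as nn

class NCFMLP(nn.Module):
    def __init__(self, n_users, n_items, embedding_dim=64, n_layers=3):
        super().__init__()
        self.user_emb = nn.Embedding(n_users, embedding_dim)
        self.item_emb = nn.Embedding(n_items, embedding_dim)
        layers = []
        in_size = embedding_dim * 2          # concatenated user + item embeddings
        for _ in range(n_layers):
            out_size = in_size // 2          # halve the width at every layer
            layers += [nn.Linear(in_size, out_size), nn.ReLU()]
            in_size = out_size
        self.mlp = nn.Sequential(*layers)
        self.out = nn.Linear(in_size, 1)     # predicted interaction score

    def forward(self, user_ids, item_ids):
        x = torch.cat([self.user_emb(user_ids), self.item_emb(item_ids)], dim=-1)
        return torch.sigmoid(self.out(self.mlp(x)))

# Example instantiation with made-up user/item counts.
model = NCFMLP(n_users=1000, n_items=2000, embedding_dim=64)
```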
