Cross Validated Asked by White on January 1, 2022
My question boils down to this: which optimizer should I use to train my neural network?
I understand that the answer depends on the problem.
However, the Adam optimizer, for instance, seems to generally outperform plain SGD, so perhaps there is some rationale behind the choice of optimizer (and its learning rate)?
It may not be mathematically rigorous, but perhaps some of you have come across a standard method for choosing an optimizer?
There is no exact science behind which optimizer to use for a given model. It mostly comes down to heuristics and the type of model in use.
Adam is not generally better than SGD: this blog post reviews an experiment in which SGD turned out to be the better optimizer: https://shaoanlu.wordpress.com/2017/05/29/sgd-all-which-one-is-the-best-optimizer-dogs-vs-cats-toy-experiment/
Each optimizer has its pros and cons, and no single optimizer works best in all cases.
This link compares the different gradient descent optimizers: https://ruder.io/optimizing-gradient-descent/
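To make the comparison concrete, here is a minimal NumPy sketch (my own illustration, not from the linked posts) that runs plain SGD and Adam on the toy one-dimensional loss f(w) = (w - 3)^2. Both reach the minimum at w = 3; the point is that they take different paths there, and neither update rule is "better" in any universal sense:

```python
import numpy as np

def grad(w):
    # Gradient of the toy loss f(w) = (w - 3)^2
    return 2.0 * (w - 3.0)

def sgd(w, lr=0.1, steps=200):
    # Vanilla gradient descent: step in the direction of -gradient
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def adam(w, lr=0.1, steps=200, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: per-parameter step sizes from running moment estimates
    m, v = 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad(w)
        m = b1 * m + (1 - b1) * g        # first-moment (mean) estimate
        v = b2 * v + (1 - b2) * g * g    # second-moment (uncentered variance) estimate
        m_hat = m / (1 - b1 ** t)        # bias correction for the warm-up phase
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

w_sgd = sgd(0.0)
w_adam = adam(0.0)
print(w_sgd, w_adam)  # both should approach 3.0
```

On a real network the picture is less clean: Adam's adaptive steps often speed up early training, while well-tuned SGD (with momentum and a learning-rate schedule) sometimes generalizes better, which is exactly the trade-off the linked experiment explores.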
Answered by Vivek on January 1, 2022