
Is random seed a hyper-parameter to tune in training deep neural network?

Cross Validated Asked on November 6, 2021

When building deep neural networks, there are many random components in each training run. On one hand, it feels odd to "tune" the random seed. But in my experience, some random seeds just work better than others …

So, is the random seed a hyper-parameter to tune when training deep neural networks?

One Answer

If your training shows large changes in performance across random seeds, then it is unstable, which is an undesirable trait. Testing different random seeds can therefore be useful as a check on stability. But picking the model from whichever seed happens to do best on the validation set does not guarantee better performance on unseen data: you are selecting on validation noise rather than on a genuine improvement.
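One way to run such a stability check is to fix every hyper-parameter and vary only the seed, then look at the spread of validation scores. A minimal sketch, using scikit-learn's `MLPClassifier` as a stand-in for a deep network (the synthetic dataset and hyper-parameters here are illustrative assumptions, not from the original answer):

```python
# Estimate run-to-run variability caused solely by the random seed.
from statistics import mean, stdev

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic binary classification data (illustrative only).
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.25, random_state=0
)

scores = []
for seed in range(5):  # vary only the seed; everything else is fixed
    model = MLPClassifier(
        hidden_layer_sizes=(32,), max_iter=300, random_state=seed
    )
    model.fit(X_train, y_train)
    scores.append(model.score(X_val, y_val))

# A large standard deviation relative to the mean signals an unstable
# training setup; a small one suggests the seed barely matters.
print(f"mean={mean(scores):.3f}  stdev={stdev(scores):.3f}")
```

If the spread is large, the remedy is usually to stabilize training (e.g. adjust the learning rate, regularization, or architecture), not to lock in the single best-performing seed.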

Answered by Jon Nordby on November 6, 2021
