Can a large number of epochs or a smaller batch size compensate for a small data size when training LSTMs?

Cross Validated Asked by tjt on January 23, 2021

I have about 40 time series (40 products) of weekly sales spanning 3 years (156 data points per series), so about 6,240 data points in total. I want to train a stateful or stateless LSTM to predict sales (assuming yearly seasonality), but this small dataset might be insufficient.

If I want to go with an LSTM, can I compensate for the small data size by training for a larger number of epochs or by using a smaller batch size?
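For illustration, here is one way the data described above could be arranged into LSTM training windows; the 52-week window length is an assumption chosen to cover one yearly cycle, not something fixed by the question:

```python
import numpy as np

# Illustrative sketch: turn 40 weekly series of length 156 into
# overlapping (52-week input, next-week target) pairs for an LSTM.
# The 52-week window is an assumption chosen to cover one yearly cycle.
n_series, n_weeks, window = 40, 156, 52
sales = np.random.rand(n_series, n_weeks)  # placeholder for the real sales data

X, y = [], []
for series in sales:
    for t in range(n_weeks - window):
        X.append(series[t:t + window])   # the past 52 weeks as input
        y.append(series[t + window])     # the following week as target

X = np.array(X)[..., np.newaxis]  # shape (samples, 52, 1) expected by an LSTM
y = np.array(y)
print(X.shape, y.shape)  # about 40 * (156 - 52) = 4160 windows
```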

One Answer

You can't compensate in this way because training for more epochs doesn't give you more data -- the sample size stays the same.

In a certain sense, the opposite is true -- early stopping trains a network for only a limited number of iterations and halts before the parameters move far enough to overfit. This can be shown to be approximately equivalent to $L^2$ regularization.
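Concretely, under a quadratic approximation of the loss around its minimum, running gradient descent with learning rate $\epsilon$ for $\tau$ steps behaves roughly like an $L^2$ penalty with coefficient
$$\lambda \approx \frac{1}{\tau \epsilon},$$
so fewer iterations correspond to stronger regularization. This is a sketch of the standard argument, not something specific to LSTMs.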

If you've got a small amount of data, the best thing to do is collect more data.

If that's not possible, then your best bet is to use simpler models, e.g. regression strategies.
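As an illustration of what such a simpler baseline might look like for weekly sales with yearly seasonality -- the Fourier features, the single lag, and the ridge penalty below are illustrative assumptions, not a recommendation from this answer:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Illustrative baseline for one weekly series: ridge regression on
# Fourier terms (yearly seasonality, period 52) plus last week's sales.
# Feature choices and the penalty strength are assumptions, not advice.
series = np.random.rand(156)      # placeholder for one product's sales
weeks = np.arange(len(series))

def make_features(t, last_week):
    return np.column_stack([
        np.sin(2 * np.pi * t / 52),  # yearly seasonality
        np.cos(2 * np.pi * t / 52),
        last_week,                   # lag-1 sales
    ])

X = make_features(weeks[1:], series[:-1])
y = series[1:]

model = Ridge(alpha=1.0).fit(X[:-10], y[:-10])  # hold out the last 10 weeks
print(model.score(X[-10:], y[-10:]))            # rough out-of-sample check
```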

Regularization can also help avoid overfitting.
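For example, in a Keras LSTM this could look like the sketch below; the layer size, $L^2$ penalty strength, dropout rate, and early-stopping patience are placeholder values, not tuned settings:

```python
import numpy as np
from tensorflow import keras

# Illustrative regularized LSTM; sizes, penalties, and dropout rate are
# placeholder values, not tuned settings.
X = np.random.rand(200, 52, 1)  # (samples, timesteps, features) placeholder
y = np.random.rand(200)

model = keras.Sequential([
    keras.layers.Input(shape=(52, 1)),
    keras.layers.LSTM(
        16,
        kernel_regularizer=keras.regularizers.l2(1e-3),  # L2 weight penalty
        recurrent_dropout=0.2,                           # dropout on recurrent connections
    ),
    keras.layers.Dense(1),
])
model.compile(optimizer="adam", loss="mse")

# Early stopping halts training once validation loss stops improving,
# which also acts as a regularizer.
model.fit(
    X, y,
    validation_split=0.2,
    epochs=200,
    verbose=0,
    callbacks=[keras.callbacks.EarlyStopping(patience=10, restore_best_weights=True)],
)
```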

Correct answer by Sycorax on January 23, 2021
