
My validation set losses are static

Asked on Data Science, March 17, 2021

My dataset looks something like this:

    0   1   2   3   4   5   6   7   8   9   10  11  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30  31  32  33
0   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 1.0 2.0 2.0 0.0 0.0 0.0 1.0 1.0 1.0 3.0 1.0 0.0
1   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0
2   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 3.0 2.0 0.0 1.0 3.0 1.0
3   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 0.0
4   0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0

And my data varies like crazy, as shown by dataset.describe():

    0   1   2   3   4   5   6   7   8   9   10  11  12  13  14  15  16  17  18  19  20  21  22  23  24  25  26  27  28  29  30  31  32  33
count   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000   214200.000000
mean    0.096228    0.103469    0.147521    0.096657    0.108880    0.151751    0.139869    0.150999    0.205892    0.185588    0.207502    0.318478    0.195868    0.200019    0.229104    0.172759    0.199748    0.208492    0.206004    0.247241    0.225037    0.248922    0.323800    0.485752    0.315481    0.254888    0.256083    0.275196    0.263193    0.241839    0.244188    0.278137    0.274622    0.293413
std 1.031383    1.130085    1.815386    0.952851    2.026803    3.965382    3.190181    3.295184    4.714035    3.830216    4.102940    5.567510    3.112597    3.127541    3.482804    2.464995    2.763012    3.120215    2.655728    2.833560    2.998698    3.040342    4.229684    5.561023    4.079211    1.879339    1.723709    4.119686    3.828952    2.286223    2.143116    2.149646    2.498978    5.550976
min -2.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -2.000000   -1.000000   -1.000000   -4.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -2.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000   -1.000000
25% 0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000
50% 0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000
75% 0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000    0.000000
max 169.000000  117.000000  259.000000  151.000000  504.000000  766.000000  799.000000  820.000000  950.000000  978.000000  989.000000  1305.000000 899.000000  941.000000  776.000000  597.000000  602.000000  771.000000  563.000000  591.000000  639.000000  634.000000  772.000000  1209.000000 1000.000000 257.000000  174.000000  813.000000  742.000000  444.000000  482.000000  436.000000  473.000000  2253.000000

And we can see how far apart the max values are from the means.
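A quick way to quantify that gap (just a rough sketch, assuming the data lives in the same DataFrame `dataset` used for describe() above):

    # How many standard deviations above the column mean each column's max sits
    gap_in_stds = (dataset.max() - dataset.mean()) / dataset.std()
    print(gap_in_stds.sort_values(ascending=False))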

My model is:

    import tensorflow as tf
    from sklearn.model_selection import train_test_split
    from tensorflow.keras.models import Sequential
    from tensorflow.keras.layers import LSTM, Dense
    from tensorflow.keras.regularizers import l2

    # Hold out 15% of the training data for validation
    X_train, X_val, y_train, y_val = train_test_split(X_train, y_train, test_size=0.15)

    my_model = Sequential()
    my_model.add(LSTM(units=64, input_shape=(33, 1), return_sequences=True, kernel_regularizer=l2(0.00001), bias_regularizer=l2(0.00001), recurrent_regularizer=l2(0.00001)))
    my_model.add(LSTM(units=128, return_sequences=True, kernel_regularizer=l2(0.00001), bias_regularizer=l2(0.00001), recurrent_regularizer=l2(0.00001)))
    my_model.add(Dense(100, kernel_regularizer=l2(0.00001), bias_regularizer=l2(0.00001)))
    my_model.add(Dense(10, kernel_regularizer=l2(0.00001), bias_regularizer=l2(0.00001)))
    my_model.add(Dense(1))

    opt = tf.keras.optimizers.Adam(learning_rate=0.0002)  # changed from 0.001
    my_model.compile(loss='mse', optimizer=opt, metrics=['mean_squared_error'])
    my_model.summary()
    history = my_model.fit(X_train, y_train, batch_size=512, epochs=100, validation_data=(X_val, y_val), shuffle=True)  # tried removing shuffle, since it seems to undo the validation split
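For reference, since both LSTM layers keep return_sequences=True, the final Dense(1) is applied per timestep, so the model emits one value per timestep rather than one per sample. A quick way to see the shapes involved (just a sketch, using the arrays from the fit call above):

    # With return_sequences=True on the last LSTM, the output is (batch, timesteps, 1)
    print(my_model.output_shape)           # (None, 33, 1)
    print(X_train.shape, y_train.shape)    # shapes of the arrays passed to fit()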

But no matter what I tweak, my validation loss settles on a low constant value and sticks to it. I have tried changing the learning rate, the architecture, and just about everything else, and the loss curve always looks roughly like this:

[Plot: training and validation loss over epochs; the validation loss quickly flattens at a low value.]
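To put that "low constant value" in context, it can be compared against a naive constant-prediction baseline (a rough sketch; it assumes y_train and y_val are plain NumPy arrays and that the target is as zero-heavy as the features above):

    import numpy as np

    # MSE of always predicting the mean of the training targets
    baseline_mse = np.mean((np.asarray(y_val) - np.asarray(y_train).mean()) ** 2)
    print('constant-prediction baseline MSE:', baseline_mse)
    print('model validation MSE:', history.history['val_mean_squared_error'][-1])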
