Train NN for balanced accuracy

Data Science, asked by karu on February 10, 2021

I'm trying to get an NN (a CNN, to be precise) to predict labels on hugely imbalanced data, i.e. some classes have far more samples than others. I want to optimize the balanced accuracy, so I pass class weights to the model, which I compute like this:

import numpy as np
from sklearn.utils.class_weight import compute_class_weight

classes = np.unique(y_ord)  # y_ord holds the integer-encoded labels
class_weight = compute_class_weight('balanced', classes=classes, y=y_ord)
class_weight = {k: v for k, v in zip(classes, class_weight)}  # Keras expects a dict
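The weights then go into training through the class_weight argument of model.fit, roughly like this (the architecture, input shape, and x_train below are simplified placeholders, not my actual setup):

import numpy as np
import tensorflow as tf

# x_train is a stand-in for my array of training images
x_train = np.random.rand(len(y_ord), 28, 28, 1).astype('float32')

# Placeholder CNN; the real architecture is omitted for brevity
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation='relu', input_shape=(28, 28, 1)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(len(classes), activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# class_weight scales each sample's loss contribution by its class weight
model.fit(x_train, y_ord, epochs=10, batch_size=32, class_weight=class_weight)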

Now I train the model with sparse_categorical_crossentropy as the loss. However, while the loss decreases, the accuracy does not improve. I've read a possible reason for this in several blog posts: the loss is continuous, while the per-sample accuracy is either 0 or 1. However, I haven't found a way to deal with this. The problem does not occur when I train without the weights, so I assume it has something to do with them.
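For reference, the balanced accuracy I actually care about can be checked after each epoch with a small callback using sklearn's balanced_accuracy_score (a sketch; x_val and y_val stand in for a validation split):

import numpy as np
import tensorflow as tf
from sklearn.metrics import balanced_accuracy_score

class BalancedAccuracyCallback(tf.keras.callbacks.Callback):
    # Computes balanced accuracy on a held-out set after every epoch
    def __init__(self, x_val, y_val):
        super().__init__()
        self.x_val = x_val
        self.y_val = y_val

    def on_epoch_end(self, epoch, logs=None):
        probs = self.model.predict(self.x_val, verbose=0)
        y_pred = np.argmax(probs, axis=1)
        print(f'epoch {epoch + 1}: balanced accuracy = '
              f'{balanced_accuracy_score(self.y_val, y_pred):.3f}')

# used as: model.fit(..., callbacks=[BalancedAccuracyCallback(x_val, y_val)])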

Thank you in advance!
