
Backpropagation and gradient descent

Data Science · Asked by Luka Milivojevic on January 2, 2021

I just want to clear up one doubt: we use gradient descent to optimize the weights and biases of the neural network, and we use backpropagation for the step that requires calculating the partial derivatives of the loss function. Or am I misinterpreting something?

One Answer

Yes, you are correct. Gradient descent (or one of its many variants) is the mechanism that moves the parameters toward a local minimum of the loss surface, taking steps scaled by the learning rate. Backpropagation is the procedure that computes the gradient of the loss function with respect to the network's weights and biases, i.e. the partial derivatives that each gradient descent update needs.

Correct answer by Oliver Foster on January 2, 2021
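
For concreteness, here is a minimal NumPy sketch of a single training step that separates the two roles: backpropagation computes the gradients, and gradient descent applies them. The network shape, variable names, and learning rate are illustrative assumptions, not taken from the answer above.

```python
# Minimal sketch (assumed toy setup): one training step of a 3 -> 5 -> 1 network.
# Backpropagation computes the gradients; gradient descent applies the update.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: 4 samples, 3 features, scalar regression target
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Parameters and learning rate (illustrative values)
W1, b1 = rng.normal(size=(3, 5)), np.zeros(5)
W2, b2 = rng.normal(size=(5, 1)), np.zeros(1)
lr = 0.1

# Forward pass
z1 = X @ W1 + b1
h = np.tanh(z1)
y_hat = h @ W2 + b2
loss = np.mean((y_hat - y) ** 2)

# Backpropagation: chain rule from the loss back to each parameter
d_yhat = 2 * (y_hat - y) / len(X)     # dL/dy_hat for mean squared error
dW2 = h.T @ d_yhat                    # dL/dW2
db2 = d_yhat.sum(axis=0)              # dL/db2
dh = d_yhat @ W2.T                    # dL/dh
dz1 = dh * (1 - np.tanh(z1) ** 2)     # dL/dz1 (tanh derivative)
dW1 = X.T @ dz1                       # dL/dW1
db1 = dz1.sum(axis=0)                 # dL/db1

# Gradient descent: move each parameter against its gradient
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2

print(f"loss before step: {loss:.4f}")
```

Running this step in a loop would repeat the same pattern: forward pass, backpropagation to get the gradients, then a gradient descent update scaled by the learning rate.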
