Backpropagation and gradient descent

I just want to clear up one doubt: we use gradient descent to optimize the weights and biases of the neural network, and we use backpropagation for the step that requires calculating the partial derivatives of the loss function. Or am I misinterpreting something?

Data Science · Asked by Luka Milivojevic on January 2, 2021

One Answer

Yes, you are correct. Gradient descent (or one of its variants) is the mechanism by which you find a local minimum of the loss surface, stepping the parameters against the gradient scaled by a learning rate. Backpropagation is the procedure that computes that gradient: the partial derivatives of the loss function with respect to the network's weights and biases.

Correct answer by Oliver Foster on January 2, 2021
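
To make the division of labour concrete, here is a minimal sketch (not part of the original answer) of a one-hidden-layer network trained on a made-up regression problem: backpropagation applies the chain rule to get the gradients, and gradient descent then uses those gradients and a learning rate to update the weights and biases. The toy data, layer sizes, and learning rate are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (assumption for illustration):
# each target is the sum of its three inputs plus a little noise.
X = rng.normal(size=(100, 3))
y = X.sum(axis=1, keepdims=True) + 0.1 * rng.normal(size=(100, 1))

# Parameters of a 3 -> 4 -> 1 network with tanh hidden units.
W1, b1 = 0.1 * rng.normal(size=(3, 4)), np.zeros((1, 4))
W2, b2 = 0.1 * rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 0.05  # learning rate used by the gradient-descent step

for step in range(500):
    # Forward pass: compute predictions and the mean squared error loss.
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)
    y_hat = a1 @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backpropagation: chain rule gives dLoss/dW and dLoss/db for each layer.
    n = X.shape[0]
    d_yhat = 2 * (y_hat - y) / n                 # dLoss/dy_hat
    dW2 = a1.T @ d_yhat                          # dLoss/dW2
    db2 = d_yhat.sum(axis=0, keepdims=True)      # dLoss/db2
    d_a1 = d_yhat @ W2.T                         # error propagated back through layer 2
    d_z1 = d_a1 * (1 - a1 ** 2)                  # multiply by tanh'(z1)
    dW1 = X.T @ d_z1                             # dLoss/dW1
    db1 = d_z1.sum(axis=0, keepdims=True)        # dLoss/db1

    # Gradient descent: step each parameter against its gradient.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final loss: {loss:.4f}")
```

Swapping the hand-written update for an optimizer such as SGD with momentum or Adam only changes the gradient-descent step; the backpropagation part that produces the gradients stays the same.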

Related Questions

How to handle unseen labels in test data?
1  Asked on August 22, 2020 by meysam

In a list, find the numbers where they are bigger than the next numbers
1  Asked on August 21, 2020 by ellen-sheldon

Neural networks: how to think of it?
0  Asked on August 18, 2020 by luca-di-mauro

MLPRegressor Output Range
2  Asked on August 18, 2020 by kam
