
How does gradient descent know which weights to adjust?

Data Science: asked by molo32 on January 6, 2021

I was reading about gradient descent.
How does gradient descent know which weights to adjust?
Does it adjust all of the network's weights at the same time?

Does each weight have an associated error?

2 Answers

In general (or at least in the basic implementation of gradient descent), on each iteration you apply the update rule to every weight, using the partial derivative of the loss function with respect to that weight, as follows:

$$w_i \leftarrow w_i - \eta \, \frac{\partial L}{\partial w_i}$$

where $\eta$ is the learning rate and $L$ is the loss function.
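A minimal sketch of this rule (not part of the original answer; it assumes a linear model with a mean-squared-error loss, and the data and learning rate are made up for illustration). It shows that every weight has its own partial derivative and that all weights are updated together in each iteration:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # 100 samples, 3 input features
true_w = np.array([2.0, -1.0, 0.5])    # weights we hope to recover
y = X @ true_w + rng.normal(scale=0.1, size=100)

w = np.zeros(3)                        # one weight per feature
eta = 0.1                              # learning rate

for _ in range(200):
    y_hat = X @ w                      # forward pass
    error = y_hat - y
    # Partial derivative of the mean squared error w.r.t. each weight:
    # dL/dw_i = (2/N) * sum_j error_j * x_{j,i}
    grad = 2 * X.T @ error / len(y)    # one gradient component per weight
    w = w - eta * grad                 # update all weights simultaneously

print(w)                               # close to [2.0, -1.0, 0.5]
```

The key point is that `grad` has the same shape as `w`: the gradient assigns one number to each weight, and the update subtracts a scaled copy of it from all weights at once.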

Answered by German C M on January 6, 2021

All of the weights are updated during backpropagation in the gradient descent algorithm. The size of each weight's update is determined by its gradient and the learning rate.
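A sketch of the same idea using an autodiff framework (PyTorch here, as an assumption; the answer does not name a library): backpropagation computes one gradient per weight, and the SGD step subtracts the learning rate times that gradient from every weight in a single update.

```python
import torch

model = torch.nn.Linear(3, 1)                          # weights: a 1x3 matrix plus a bias
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = torch.nn.MSELoss()

X = torch.randn(100, 3)
y = (X @ torch.tensor([2.0, -1.0, 0.5])).unsqueeze(1)  # synthetic targets

for _ in range(200):
    optimizer.zero_grad()            # clear gradients from the previous step
    loss = loss_fn(model(X), y)
    loss.backward()                  # backprop: fills .grad for every weight
    optimizer.step()                 # w <- w - lr * w.grad, for every weight

print(model.weight.data)             # approaches [2.0, -1.0, 0.5]
```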

Answered by mujjiga on January 6, 2021
