
SVM with gradient descent

Data Science, asked on November 30, 2020

The constrained optimization problem in SVM is given by

$$\min_{w,\,b} \; \frac{1}{2}\lVert w \rVert^2$$
$$\text{s.t.} \quad y^{(i)}\left(w^\top x^{(i)} + b\right) \ge 1 \quad \text{for all } i$$

Now converting this to an unconstrained optimization problem gives the Lagrangian $L$:

$$L(w, b, \alpha) = \frac{1}{2}\lVert w \rVert^2 - \sum_i \alpha_i \left[ y^{(i)}\left(w^\top x^{(i)} + b\right) - 1 \right], \qquad \alpha_i \ge 0$$

And while deriving this (setting $\partial L / \partial w = 0$), we also get an equation for $w$:

$$w = \sum_i \alpha_i\, y^{(i)} x^{(i)}$$
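For reference, substituting this expression for $w$ back into $L$ gives the standard dual objective (folding $b$ into $w$, as done below, also removes the usual $\sum_i \alpha_i y^{(i)} = 0$ constraint that $\partial L / \partial b = 0$ would impose):

$$W(\alpha) = \sum_i \alpha_i - \frac{1}{2} \sum_{i,j} \alpha_i \alpha_j\, y^{(i)} y^{(j)}\, (x^{(i)})^\top x^{(j)}, \qquad \alpha_i \ge 0,$$

which is maximized over $\alpha$, not minimized.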
I've been instructed to implement this using gradient descent, or to give a reason if that is not possible.
This is the approach I tried to follow (a code sketch is given after the list):

  1. Use gradient descent to minimize the Lagrangian and get the minimizing alpha vector.
  2. Then compute the $w$ vector from the equation above, which uses this alpha vector.
     I've also folded the intercept $b$ into $w$ itself (by appending a constant 1 feature to each $x$).
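A minimal sketch of this approach on toy data is below. Everything in it (the data, the learning rate, the iteration count) is illustrative, and since the dual above is maximized, the update shown is projected gradient ascent on $W(\alpha)$ under $\alpha_i \ge 0$ rather than plain descent:

```python
import numpy as np

# Toy linearly separable data; the intercept b is folded into w by
# appending a constant 1 feature to every x.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=2.0, size=(20, 2)),
               rng.normal(loc=-2.0, size=(20, 2))])
X = np.hstack([X, np.ones((X.shape[0], 1))])   # bias column
y = np.concatenate([np.ones(20), -np.ones(20)])

# Q[i, j] = y_i y_j x_i . x_j, so the dual objective is
# W(alpha) = sum(alpha) - 0.5 * alpha @ Q @ alpha, maximized over alpha >= 0.
Yx = y[:, None] * X
Q = Yx @ Yx.T

alpha = np.zeros(len(y))
lr = 1e-3
for _ in range(10_000):
    grad = 1.0 - Q @ alpha                       # gradient of W w.r.t. alpha
    alpha = np.maximum(alpha + lr * grad, 0.0)   # ascent step, then project onto alpha >= 0

# Recover w = sum_i alpha_i y_i x_i; the last entry of w is the intercept b.
w = Yx.T @ alpha
print("w =", w[:-1], "b =", w[-1])
print("train accuracy:", np.mean(np.sign(X @ w) == y))
```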

But it's not working.
I'm very new to ML and I couldn't understand all the math behind this. Any help would be appreciated.

