
Python scipy NLLS optimization: Parameter estimate hugely off, but the visualisation looks fine

Computational Science Asked by Isquare1 on June 30, 2021

I have almost no experience with data modelling, so I would really appreciate any input!
I have to fit some models to my data. Below you see the raw decay, the smoothed version that is actually fitted, and the resulting fit. One crucial parameter to extract is the y-axis value at which the decay starts (r0); in this case it should be about 0.3.
[Figure: raw decay, Savitzky-Golay smoothed curve, and fitted model]

The fit seems fine given the noisy data (although the chi-square is only about 0.1). However, depending on the window size of the filter I apply, I get wildly varying estimates of r0, anywhere from 600 to 10 000, with errors in the thousands (calculated from the covariance matrix). This is very odd, because nothing of the sort shows up in the plot. The other two parameter estimates are somewhat more reasonable. I use SciPy's leastsq and smooth the raw data with a Savitzky-Golay filter. Here is some of the code in case it helps.

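The smoothing step has roughly this shape (a sketch rather than the exact call; the variable name r_raw and the window settings are only illustrative, and the window length is the value I vary):

    import scipy.signal

    # Savitzky-Golay smoothing of the raw decay; window_length is the
    # setting I vary (the values here are just examples)
    r = scipy.signal.savgol_filter(r_raw, window_length=51, polyorder=3)

And the fitting itself (the weights w come from the smoothing step and are not shown):
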
    import numpy as np
    import scipy.optimize

    # Exponential decay model: r(t) = (r0 - rinf) * exp(-t / theta) + rinf
    model_func = lambda t, r0, rinf, theta: (r0 - rinf) * np.exp(-t / theta) + rinf

    # Residuals, weighted by the weights calculated above from the SG filter
    error_func = lambda p, r, t, w: (r - model_func(t, p[0], p[1], p[2])) * w

    # Parameters to optimize: initial values
    p0, p1, p2 = r0_0, rinf_0, theta_0

    full_output = scipy.optimize.leastsq(error_func, x0=[p0, p1, p2],
                                         args=(r, t, w),
                                         full_output=True)

    params_fit, cov_x, infodict, mesg, ier = full_output
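
The errors quoted above come from cov_x, roughly as follows (again a sketch rather than my exact code; as the SciPy documentation notes, leastsq's cov_x has to be multiplied by the residual variance to give the covariance of the parameter estimates):

    # Turn leastsq's scaled cov_x into 1-sigma parameter errors
    residuals = infodict['fvec']               # weighted residuals at the solution
    dof = len(residuals) - len(params_fit)     # degrees of freedom
    s_sq = np.sum(residuals**2) / dof          # residual variance
    pcov = cov_x * s_sq                        # covariance of the parameter estimates
    perr = np.sqrt(np.diag(pcov))              # standard errors for r0, rinf, theta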

What could be the reason for this massive discrepancy between the expected value (about 0.3), the visualisation (which looks pretty much the same), and the fitted value? Sorry if my question is too naive; this is the first time I have been asked to do this, and I have mostly learned from other people's code 🙁
