SciPy: leastsq vs least_squares
SciPy provides two functions for nonlinear least squares problems:

optimize.leastsq() uses the Levenberg-Marquardt algorithm only.

optimize.least_squares() lets us choose among several algorithms: Levenberg-Marquardt ('lm'), Trust Region Reflective ('trf'), and a dogleg algorithm with rectangular trust regions ('dogbox').

Should we always use least_squares() instead of leastsq()?

If so, what purpose does the latter serve?
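For context, the two interfaces can be compared side by side. Below is a minimal sketch fitting a toy exponential model with both functions; the model, data, and starting point are my own illustrative choices, not from the question:

```python
import numpy as np
from scipy.optimize import leastsq, least_squares

# Residuals for fitting y = a * exp(b * x); a toy model chosen for illustration.
def residuals(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y

# Synthetic, noise-free data generated from a=2.0, b=1.5, so the exact
# parameters are recoverable by both solvers.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)

# Legacy interface: returns a tuple (solution, integer status flag).
p_legacy, ier = leastsq(residuals, x0=[1.0, 1.0], args=(x, y))

# Modern interface: returns an OptimizeResult object; here we pick the
# same Levenberg-Marquardt algorithm via method='lm'.
res = least_squares(residuals, x0=[1.0, 1.0], args=(x, y), method='lm')

print(p_legacy)  # solution from leastsq
print(res.x)     # solution from least_squares
```

Note the difference in return types: leastsq hands back a bare tuple, while least_squares returns a structured result with fields such as `x`, `cost`, and `status`.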

Acaroid answered 24/12, 2016 at 17:2 Comment(0)
Short answer

Should we always use least_squares() instead of leastsq()?

Yes.

If so, what purpose does the latter serve?

Backward compatibility.

Explanation

The least_squares function was added in SciPy 0.17. Its documentation refers to leastsq as

A legacy wrapper for the MINPACK implementation of the Levenberg-Marquardt algorithm.

The original commit introducing least_squares actually called leastsq when the method was chosen to be 'lm'. But the contributor (Nikolay Mayorov) then decided that

least_squares might feel more solid and homogeneous if I write a new wrapper to MINPACK functions, instead of calling leastsq.

And so he did. As a result, leastsq is no longer required by least_squares, but I'd expect it to be kept around for a while, to avoid breaking old code.
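One practical reason to prefer the newer interface: least_squares supports features that leastsq never had, such as bound constraints. A minimal sketch, reusing a hypothetical exponential model of my own choosing:

```python
import numpy as np
from scipy.optimize import least_squares

# Residuals for fitting y = a * exp(b * x); illustrative toy model.
def residuals(params, x, y):
    a, b = params
    return a * np.exp(b * x) - y

# Noise-free data generated from a=2.0, b=1.5.
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(1.5 * x)

# Bound constraints are supported only by least_squares (with the 'trf' or
# 'dogbox' methods), not by the legacy leastsq. With bounds given, the
# method defaults to 'trf'.
res = least_squares(residuals, x0=[1.0, 1.0], args=(x, y),
                    bounds=([0.0, 0.0], [10.0, 10.0]))

print(res.x)
```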

Mokas answered 24/12, 2016 at 21:45 Comment(1)
In my experiments it turned out that leastsq is some 10-15% faster than least_squares. Can you comment on this? – Drexler
