I was wondering how I can choose the best minimization method for scipy.optimize.minimize and how different the results may be?
I am trying to minimize the following expression (solve for g):
|a1*g*x + a2*g*x^3 - K|
Scipy has a lecture on Mathematical Optimization, where they have a section on choosing a minimization method. Snippet taken from that section:
Without knowledge of the gradient:
- In general, prefer BFGS or L-BFGS, even if you have to numerically approximate the gradients. These are also the defaults when you omit the `method` parameter, with the choice depending on whether the problem has constraints or bounds.
- On well-conditioned problems, Powell and Nelder-Mead, both gradient-free methods, work well in high dimensions, but they collapse on ill-conditioned problems.
With knowledge of the gradient:
- BFGS or L-BFGS.
- The computational overhead of BFGS is larger than that of L-BFGS, which is itself larger than that of conjugate gradient. On the other hand, BFGS usually needs fewer function evaluations than CG. Thus the conjugate gradient method is better than BFGS at optimizing computationally cheap functions.
With the Hessian:
- If you can compute the Hessian, prefer the Newton method (Newton-CG or TCG).
If you have noisy measurements:
- Use Nelder-Mead or Powell.
If I have interpreted your equation correctly, I think that either BFGS or L-BFGS might work for you.
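As a minimal sketch (the values of `a1`, `a2`, `x`, `K` and the starting point are assumed, since the question doesn't give them), you could set it up like this. Note that minimizing the squared residual instead of the absolute value gives the same minimizer but a smooth objective, which suits BFGS better:

```python
import numpy as np
from scipy.optimize import minimize

# Assumed example values -- the question does not specify them
a1, a2, x, K = 2.0, 0.5, 1.5, 10.0

def objective(g):
    # Squared version of |a1*g*x + a2*g*x^3 - K|: same minimizer,
    # but smooth at the optimum, which BFGS handles better.
    return (a1 * g[0] * x + a2 * g[0] * x**3 - K) ** 2

res = minimize(objective, x0=[1.0], method="BFGS")
print(res.x[0])
```

Since this particular expression is linear in `g`, the exact answer is `K / (a1*x + a2*x**3)`, which is a handy sanity check on whatever method you pick.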
I find that lmfit works much better on real data. I have a problem where lmfit finds the coefficients for 15 parameters of a non-linear system. lmfit (Levenberg-Marquardt) is 5 times faster than minimize's L-BFGS-B, which comes in second, and BFGS, which comes in third. None of the others converge. Since this is real data, the Jacobian and Hessian are not available analytically. I like Nelder-Mead when the number of parameters is small.
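lmfit is a separate package, but SciPy itself exposes Levenberg-Marquardt through `scipy.optimize.least_squares(method="lm")`, so here is a hedged sketch of the same idea using only SciPy. The model, data, and starting values are all invented for illustration (a 2-parameter exponential fit, not the 15-parameter system described above):

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

# Synthetic "measured" data for a simple model y = p0 * exp(p1 * t).
t = np.linspace(0.0, 1.0, 50)
true = np.array([2.0, -1.5])
y = true[0] * np.exp(true[1] * t) + 0.01 * rng.normal(size=t.size)

def residuals(p):
    # Levenberg-Marquardt works on the residual vector, not a scalar cost
    return p[0] * np.exp(p[1] * t) - y

fit = least_squares(residuals, x0=[1.0, -1.0], method="lm")
print(fit.x)
```

The key difference from `minimize` is that Levenberg-Marquardt exploits the least-squares structure of the residuals, which is often why it converges faster on noisy real-world fits.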