I am trying to understand how the "dogleg" method works in Python's scipy.optimize.minimize
function. I am adapting the example at the bottom of the help page.
According to the notes, the dogleg method requires Jacobian and Hessian arguments. For these I use the numdifftools package:
import numpy as np
from scipy.optimize import minimize
from numdifftools import Jacobian, Hessian
def fun(x, a):
    return (x[0] - 1)**2 + (x[1] - a)**2
x0 = np.array([2,0]) # initial guess
a = 2.5
res = minimize(fun, x0, args=(a), method='dogleg',
               jac=Jacobian(fun)([2, 0]), hess=Hessian(fun)([2, 0]))
print(res)
Edit:
If I make a change as suggested by a post below,
res = minimize(fun, x0, args=a, method='dogleg',
               jac=Jacobian(lambda x: fun(x, a)),
               hess=Hessian(lambda x: fun(x, a)))
I get the error TypeError: <lambda>() takes 1 positional argument but 2 were given. What am I doing wrong?
Also, is it correct to evaluate the Jacobian and Hessian at the initial guess x0?
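For comparison, here is a self-contained sketch of what I expected to work, using analytic derivatives in place of numdifftools (the derivatives of this quadratic are easy to write by hand). The scipy docs say that args is passed to the objective function and its derivatives alike, so I have written jac and hess to accept the same extra argument a:

    import numpy as np
    from scipy.optimize import minimize

    def fun(x, a):
        return (x[0] - 1)**2 + (x[1] - a)**2

    # Analytic gradient: d/dx0 = 2*(x0 - 1), d/dx1 = 2*(x1 - a).
    # Note it takes `a` as well, since args is forwarded to jac too.
    def jac(x, a):
        return np.array([2 * (x[0] - 1), 2 * (x[1] - a)])

    # Analytic Hessian of this quadratic is constant: 2*I.
    def hess(x, a):
        return 2 * np.eye(2)

    a = 2.5
    x0 = np.array([2.0, 0.0])
    res = minimize(fun, x0, args=(a,), method='dogleg', jac=jac, hess=hess)
    print(res.x)  # should converge near [1, 2.5]

Note args=(a,) here, with the trailing comma, so that it is a one-element tuple.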
With Jacobian and Hessian I get the error ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all(). Also, if I want to specify more than one argument, would I then need to use something like args=(a, b, c)? – Sleight