NonlinearConstraint in SciPy Optimize

I'm trying to use the optimization module in SciPy, just writing short trial programs. I can get solutions when there are linear constraints, but the Hessian definition just doesn't work. I've used the example from the SciPy docs, but I get an error as soon as I try to replace the built-in Rosenbrock function and its Hessian with my own.

I also tried a simple problem I found online; my code is:

import numpy as np
from scipy import optimize
from scipy.optimize import NonlinearConstraint

def fun(x):
    return x[0]**2+x[1]**2-8*x[1]+16

bounds = optimize.Bounds([0,0,0],[np.inf,np.inf,np.inf])

def cons_f(x):
    return x[0]**2+x[1]**2+x[2]
def cons_J(x):
    return [2*x[0],2*x[1],1]
def cons_H(x,v):
    return v[0]*[2,2,0]
nonlinear_constraint = optimize.NonlinearConstraint(cons_f, -np.inf, 6, jac=cons_J, hess=cons_H)

x0=[1,1]
res = optimize.minimize(fun, x0, method='trust-constr', jac=cons_J, hess=cons_H,
               constraints=[nonlinear_constraint],
               options={'verbose': 1}, bounds=bounds)
print(res.x)

I get the following error for both cases:

Traceback (most recent call last):
  File "C:\Users\user\OneDrive - EOP\Escritorio\Test.py", line 19, in <module>
    res = optimize.minimize(fun, x0, method='trust-constr', jac=cons_J, hess=cons_H,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_minimize.py", line 634, in minimize
    return _minimize_trustregion_constr(fun, x0, args, jac, hess, hessp,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_trustregion_constr\minimize_trustregion_constr.py", line 332, in _minimize_trustregion_constr
    objective = ScalarFunction(fun, x0, args, grad, hess,
  File "C:\Users\user\AppData\Local\Programs\Python\Python39\lib\site-packages\scipy\optimize\_differentiable_functions.py", line 163, in __init__
    self.H = hess(np.copy(x0), *args)
TypeError: cons_H() missing 1 required positional argument: 'v'
Rask asked 4/11, 2021 at 12:35

There are several things going wrong here:

  1. By setting jac=cons_J and hess=cons_H in the call to minimize, you are using the derivatives of the constraint function as the objective's derivatives, which is probably not what you want to do.
  2. The constraint Hessian cons_H is wrong. For method='trust-constr', the hess callback of a NonlinearConstraint receives the point x and a vector v of Lagrange multipliers (one per constraint component) and must return the linear combination v[0]*H_1(x) + v[1]*H_2(x) + ... of the constraint Hessians. Your v[0]*[2,2,0] multiplies a plain Python list, which is not elementwise math, and [2,2,0] is not the Hessian matrix anyway (see the sketch just below this list).
  3. Your constraint function is a function of three variables, but your initial guess x0 lets minimize think you have an optimization problem of two variables.
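
To make point 2 concrete, here is a minimal sketch of the callback shape trust-constr expects. The two constraint components are invented purely for illustration (they are not from your problem); the point is that hess receives x and the multiplier vector v and returns a weighted sum of the per-component Hessian matrices:

import numpy as np
from scipy.optimize import NonlinearConstraint

# Hypothetical two-component constraint: c1(x) = x0**2 + x1**2
# and c2(x) = x0*x1, stacked into one vector-valued constraint.
def cons_f(x):
    return np.array([x[0]**2 + x[1]**2, x[0]*x[1]])

def cons_J(x):
    return np.array([[2*x[0], 2*x[1]],
                     [x[1],   x[0]]])

# v holds one Lagrange multiplier per constraint component; the
# callback must return the matrix v[0]*H1(x) + v[1]*H2(x),
# not a flat list.
def cons_H(x, v):
    H1 = np.array([[2, 0], [0, 2]])   # Hessian of c1
    H2 = np.array([[0, 1], [1, 0]])   # Hessian of c2
    return v[0]*H1 + v[1]*H2

con = NonlinearConstraint(cons_f, -np.inf, [6, 1], jac=cons_J, hess=cons_H)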

After fixing all problems, your code could look like this:

import numpy as np
from scipy.optimize import Bounds, minimize, NonlinearConstraint

# objective and derivatives
def fun(x):
    return x[0]**2+x[1]**2-8*x[1]+16

def grad(x):
    return np.array([2*x[0], 2*x[1]-8, 0])

def hess(x):
    return np.array([[2, 0, 0], [0, 2, 0], [0, 0, 0]])

# constraint function and derivatives
def cons_f(x):
    return x[0]**2 + x[1]**2 + x[2]

def cons_J(x):
    return [2*x[0], 2*x[1], 1]

def cons_H(x, v):
    return v[0] * np.array([[2, 0, 0], [0, 2, 0], [0, 0, 0]])


# variable bounds
bounds = Bounds([0, 0, 0], [np.inf, np.inf, np.inf])

# constraint
con = NonlinearConstraint(cons_f, -np.inf, 6, jac=cons_J, hess=cons_H)

# initial guess
x0 = [1, 1, 1]

res = minimize(fun, x0, method='trust-constr', jac=grad, hess=hess,
               constraints=[con], bounds=bounds)
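
Assuming the solver converges, printing the result should reproduce the analytic solution: the objective equals x0**2 + (x1-4)**2, so with the constraint active and x0 = x2 = 0 the minimizer is x1 = sqrt(6) ≈ 2.449, with objective value 22 - 8*sqrt(6) ≈ 2.404:

print(res.x)    # should be approximately [0, 2.449, 0]
print(res.fun)  # should be approximately 2.404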
Megaera answered 4/11, 2021 at 14:32
I didn't really understand what the docs meant by a linear combination of the Hessians, it seems. The fact that x0 was initialized with just two variables was a mistake left over from solving a problem with linear constraints earlier; I forgot to update it. Thank you so much for your feedback. - Martainn
