Is there any quadratic programming function that can have both lower and upper bounds - Python

Normally I have been using GNU Octave to solve quadratic programming problems.

I solve problems like

min 1/2 x'Qx + c'x

subject to

A*x <= b
lb <= x <= ub

Where lb and ub are the lower and upper bounds, i.e., the limits on x.

My Octave code looks like this when I solve such a problem; it's just one line:

U = quadprog(Q, c, A, b, [], [], lb, ub);

The square brackets [] are empty because I don't need the equality constraints

Aeq*x = beq

So my question is: is there an easy-to-use quadratic programming solver in Python for solving problems of the form

min 1/2 x'Qx + c'x

subject to

A*x <= b
lb <= x <= ub

Or subject to

b_lb <= A*x <= b_ub
lb <= x <= ub
Erin answered 22/4, 2019 at 20:18 Comment(4)

Look in scipy. (Consensus)
Another option is to use cvxopt. (Accuse)
You could also take a look at qpsolvers. (Nauseating)
Have a look at lsq_linear. (Pirog)

If you need a general quadratic programming solver like quadprog, I would suggest the open-source software cvxopt as noted in one of the comments. This is robust and really state-of-the-art. The main contributor is a major expert in the field and the co-author of a classic book on Convex Optimization.

The function you want to use is cvxopt.solvers.qp. A simple wrapper that lets you call it with NumPy arrays, quadprog-style, is the following. Note that bounds can be included as a special case of inequality constraints.

import numpy as np
from cvxopt import matrix, solvers

def quadprog(P, q, G=None, h=None, A=None, b=None, options=None):
    """
    Quadratic programming problem with both linear equalities and inequalities

        Minimize      0.5 * x @ P @ x + q @ x
        Subject to    G @ x <= h
        and           A @ x = b
    """
    # convert NumPy arrays to cvxopt matrices
    P, q = matrix(P), matrix(q)

    if G is not None:
        G, h = matrix(G), matrix(h)

    if A is not None:
        A, b = matrix(A), matrix(b)

    sol = solvers.qp(P, q, G, h, A, b, options=options)

    return np.array(sol['x']).ravel()

cvxopt used to be difficult to install, but is nowadays also included in the Anaconda distribution and can be installed (even on Windows) with conda install cvxopt.
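
For the bound constraints in the question, lb <= x <= ub can be folded into G and h by stacking identity blocks, since x <= ub and -x <= -lb are just additional rows of inequalities. A minimal sketch using the wrapper above (the data reuses the example from the other answer; all names here are illustrative):

import numpy as np

Q = np.array([[ 1., -1.],
              [-1.,  2.]])
c = np.array([-2., -6.])
A_in = np.array([[ 1., 1.],
                 [-1., 2.],
                 [ 2., 1.]])
b_in = np.array([2., 2., 3.])
n = Q.shape[0]
lb, ub = np.zeros(n), 2.0 * np.ones(n)

# stack A_in @ x <= b_in, x <= ub and -x <= -lb into one system G @ x <= h
G = np.vstack([A_in, np.eye(n), -np.eye(n)])
h = np.concatenate([b_in, ub, -lb])

x = quadprog(Q, c, G, h)  # the wrapper defined above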

If instead, you are interested in the more specific case of linear least-squares optimisation with bounds, which is a subset of the general quadratic programming, namely

Minimize || A @ x - b ||
subject to lb <= x <= ub

Then Scipy has the specific function scipy.optimize.lsq_linear(A, b, bounds).
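
A minimal sketch of such a call (the data here is made up for illustration):

import numpy as np
from scipy.optimize import lsq_linear

A = np.array([[1., 2.],
              [3., 4.],
              [5., 6.]])
b = np.array([1., 2., 3.])

# minimise ||A @ x - b|| subject to 0 <= x <= 2 elementwise
res = lsq_linear(A, b, bounds=(0.0, 2.0))
print(res.x)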

Note that the accepted answer is a very inefficient approach and should not be recommended. It makes no use of the crucial fact that the function you want to optimize is quadratic but instead uses a generic nonlinear optimization program and does not even specify the analytic gradient.

Blair answered 11/12, 2019 at 13:29 Comment(0)

You can write your own solver based on scipy.optimize; here is a small example of how to code your own custom Python quadprog():

# python3
import numpy as np
from scipy import optimize

class quadprog(object):

    def __init__(self, H, f, A, b, x0, lb, ub):
        self.H    = H
        self.f    = f
        self.A    = A
        self.b    = b
        self.x0   = x0
        # the same (lb, ub) bounds are applied to every component of x
        self.bnds = tuple((lb, ub) for _ in x0)
        # call solver
        self.result = self.solver()

    def objective_function(self, x):
        # 0.5 * x' H x + f' x
        return 0.5*np.dot(np.dot(x.T, self.H), x) + np.dot(self.f.T, x)

    def solver(self):
        # scipy expresses inequality constraints as fun(x) >= 0,
        # so A @ x <= b becomes b - A @ x >= 0
        cons = ({'type': 'ineq', 'fun': lambda x: self.b - np.dot(self.A, x)})
        optimum = optimize.minimize(self.objective_function,
                                    x0          = self.x0,
                                    bounds      = self.bnds,
                                    constraints = cons,
                                    tol         = 1e-3)
        return optimum

Here is how to use this, using the same variables from the first example provided in matlab-quadprog:

# init vars
H  = np.array([[ 1, -1],
               [-1,  2]])

f  = np.array([-2, -6]).T

A  = np.array([[ 1, 1],
               [-1, 2],
               [ 2, 1]])

b  = np.array([2, 2, 3]).T
x0 = np.array([1, 2])
lb = 0
ub = 2

# call the custom quadprog (use a different name for the instance
# so it does not shadow the class)
qp = quadprog(H, f, A, b, x0, lb, ub)
print(qp.result)

The output of this short snippet is:

     fun: -8.222222222222083
     jac: array([-2.66666675, -4.        ])
 message: 'Optimization terminated successfully.'
    nfev: 8
     nit: 2
    njev: 2
  status: 0
 success: True
       x: array([0.66666667, 1.33333333])

For more information on how to use scipy.optimize.minimize please refer to the docs.

Vansickle answered 4/5, 2019 at 17:05 Comment(0)

You could use the solve_qp function from qpsolvers. It solves quadratic programs in the following form:

minimize_x  1/2 x' P x + q'x
subject to  G x <= h
            A x == b
            lb <= x <= ub

The function wraps the many QP solvers available in Python (full list here) via its solver keyword argument. Make sure to try different solvers to find the one that fits your problem best.

Here is a snippet for solving a small problem:

from numpy import array, dot
from qpsolvers import solve_qp

M = array([[1., 2., 0.], [-8., 3., 2.], [0., 1., 1.]])
P = dot(M.T, M)  # this is a positive definite matrix
q = dot(array([3., 2., 3.]), M)
G = array([[1., 2., 1.], [2., 0., 1.], [-1., 2., -1.]])
h = array([3., 2., -2.])
A = array([1., 1., 1.])
b = array([1.])

x = solve_qp(P, q, G, h, A, b, solver="osqp")
print(f"QP solution: x = {x}")

And if you are interested in linear least-squares with linear or box (bound) constraints, there is also a solve_ls function. Here is a short tutorial on solving such problems.
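
A minimal sketch of a bounded least-squares call, assuming solve_ls takes the least-squares matrix and target as its first two arguments and the same bound keywords as solve_qp (check the linked tutorial for the exact signature; the data is made up):

from numpy import array, ones
from qpsolvers import solve_ls

R = array([[1., 2.], [3., 4.], [5., 6.]])
s = array([1., 2., 3.])

# minimise ||R @ x - s|| subject to 0 <= x <= 1 elementwise
x = solve_ls(R, s, lb=0.0 * ones(2), ub=1.0 * ones(2), solver="osqp")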

Cony answered 13/11, 2022 at 15:14 Comment(1)

There is an easier way to do this: use Hildreth's algorithm. A few lines of MATLAB code that solve the QP :) (Erin)
