Constrained Linear Regression in Python

I have a classic linear regression problem of the form:

y = X b

where y is a response vector, X is a matrix of input variables, and b is the vector of fit parameters I am searching for.

Python provides b = numpy.linalg.lstsq( X , y )[0] for solving problems of this form (lstsq returns a tuple whose first element is the solution).

However, when I use this I tend to get either extremely large or extremely small values for the components of b.

I'd like to perform the same fit, but constrain the values of b between 0 and 255.

It looks like scipy.optimize.fmin_slsqp() is an option, but I found it extremely slow for the size of problem I'm interested in (X is something like 3375 by 1500 and hopefully even larger).

  1. Are there any other Python options for performing constrained least squares fits?
  2. Or are there Python routines for performing Lasso Regression, Ridge Regression, or some other regression method which penalizes large b coefficient values?
Rau answered 14/4, 2012 at 15:46 Comment(1)
sklearn LASSO: google.com/… – Collop

Recent SciPy versions include a bounded least-squares solver, scipy.optimize.lsq_linear:

https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.lsq_linear.html#scipy.optimize.lsq_linear
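
A minimal sketch of how it might be used for the bounds in the question (the random X and y here are illustrative stand-ins for the real data):

# Bounded linear least squares: constrain every coefficient to [0, 255]
import numpy as np
from scipy.optimize import lsq_linear

X = np.random.rand(100, 10)   # illustrative stand-in for the real data
y = np.random.rand(100)
res = lsq_linear(X, y, bounds=(0, 255))
b = res.x                     # fitted coefficients, each within [0, 255]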

Bloodstained answered 14/4, 2012 at 15:53 Comment(3)
Nice, on the surface that sounds like exactly what I need. Being able to weight the rows of the X input matrix may also be very useful to me (I do have a sense of the reliability of various data points, which that option may let me take advantage of). I'll definitely give it a try, thanks! – Rau
It is not really well tested, though; I hope it will work for you. The code is pure Python and should be easy to test. – Bloodstained
scipy.optimize.nnls is a good tip as well. Simply constraining to non-negative values may indeed be enough: numpy.linalg.lstsq solutions seemed to be balancing out huge positive b values with equally huge negative b values. – Rau
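
A quick sketch of that non-negative approach (again with illustrative random data; nnls constrains every coefficient to be >= 0):

# Non-negative least squares with scipy
import numpy as np
from scipy.optimize import nnls

X = np.random.rand(100, 10)     # illustrative stand-in for the real data
y = np.random.rand(100)
b, residual_norm = nnls(X, y)   # every entry of b is >= 0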

You mention you would find Lasso Regression or Ridge Regression acceptable. These and many other constrained linear models are available in the scikit-learn package. Check out the section on generalized linear models.

Usually constraining the coefficients involves some kind of regularization parameter (C or alpha). Some of the models (the ones ending in CV) can use cross-validation to automatically set these parameters. You can also further constrain models to use only positive coefficients; for example, there is an option for this on the Lasso model.
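
A minimal sketch of both ideas (the alpha value and the data are illustrative, not tuned): Lasso with a fixed regularization strength and positive coefficients, and LassoCV choosing alpha by cross-validation:

# Penalized regression with scikit-learn
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

X = np.random.rand(100, 10)   # illustrative stand-in for the real data
y = np.random.rand(100)

model = Lasso(alpha=0.1, positive=True).fit(X, y)   # fixed alpha, coefficients >= 0
model_cv = LassoCV(positive=True).fit(X, y)         # alpha set by cross-validation
print(model.coef_)
print(model_cv.alpha_)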

Tache answered 30/5, 2012 at 10:55 Comment(0)

scipy-optimize-leastsq-with-bound-constraints on SO gives leastsq_bounds, which is scipy leastsq + bound constraints such as 0 <= x_i <= 255.
(Scipy leastsq wraps MINPACK, one of several implementations of the widely-used Levenberg–Marquardt algorithm a.k.a. damped least-squares.
There are various ways of implementing bounds; leastsq_bounds is, I think, the simplest.)
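
leastsq_bounds itself is code posted in that SO answer, so I won't reproduce it here; a rough modern alternative is scipy.optimize.least_squares, which supports box bounds directly. A minimal sketch with illustrative random data:

# Bounded least squares via scipy.optimize.least_squares
import numpy as np
from scipy.optimize import least_squares

X = np.random.rand(100, 10)        # illustrative stand-in for the real data
y = np.random.rand(100)

def residuals(b):
    return X @ b - y               # residuals of the linear model

x0 = np.full(X.shape[1], 128.0)    # start strictly inside the bounds
res = least_squares(residuals, x0, bounds=(0, 255))
b = res.x                          # each b_i within [0, 255]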

Thalamencephalon answered 15/4, 2012 at 11:21 Comment(0)

As @conradlee says, you can find Lasso and Ridge Regression implementations in the scikit-learn package. These regressors serve your purpose if you just want your fit parameters to be small or positive.

However, if you want to impose any other range as a bound for the fit parameters, you can build your own constrained regressor with the same package. See the answer by David Dale to this question for an example.
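
A minimal sketch of that idea, wrapping scipy.optimize.lsq_linear in an sklearn-style class (the class name and the 0 to 255 default bounds are my own illustrative choices, not from the linked answer):

# A hand-rolled box-constrained linear regressor
import numpy as np
from scipy.optimize import lsq_linear

class ConstrainedLinearRegression:
    def __init__(self, lower=0.0, upper=255.0):
        self.lower = lower      # lower bound on every coefficient
        self.upper = upper      # upper bound on every coefficient

    def fit(self, X, y):
        res = lsq_linear(X, y, bounds=(self.lower, self.upper))
        self.coef_ = res.x      # bounded least-squares solution
        return self

    def predict(self, X):
        return X @ self.coef_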

Eightieth answered 20/8, 2019 at 2:19 Comment(0)

I recently prepared some tutorials on Linear Regression in Python. Here is one of the options (Gekko) that includes constraints on the coefficients.

# Constrained Multiple Linear Regression
import numpy as np
nd = 100 # number of data rows
nc = 5   # number of inputs
x = np.random.rand(nd,nc)
y = np.random.rand(nd)

from gekko import GEKKO
m = GEKKO(remote=False); m.options.IMODE=2  # IMODE=2: regression mode
c  = m.Array(m.FV,nc+1)  # fitted parameters: nc slopes + 1 intercept
for ci in c:
    ci.STATUS=1          # let the solver adjust this parameter
    ci.LOWER = -10       # lower bound on the coefficient
    ci.UPPER =  10       # upper bound on the coefficient
xd = m.Array(m.Param,nc)
for i in range(nc):
    xd[i].value = x[:,i]
yd = m.Param(y); yp = m.Var()  # measured response and model prediction
s =  m.sum([c[i]*xd[i] for i in range(nc)])
m.Equation(yp==s+c[-1])        # linear model: yp = sum(c_i * x_i) + intercept
m.Minimize((yd-yp)**2)         # least-squares objective
m.solve(disp=True)
a = [c[i].value[0] for i in range(nc+1)]
print('Solve time: ' + str(m.options.SOLVETIME))
print('Coefficients: ' + str(a))

It uses the nonlinear solver IPOPT, which handles this problem better than the scipy.optimize.minimize solver. There are other constrained optimization methods in Python as well, as discussed in Is there a high quality nonlinear programming solver for Python?

Craft answered 4/9, 2020 at 0:41 Comment(0)
