I have a classic linear regression problem of the form:

y = X b

where `y` is a response vector, `X` is a matrix of input variables, and `b` is the vector of fit parameters I am searching for.
Python provides `b = numpy.linalg.lstsq(X, y)[0]` for solving problems of this form (`lstsq` returns a tuple whose first element is the solution). However, when I use this I tend to get either extremely large or extremely small values for the components of `b`.
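For reference, here is a minimal version of what I'm doing, with toy sizes and random data standing in for my real problem:

```python
import numpy as np

# Toy sizes standing in for my real ~3375 x 1500 problem
rng = np.random.default_rng(0)
X = rng.random((30, 15))
y = rng.random(30)

# lstsq returns a 4-tuple; the solution vector is the first element
b, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(b.shape)  # (15,)
```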
I'd like to perform the same fit, but constrain the values of `b` between 0 and 255.
It looks like `scipy.optimize.fmin_slsqp()` is an option, but I found it extremely slow for the size of problem I'm interested in (`X` is something like 3375 by 1500, and hopefully even larger).
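In case it helps, this is roughly how I set up the `fmin_slsqp` version (again with toy sizes; the real `X` is what makes it slow):

```python
import numpy as np
from scipy.optimize import fmin_slsqp

rng = np.random.default_rng(0)
X = rng.random((30, 15))
y = rng.random(30)

def sse(b):
    # Sum of squared residuals: the least-squares objective
    r = X @ b - y
    return r @ r

b0 = np.full(X.shape[1], 128.0)        # start mid-range
bounds = [(0.0, 255.0)] * X.shape[1]   # box constraint on each coefficient
b = fmin_slsqp(sse, b0, bounds=bounds, iprint=0)
```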
- Are there any other Python options for performing constrained least-squares fits?
- Or are there Python routines for performing Lasso regression or ridge regression, or some other regression method that penalizes large `b` coefficient values?
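To make concrete what I mean by penalizing large coefficients: a plain-NumPy ridge sketch looks like this (the `alpha` value here is arbitrary and would need tuning):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((30, 15))
y = rng.random(30)

# Ridge regression: minimize ||X b - y||^2 + alpha * ||b||^2,
# whose closed-form solution is (X^T X + alpha I)^-1 X^T y
alpha = 1.0
n = X.shape[1]
b_ridge = np.linalg.solve(X.T @ X + alpha * np.eye(n), X.T @ y)

# The penalty shrinks the coefficients relative to plain least squares
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
```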