Linear Regression with positive coefficients in Python

I'm trying to find a way to fit a linear regression model with positive coefficients.

The only option I found is sklearn's Lasso model, which has a positive=True argument, but the docs advise against using it with alpha=0 (i.e. with no regularization on the weights).

Do you know of another model/method/way to do it?

Antimagnetic answered 14/3, 2016 at 11:43

IIUC, this is a problem that can be solved with scipy.optimize.nnls, which performs non-negative least squares.

Solve argmin_x || Ax - b ||_2 for x>=0.

In your case, b is your y, A is your X, and x is the vector of β coefficients; otherwise it's the same problem, no?
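
For instance, here is a minimal sketch of that mapping; the data below is made up purely to illustrate the shapes:

    import numpy as np
    from scipy.optimize import nnls

    # Made-up data: 100 samples, 3 features, non-negative true coefficients.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.5, 0.0, 2.0]) + rng.normal(scale=0.1, size=100)

    # nnls solves argmin_x ||Ax - b||_2 subject to x >= 0,
    # so passing A = X and b = y gives non-negative regression coefficients.
    coef, residual_norm = nnls(X, y)
    print(coef)  # every entry is >= 0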

Yetac answered 14/3, 2016 at 11:56

Several functions can fit a linear regression model with positive coefficients:

  1. scipy.optimize.nnls can solve the problem above.
  2. scikit-learn's LinearRegression can be given positive=True to solve this. Under the hood, sklearn also uses scipy.optimize.nnls; interestingly, its source code shows how multiple-target outputs are handled.
  3. Additionally, if you want to solve linear least squares with bounds on the variables, you can use scipy.optimize.lsq_linear (see the sketch after this list).
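
As a rough sketch of option 3 on made-up data: lsq_linear takes explicit lower/upper bounds, so non-negativity is just bounds=(0, np.inf):

    import numpy as np
    from scipy.optimize import lsq_linear

    # Made-up data with non-negative true coefficients.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.5, 0.0, 2.0]) + rng.normal(scale=0.1, size=100)

    # Constrain every coefficient to [0, inf); per-coefficient bounds
    # can be given as arrays instead of scalars if needed.
    result = lsq_linear(X, y, bounds=(0, np.inf))
    print(result.x)  # the bounded least-squares coefficients
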
Oratorio answered 15/6, 2021 at 6:35

As of version 0.24, scikit-learn's LinearRegression includes a similar positive argument, which does exactly that; from the docs:

positive : bool, default=False

When set to True, forces the coefficients to be positive. This option is only supported for dense arrays.

New in version 0.24.
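
A minimal usage sketch on made-up data (note that, as far as I can tell, the constraint applies to the coefficients only, not to the intercept):

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Made-up data with non-negative true coefficients.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = X @ np.array([1.5, 0.0, 2.0]) + rng.normal(scale=0.1, size=100)

    # positive=True (scikit-learn >= 0.24) forces all coefficients to be >= 0;
    # it is only supported for dense input arrays.
    reg = LinearRegression(positive=True).fit(X, y)
    print(reg.coef_)       # non-negative coefficients
    print(reg.intercept_)  # fitted as usual, not constrained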

Encrimson answered 15/3, 2021 at 18:26
