Scoring Metric for scikit-learn's LassoCV

I'm using scikit-learn's LassoCV function. During cross-validation, what scoring metric is being used by default?

I would like cross-validation to be based on "mean squared error regression loss". Can this metric be used with LassoCV? Since one can specify a scoring metric for LogisticRegressionCV, perhaps the same is possible with LassoCV?

Tertullian answered 22/5, 2017 at 5:22 Comment(3)
Not possible in the current implementation. You could open an issue on the scikit-learn GitHub page and see what the response is. – Menhir
Do you know what the current scoring metric is? – Tertullian
R2 is the default metric for most regression estimators. See the description of score() for LassoCV. – Menhir

LassoCV uses R^2 as the scoring metric. From the docs:

By default, parameter search uses the score function of the estimator to evaluate a parameter setting. These are the sklearn.metrics.accuracy_score for classification and sklearn.metrics.r2_score for regression.
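
As a quick illustration, this minimal sketch checks that LassoCV's score() method indeed reports R^2 (the synthetic data from make_regression is my own assumption, purely for illustration):

    # Sketch: LassoCV.score() delegates to r2_score.
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LassoCV
    from sklearn.metrics import r2_score

    X, y = make_regression(n_samples=200, n_features=10, noise=1.0, random_state=0)
    model = LassoCV(cv=5).fit(X, y)

    # These two printed values are identical.
    print(model.score(X, y))
    print(r2_score(y, model.predict(X)))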

To use an alternative scoring metric, such as mean squared error, you need to use GridSearchCV or RandomizedSearchCV (instead of LassoCV) and specify the scoring parameter as scoring='neg_mean_squared_error'. From the docs:

An alternative scoring function can be specified via the scoring parameter to GridSearchCV, RandomizedSearchCV and many of the specialized cross-validation tools described below.
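
Here is a minimal sketch of that approach (the alpha grid and the synthetic data are assumptions for illustration, not recommendations):

    # Sketch: tune Lasso's alpha by mean squared error with GridSearchCV.
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import GridSearchCV

    X, y = make_regression(n_samples=200, n_features=10, noise=1.0, random_state=0)
    search = GridSearchCV(
        Lasso(max_iter=10000),
        param_grid={"alpha": np.logspace(-4, 1, 30)},  # hypothetical grid
        scoring="neg_mean_squared_error",  # negated so higher is better
        cv=5,
    )
    search.fit(X, y)
    print(search.best_params_["alpha"])

Note that scikit-learn negates the MSE so the scorer follows its "greater is better" convention; maximizing neg_mean_squared_error selects the same alpha as minimizing MSE.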

Uphroe answered 10/11, 2017 at 11:14 Comment(0)
P
3

I think the accepted answer is wrong, as it quotes the documentation for grid search, but LassoCV uses regularisation paths, not grid search. In fact, the docs page for LassoCV says that the objective function is:

(1 / (2 * n_samples)) * ||y - Xw||^2_2 + alpha * ||w||_1

Meaning that it's minimising the MSE (plus the lasso penalty term).
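
You can also see this in a fitted LassoCV: it records the per-fold cross-validation MSE for every candidate alpha and picks the alpha with the lowest average. A minimal sketch (the synthetic data is an assumption, for illustration only):

    # Sketch: LassoCV stores per-fold CV mean squared errors in mse_path_.
    from sklearn.datasets import make_regression
    from sklearn.linear_model import LassoCV

    X, y = make_regression(n_samples=200, n_features=10, noise=1.0, random_state=0)
    model = LassoCV(cv=5).fit(X, y)

    print(model.mse_path_.shape)  # (n_alphas, n_folds)
    print(model.alpha_)  # the alpha minimising mse_path_ averaged across folds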

Privative answered 26/8, 2019 at 5:8 Comment(1)
I don't think it's wrong. I think you two are answering different questions: the accepted answer addresses the OP's question, and you are answering a different one. The OP asked for the "scoring metric" DURING CROSS VALIDATION. That's not equivalent to asking what objective function is solved, which is what you're answering, I think? – Genitourinary
