How to specify the positive label when using precision as scoring in GridSearchCV

import sklearn.model_selection

model = sklearn.model_selection.GridSearchCV(
        estimator = est, 
        param_grid = param_grid,
        scoring = 'precision',
        verbose = 1,
        n_jobs = 1,
        iid = True,
        cv = 3)

With sklearn.metrics.precision_score(y, y_pred, pos_label=0) I can specify the positive label. How can I specify it in GridSearchCV as well?
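For example, with made-up toy labels, pos_label=0 treats class 0 as the positive class:

from sklearn.metrics import precision_score

y_true = [0, 0, 1, 1, 0]
y_pred = [0, 1, 1, 1, 0]

# precision for class 0: of the two samples predicted as 0, both are truly 0 -> 1.0
print(precision_score(y_true, y_pred, pos_label=0))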

If there is no direct way to specify it, how can I define a custom scorer that does?

I have tried this:

custom_score = make_scorer(precision_score(y, y_pred, pos_label=0),
                           greater_is_better=True)

but I got this error:

NameError: name 'y_pred' is not defined
Egmont answered 19/6, 2018 at 17:4

Reading the docs, you can pass any additional kwargs to make_scorer and they will be forwarded to the score_func callable. Note that make_scorer expects the metric function itself, not the result of calling it, which is why your attempt raised a NameError.

from sklearn.metrics import precision_score, make_scorer
custom_scorer = make_scorer(precision_score, greater_is_better=True, pos_label=0)

Then you pass this custom_scorer to GridSearchCV:

gs = GridSearchCV(est, ..., scoring=custom_scorer)
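For a self-contained sketch, here is one way the pieces fit together; the dataset, estimator, and parameter grid below are made up purely for illustration:

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import make_scorer, precision_score
from sklearn.model_selection import GridSearchCV

# Toy data and estimator, purely illustrative
X, y = make_classification(n_samples=200, random_state=0)
est = LogisticRegression(solver='liblinear')

# Score each candidate by precision, treating class 0 as the positive class
custom_scorer = make_scorer(precision_score, pos_label=0)

gs = GridSearchCV(
    estimator=est,
    param_grid={'C': [0.1, 1.0, 10.0]},
    scoring=custom_scorer,
    cv=3)
gs.fit(X, y)
print(gs.best_params_, gs.best_score_)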
Hermaphroditism answered 19/6, 2018 at 17:43
