How to use custom base classifiers in sklearn ensemble methods?

I read that the built-in ensemble methods in sklearn use decision trees as the base classifiers. Is it possible to use custom classifiers instead?

Opera answered 9/5, 2012 at 14:39 Comment(0)

If you mean the random forest classes, then no, this is currently not possible. The option to allow other base estimators was discussed on the scikit-learn mailing list last January, but I don't believe any actual code has come out of that discussion.

Nudge answered 9/5, 2012 at 16:10 Comment(0)

If you use sklearn.ensemble.AdaBoostClassifier, then the answer is yes: scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostClassifier.html. You can assign base_estimator yourself.
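
A minimal sketch of that idea (the synthetic dataset and the LogisticRegression base estimator are illustrative choices, not from the original answer; note that newer scikit-learn releases renamed base_estimator to estimator):

    # Boost a non-tree base estimator with AdaBoost.
    # LogisticRegression works here because it supports sample weights
    # and predict_proba, which AdaBoost relies on.
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=200, random_state=0)

    clf = AdaBoostClassifier(
        base_estimator=LogisticRegression(max_iter=1000),  # any weight-aware classifier
        n_estimators=25,
    )
    clf.fit(X, y)
    print(clf.score(X, y))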

Herzog answered 8/5, 2014 at 23:31 Comment(0)

I don't know if it helps, but you can very easily stack/combine custom classifiers using the Pipeline utilities: http://scikit-learn.org/stable/tutorial/statistical_inference/putting_together.html#pipelining
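
A minimal sketch of that pattern, assuming a generic scikit-learn setup (the scaler, SVC, and synthetic data are illustrative choices, not from the original answer):

    # Chain a preprocessing step and a classifier into a single estimator.
    from sklearn.datasets import make_classification
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=200, random_state=0)

    pipe = Pipeline([
        ("scale", StandardScaler()),  # preprocessing step
        ("svc", SVC()),               # final classifier
    ])
    pipe.fit(X, y)
    print(pipe.score(X, y))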

Xylem answered 22/5, 2013 at 12:33 Comment(2)
Pipelines are not ensemble methods. They only combine a single classifier with a series of preprocessing steps. – Nudge
Yes, you are right. But what I meant is that Pipeline and FeatureUnion can be used together to combine homogeneous or heterogeneous models in a few lines of code (see the sketch below). Ramp (github.com/kvh/ramp) uses this principle a lot, for example. – Xylem
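
A rough sketch of the FeatureUnion idea mentioned in the comment above (the transformers and classifier are illustrative choices, not from the original comment):

    # FeatureUnion concatenates the outputs of several transformers;
    # the combined features then feed one downstream classifier.
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.feature_selection import SelectKBest
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import FeatureUnion, Pipeline

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    features = FeatureUnion([
        ("pca", PCA(n_components=5)),   # projected components
        ("kbest", SelectKBest(k=5)),    # univariate feature selection
    ])

    model = Pipeline([
        ("features", features),
        ("clf", LogisticRegression(max_iter=1000)),
    ])
    model.fit(X, y)
    print(model.score(X, y))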
