How to use custom classifiers in ensemble classifiers in sklearn?

I read that the built-in ensemble methods in sklearn use decision trees as the base classifiers. Is it possible to use custom classifiers instead?
If you mean the random forest classes, then no, this is currently not possible. The option to allow other estimators was discussed on the scikit-learn mailing list last January, but I don't believe any actual code has come out of that discussion.
If you use sklearn.ensemble.AdaBoostClassifier, then the answer is yes: you can assign base_estimator yourself. See https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.AdaBoostClassifier.html
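A minimal sketch of this, using GaussianNB as the custom base learner (any estimator whose fit() accepts sample weights should work). The estimator is passed positionally here because the keyword was renamed from base_estimator to estimator in newer scikit-learn releases:

```python
# Sketch: AdaBoost over a custom base learner (GaussianNB instead of the
# default decision stump). The base estimator must support sample_weight
# in its fit() method for boosting to work.
from sklearn.datasets import load_iris
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Positional argument avoids the base_estimator/estimator keyword rename
# across scikit-learn versions.
clf = AdaBoostClassifier(GaussianNB(), n_estimators=10)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))
```

The same pattern works with any sufficiently "weak" classifier; estimators without sample-weight support will raise an error at fit time.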
I don't know if it helps, but you can very easily stack/combine custom classifiers using the Pipeline utilities: http://scikit-learn.org/stable/tutorial/statistical_inference/putting_together.html#pipelining
Pipelines are not ensemble methods. They only combine a single classifier with a series of preprocessing steps. –
Nudge
Yes, you are right. But what I meant is that Pipeline and FeatureUnion can be used together to combine homogeneous or heterogeneous models in a few lines of code. Ramp github.com/kvh/ramp uses this principle a lot, for example. –
Xylem
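A small sketch of the Pipeline-plus-FeatureUnion idea from the comment above: two heterogeneous feature extractors (PCA and univariate selection, chosen here purely as an illustration) run in parallel, their outputs are concatenated, and the result feeds a final classifier:

```python
# Sketch: FeatureUnion runs heterogeneous transformers in parallel and
# concatenates their outputs; Pipeline then chains the combined features
# into a final classifier.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import FeatureUnion, Pipeline

X, y = load_iris(return_X_y=True)

union = FeatureUnion([
    ("pca", PCA(n_components=2)),      # 2 principal components
    ("kbest", SelectKBest(k=2)),       # 2 best original features
])
model = Pipeline([
    ("features", union),               # yields 4 columns total
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, y)
print(model.score(X, y))
```

Note this combines feature-extraction steps rather than ensembling full classifiers; combining fitted models this way requires wrapping their predictions as transformers.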