Is there a way to put an l2-Penalty for the logistic regression model in statsmodel through a parameter or something else? I just found the l1-Penalty in the docs but nothing for the l2-Penalty.
The models in statsmodels.discrete, like Logit, Poisson and MNLogit, currently have only L1 penalization. However, elastic net for GLM and a few other models has recently been merged into statsmodels master.
GLM with a Binomial family and a binary response is the same model as discrete.Logit, although the implementation differs. See my answer for L2 penalization in "Is ridge binomial regression available in Python?"
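As a minimal sketch of both routes (the toy y and X below are made up for illustration, and the GLM call assumes a statsmodels version that already includes the merged elastic net code):

import numpy as np
import statsmodels.api as sm

# Toy data (made up for illustration): binary response, three regressors.
rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(100, 3)))
y = rng.integers(0, 2, size=100)

# discrete.Logit: only L1 penalization is available.
l1_fit = sm.Logit(y, X).fit_regularized(method='l1', alpha=1.0, disp=0)
print(l1_fit.params)

# GLM with a Binomial family is the same model as Logit; passing L1_wt=0
# reduces the elastic net penalty to a pure L2 (ridge) penalty.
l2_fit = sm.GLM(y, X, family=sm.families.Binomial()).fit_regularized(
    method='elastic_net', alpha=1.0, L1_wt=0.0)
print(l2_fit.params)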
What has not yet been merged into statsmodels is L2 penalization with a structured penalization matrix, as used, for example, as a roughness penalty in generalized additive models (GAM) and spline fitting.
If you look closely at the documentation for statsmodels.regression.linear_model.OLS.fit_regularized, you'll see that the current version of statsmodels allows for Elastic Net regularization, which is basically just a convex combination of the L1 and L2 penalties (though more robust implementations employ some post-processing to diminish undesired behaviors of the naive implementations; see "Elastic Net" on Wikipedia for details):

L(beta) = ||y - X*beta||^2 + alpha * (lambda_1 * ||beta||_1 + lambda_2 * ||beta||_2^2),  with lambda_1 + lambda_2 = 1
If you take a look at the parameters for fit_regularized in the documentation:
OLS.fit_regularized(method='elastic_net', alpha=0.0, L1_wt=1.0, start_params=None, profile_scale=False, refit=False, **kwargs)
you'll realize that L1_wt is just lambda_1 in the equation above. So to get the L2 penalty you're looking for, you just pass L1_wt=0 as an argument when you call the function. As an example:
model = sm.OLS(y, X)
results = model.fit_regularized(method='elastic_net', alpha=1.0, L1_wt=0.0)
print(results.params)  # summary() is not implemented for regularized results
should give you an L2-penalized regression predicting target y from input X.
Three final comments:
statsmodels currently only implements elastic_net as an option to the method argument, so that gives you L1, L2, and any linear combination of them, but nothing else (for OLS at least; see the sketch after this list);
L1-penalized regression = LASSO (least absolute shrinkage and selection operator);
L2-penalized regression = ridge regression, also known as the Tikhonov–Miller method, the Phillips–Twomey method, the constrained linear inversion method, and the method of linear regularization.
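As a quick illustration of the first comment, a minimal sketch (the toy y and X below are made up for illustration) that sweeps L1_wt between the two penalties:

import numpy as np
import statsmodels.api as sm

# Toy data (made up for illustration): linear signal plus noise.
rng = np.random.default_rng(0)
X = sm.add_constant(rng.normal(size=(50, 2)))
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(size=50)

# L1_wt=1 is pure LASSO, L1_wt=0 is pure ridge, anything in between is elastic net.
for l1_wt in (1.0, 0.5, 0.0):
    res = sm.OLS(y, X).fit_regularized(method='elastic_net', alpha=1.0, L1_wt=l1_wt)
    print(f"L1_wt={l1_wt}: {res.params.round(3)}")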
When I run

logistic_regression_model = sm.GLM(y, X, link=sm.genmod.families.links.logit)
results = logistic_regression_model.fit_regularized(alpha=1.)
results.summary()

I get an error on results.summary() saying that summary is not implemented. Why does my fit_regularized() not return a summary? – Laris