I have a logistic regression model with a defined set of parameters, including `warm_start=True`.
As usual, I call `LogisticRegression.fit(X_train, y_train)` and afterwards use the model to predict new outcomes.
Suppose I alter some parameter, say `C=100`, and call `.fit` again using the same training data.
Theoretically, I would expect the second `.fit` to take less computational time than a model with `warm_start=False`. Empirically, however, this is not actually true.
Please help me understand the concept of the `warm_start` parameter.
P.S.: I have also experimented with `SGDClassifier()`.
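To illustrate what the question is asking, here is a minimal sketch of the setup (the synthetic dataset and parameter values are my own assumptions, not from the original code). With `warm_start=True`, a second call to `.fit` starts the solver from the previously learned coefficients instead of reinitializing them; `n_iter_` shows how many iterations each fit took. Note that changing `C` moves the optimum, so the second fit is not guaranteed to need fewer iterations.

```python
# Hypothetical reproduction: warm_start with a changed C on the same data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Illustrative synthetic data (assumption; the original X_train/y_train are unknown).
X_train, y_train = make_classification(n_samples=2000, n_features=20, random_state=0)

# warm_start is honored by the lbfgs, newton-cg, sag, and saga solvers
# (liblinear ignores it).
clf = LogisticRegression(warm_start=True, solver="lbfgs", max_iter=1000)

clf.fit(X_train, y_train)
n_iter_first = int(clf.n_iter_[0])   # iterations used by the first fit

clf.set_params(C=100)                # change regularization strength
clf.fit(X_train, y_train)            # refit: solver starts from previous coef_
n_iter_second = int(clf.n_iter_[0])  # iterations used by the second fit

print(n_iter_first, n_iter_second)
```

Comparing `n_iter_` (or wall-clock time) across the two fits is one way to check whether the warm start actually helped for a given change in `C`.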