What is the difference between objective and feval in xgboost in R? I know this is something very fundamental, but I am unable to exactly define them or their purpose.
Also, what is a softmax objective when doing multi-class classification?
Objective
Objective in xgboost is the function which the learning algorithm will try to optimize. By definition, it must be able to produce 1st (gradient) and 2nd (hessian) derivatives w.r.t. the predictions at a given training round.
An example of a custom objective function (this one mirrors the demo shipped with xgboost):
# User-defined objective function: given the predictions and the training data,
# return the gradient and second-order gradient (hessian).
# This is the log-likelihood (logistic) loss.
logregobj <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  preds <- 1 / (1 + exp(-preds))   # apply the sigmoid to the raw margins
  grad <- preds - labels           # first derivative of the loss
  hess <- preds * (1 - preds)      # second derivative of the loss
  return(list(grad = grad, hess = hess))
}
This function is critical to training: no xgboost model can be trained without defining one (built-in or custom). Objective functions are used directly when splitting at each node in each tree.
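To show where the objective plugs in, here is a minimal sketch that trains on the agaricus dataset shipped with the xgboost package and passes logregobj via the obj argument of xgb.train (the hyperparameters are arbitrary, and newer xgboost releases may rename this argument):
library(xgboost)

# Toy binary-classification data that ships with the xgboost package.
data(agaricus.train, package = "xgboost")
dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

# Pass the custom objective via the obj argument.
bst <- xgb.train(
  params  = list(max_depth = 2, eta = 1),
  data    = dtrain,
  nrounds = 2,
  obj     = logregobj
)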
feval
feval in xgboost plays no role in directly optimizing or training your model. You don't even need one to train, and it has no effect on splitting. All it does is score your model AFTER it has trained. Here is an example of a custom feval:
# Custom evaluation function: given the predictions and the training data,
# return a metric name and its value.
evalerror <- function(preds, dtrain) {
  labels <- getinfo(dtrain, "label")
  err <- as.numeric(sum(labels != (preds > 0))) / length(labels)  # misclassification rate
  return(list(metric = "error", value = err))
}
Notice that it just returns a name (metric) and a score (value). Typically the feval and the objective could measure the same thing, but maybe the scoring mechanism you want is a little different, or doesn't have derivatives. For example, people train with the logloss objective but create an AUC feval to evaluate the model.
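As a sketch of that split between training and scoring (assuming dtrain and dtest are existing xgb.DMatrix objects with 0/1 labels), you might train with the built-in logistic objective but report AUC on a watchlist:
# Train on logloss, but report AUC on train and held-out sets during training.
bst <- xgb.train(
  params    = list(objective = "binary:logistic", eval_metric = "auc"),
  data      = dtrain,
  nrounds   = 10,
  watchlist = list(train = dtrain, test = dtest)
)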
Furthermore, you can use the feval to stop your model from training once it stops improving, and you can use multiple feval functions to score your model in different ways and observe them all.
You do not need a feval function to train a model, only to evaluate it and to help it stop training early.
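For instance, here is a sketch of early stopping driven by the custom evalerror above, again assuming dtrain and dtest already exist as xgb.DMatrix objects (argument names may differ in newer xgboost releases):
# Stop if the test-set error from evalerror hasn't improved in 10 rounds.
# maximize = FALSE because a lower error rate is better.
bst <- xgb.train(
  params                = list(objective = "binary:logistic"),
  data                  = dtrain,
  nrounds               = 500,
  watchlist             = list(test = dtest),
  feval                 = evalerror,
  maximize              = FALSE,
  early_stopping_rounds = 10
)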
Summary:
Objective is the main workhorse.
feval is a helper that allows xgboost to do some cool things.
softmax is an objective function that is commonly used in multi-class classification. It ensures that all your predictions sum to one, scaled using the exponential function.