LightGBM has a log_loss metric for binary or multiclass classification.
Does Random Forest also have log_loss as a loss function?
slowmonk
1 Answer
Yes, there are decision tree algorithms that use this criterion, e.g. the C4.5 algorithm, and it is also used in random forest classifiers. See, for example, the scikit-learn documentation for RandomForestClassifier:
criterion: string, optional (default=”gini”) The function to measure the quality of a split. Supported criteria are “gini” for the Gini impurity and “entropy” for the information gain. Note: this parameter is tree-specific.
Please note that usually it is referred to as cross-entropy or information gain rather than log loss.
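As a minimal sketch (not from the scikit-learn docs quoted above), the snippet below shows how the split criterion is selected via the criterion parameter, and how log loss can still be computed separately as an evaluation metric on the predicted probabilities; the toy dataset and parameter values are just illustrative assumptions.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import log_loss

# Toy data purely for illustration.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "entropy" selects the information-gain (cross-entropy) split criterion;
# newer scikit-learn releases (1.1+) also accept criterion="log_loss",
# which is equivalent to "entropy".
clf = RandomForestClassifier(criterion="entropy", random_state=0)
clf.fit(X_train, y_train)

# Log loss here is used as an evaluation metric on predicted probabilities,
# which is a separate notion from the per-split impurity criterion.
print(log_loss(y_test, clf.predict_proba(X_test)))
```

The key distinction is that the criterion governs how individual tree splits are scored during training, while log loss on the held-out predictions only evaluates the fitted ensemble.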
For some background, this question and the links provided in its answers are worth a read (although they mostly refer to single decision trees rather than forests): When should I use Gini Impurity as opposed to Information Gain?
Jonathan