Is there a difference between feature effect (e.g., SHAP effect) and feature importance in machine learning terminology?
2 Answers
In A Unified Approach to Interpreting Model Predictions, the authors define SHAP values "as a unified measure of feature importance". That is, SHAP values are one of many approaches to estimating feature importance.
The Interpretable Machine Learning e-book (Christoph Molnar) provides a good explanation, too:
The goal of SHAP is to explain the prediction of an instance x by computing the contribution of each feature to the prediction. [...] SHAP feature importance is an alternative to permutation feature importance. There is a big difference between both importance measures: Permutation feature importance is based on the decrease in model performance. SHAP is based on magnitude of feature attributions.
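To make that concrete, here is a minimal sketch (assuming the `shap` package and scikit-learn are installed; the synthetic data and random-forest model are just placeholders) showing that SHAP produces one attribution per feature per prediction, and that "SHAP feature importance" is obtained by averaging the magnitudes of those attributions:

```python
import numpy as np
import shap
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Placeholder data and model
X, y = make_regression(n_samples=500, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# SHAP values: one attribution per feature, per prediction
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)        # shape: (n_samples, n_features)

# SHAP feature importance: magnitude of attributions, averaged over instances
shap_importance = np.abs(shap_values).mean(axis=0)
print(shap_importance)
```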
SHAP values estimate the impact of a feature on individual predictions, whereas feature importances estimate the impact of a feature on model fit.
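For contrast, a sketch of permutation feature importance using scikit-learn's `permutation_importance` (the data and model below are placeholders): each feature's importance is the decrease in the model's score when that feature's values are shuffled, i.e. its impact on model fit rather than on individual predictions:

```python
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

# Placeholder data and model
X, y = make_regression(n_samples=500, n_features=5, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X, y)

# Permutation importance: drop in model score (R^2 here) when a feature is shuffled
perm = permutation_importance(model, X, y, n_repeats=10, random_state=0)
print(perm.importances_mean)   # one value per feature: impact on model fit
```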