Answer:
Both "Feature Importances" and "SHAP Averages" are measures of how each of the selected features impact predictions. On the surface, they appear to be saying the exact same thing, namely “how important is each of the features to the model and predictions”. Yet they do not always list the features in the same order of importance. This is because they’re actually not the same thing.
- Feature Importances are a measure of how important each feature is to the model itself.
- SHAP Averages, on the other hand, are a measure of how important each feature is to the predictions made in the particular run you're looking at. SHAP Averages are an average of the absolute SHAP values across the rows scored in that run, and are therefore not directional.
If you persist a model and then run it on a different population, the Feature Importances will remain static while the SHAP Averages can change. Because SHAP Averages are not directional, a feature that strongly pushes predictions toward either outcome (for example, toward terminating or toward remaining employed) will show as highly impactful either way. Directionality can be gleaned from the SHAP beeswarm chart, which is explained in this FAQ.
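To make the distinction concrete, here is a minimal sketch using the open-source scikit-learn and shap libraries. The dataset, model, and feature count are hypothetical stand-ins for illustration, not One AI's internal pipeline:

```python
import numpy as np
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier

# Hypothetical stand-in data and model.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = GradientBoostingClassifier(random_state=0).fit(X, y)

# Feature Importances: a fixed property of the trained model.
# They stay the same no matter which rows you score later.
print("Feature importances:", model.feature_importances_)

# SHAP Averages: the mean of the absolute per-row SHAP values,
# so they depend on the population scored in this particular run.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_rows, n_features)
print("SHAP averages:", np.abs(shap_values).mean(axis=0))

# Scoring a different population can shift the SHAP averages
# while the feature importances above do not move.
subset = X[y == 1]  # e.g., only one segment of the population
print("SHAP averages (subset):",
      np.abs(explainer.shap_values(subset)).mean(axis=0))

# Directionality lives in the beeswarm/summary plot, not the averages:
# shap.summary_plot(shap_values, X)
```

Note how taking the absolute value before averaging discards direction: a feature that pushes some predictions up and others down still gets a large SHAP Average.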
Both Feature Importances and SHAP Averages can be found in the Results Summary Report (One AI > Runs > Run Label > Results Summary).