
Feature Importance

Feature importance plots are a great way to quickly get a sense of how much a model relies on each feature to make its predictions.


A feature importance plot shows the global effect each feature has on the model. There are several techniques for calculating feature importance; our implementation is based on a game-theoretic method called Shapley values. Unlike many other methods, the Shapley method has a strong theoretical basis. In short, the features are treated as players that can form coalitions and play games, where the outcome of a game is the model's prediction. The importance of each feature is its average contribution across the different coalitions, compared to the average prediction over all instances. The exact calculation of Shapley-value feature importance is computationally expensive, so we use an approximation of this method.
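
To illustrate why the exact calculation is expensive, here is a minimal sketch (illustrative only, not our implementation) of the exact Shapley value for a single feature: its marginal contribution is averaged over every possible coalition of the remaining features, which grows exponentially with the number of features. The toy additive "model" and feature names are made up for the example.

```python
from itertools import combinations
from math import factorial

def exact_shapley(predict, features, target):
    """predict(coalition) -> model output using only the features in the coalition."""
    others = [f for f in features if f != target]
    n = len(features)
    phi = 0.0
    for size in range(len(others) + 1):
        for coalition in combinations(others, size):
            # Shapley weight for a coalition of this size
            weight = factorial(size) * factorial(n - size - 1) / factorial(n)
            # Marginal contribution of the target feature to this coalition
            phi += weight * (predict(set(coalition) | {target}) - predict(set(coalition)))
    return phi

# Toy model: the prediction is just the sum of the values of the features present
feature_values = {"age": 2.0, "income": 5.0, "tenure": 1.0}
predict = lambda coalition: sum(feature_values[f] for f in coalition)

print(exact_shapley(predict, list(feature_values), "income"))  # -> 5.0
```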


The feature importance plot shows the average absolute feature importance across all samples, with features ordered from most to least important, top to bottom.
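
The sketch below shows one way to reproduce this kind of summary with the open-source shap package (an assumption for illustration; our plot is produced internally and may differ in details): compute per-sample Shapley values, take the mean absolute value per feature, and sort in descending order.

```python
import numpy as np
import shap
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor

# Example data and model, used only for demonstration
X, y = load_diabetes(return_X_y=True, as_frame=True)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# TreeExplainer approximates Shapley values efficiently for tree-based models
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)  # shape: (n_samples, n_features)

# Global importance: mean absolute contribution per feature, sorted descending
importance = np.abs(shap_values).mean(axis=0)
order = np.argsort(importance)[::-1]
for name, value in zip(X.columns[order], importance[order]):
    print(f"{name:10s} {value:.3f}")
```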


To learn more about Shapley values, please refer to https://christophm.github.io/interpretable-ml-book/shapley.html

