You have set up a machine learning workspace where you have completed a number of ML experiments on your dataset.
You have selected the run showing the best performance, and you are now examining the test results in detail to understand how the features contribute to the model's predictions.
Which combination of explainers would you NOT select for interpreting global and local feature importance?
A. Permutation Feature Importance (PFI) for global; PFI for local
B. Mimic for global; Mimic for local
C. Tabular for global; Tabular for local
D. PFI for global; Tabular for local

Answer: A
Option A is CORRECT because the Permutation Feature Importance (PFI) explainer can only explain how strongly the features contribute to predictions at the dataset level; it does not support the evaluation of local importances.
Option B is incorrect because the Mimic Explainer can be used for interpreting both the global and local importance of features, so this option is valid.
Option C is incorrect because the Tabular Explainer can be used for interpreting both the global and local importance of features, so this option is valid.
Option D is incorrect because the PFI Explainer can be used for interpreting the global feature importance while the Tabular Explainer is a good choice for examining local importance, so this option is valid.
Explanation:
The question is asking about the combination of explainers that should NOT be used for interpreting global and local feature importance. Let's start by briefly defining what these explainers are:
Permutation Feature Importance (PFI): This explainer evaluates the importance of each feature in a model by randomly shuffling the values of that feature and measuring the impact on the model's performance. The larger the drop in performance, the more important the feature is considered to be.
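To make the mechanism concrete, here is a minimal sketch of the permutation idea using scikit-learn's permutation_importance function; the breast cancer dataset and random forest model are illustrative stand-ins for your own experiment, not part of the scenario above.

from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Illustrative data and model; substitute the artifacts from your own run.
X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure the drop in test accuracy;
# a larger drop means the model relies on that feature more overall.
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)
for name, score in sorted(zip(X.columns, result.importances_mean),
                          key=lambda pair: pair[1], reverse=True)[:5]:
    print(f"{name}: {score:.4f}")

Note that the output is one score per feature for the whole test set; nothing in this procedure ties an importance value to a single row, which is exactly why PFI is a global-only technique.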
Mimic: This model-agnostic explainer approximates a black-box model's behavior by training a more transparent surrogate model that is easier to interpret. Because the surrogate can be queried over the whole dataset as well as on individual instances, it supports both global and local explanations.
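As a hedged sketch of how this looks in code, assuming the interpret-community package that ships with azureml-interpret is installed; the iris model and the DecisionTreeExplainableModel surrogate are illustrative choices, not a prescribed setup.

from interpret.ext.blackbox import MimicExplainer
from interpret.ext.glassbox import DecisionTreeExplainableModel
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

# Illustrative black-box model to be explained.
data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

# Train an interpretable surrogate that mimics the black-box model.
explainer = MimicExplainer(model, data.data, DecisionTreeExplainableModel,
                           features=data.feature_names,
                           classes=list(data.target_names))

# Global importance comes from the surrogate over the whole dataset ...
print(explainer.explain_global(data.data).get_feature_importance_dict())

# ... and the same surrogate also yields per-instance (local) importances.
local = explainer.explain_local(data.data[:3])
print(local.get_ranked_local_names())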
Tabular: This model-agnostic, SHAP-based explainer provides both global and local explanations, computing dataset-level feature importance scores as well as per-instance feature importances.
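A similar sketch for the Tabular Explainer, under the same interpret-community assumption and with the same illustrative iris model:

from interpret.ext.blackbox import TabularExplainer
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

explainer = TabularExplainer(model, data.data,
                             features=data.feature_names,
                             classes=list(data.target_names))

# Dataset-level (global) feature importance ...
print(explainer.explain_global(data.data).get_feature_importance_dict())

# ... and per-prediction (local) feature importance.
local = explainer.explain_local(data.data[:3])
print(local.get_ranked_local_values())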
Now, let's examine each of the answer choices:
A. Permutation Feature Importance (PFI) for global; PFI for local
This combination uses PFI for both global and local explanations. PFI is a useful method for evaluating global feature importance, but because it shuffles a feature's values across the whole dataset and measures the aggregate drop in performance, it yields a single dataset-level score per feature and cannot attribute an individual prediction. The PFI Explainer therefore offers no local explanations at all, which makes this combination unsuitable.
B. Mimic for global; Mimic for local
This combination uses Mimic for both global and local explanations. This is a valid choice: the surrogate model trained by the Mimic Explainer approximates the black-box model's behavior and can be interrogated both across the whole dataset (global) and for individual predictions (local).
C. Tabular for global; Tabular for local
This combination uses Tabular for both global and local explanations. This is a reasonable choice as Tabular is a model-agnostic explainer that can provide both global and local explanations in a consistent manner.
D. Permutation Feature Importance (PFI) for global; Tabular for local
This combination uses PFI for global explanations and Tabular for local explanations. This is a reasonable choice: PFI is a good method for evaluating global feature importance, while the Tabular Explainer supplies the per-instance explanations that PFI cannot.
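For illustration, a hedged sketch of combination D under the same interpret-community assumption; note that the PFI Explainer's explain_global call needs the true labels, and that local importances come from the Tabular Explainer because PFI exposes no local method.

from interpret.ext.blackbox import PFIExplainer, TabularExplainer
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier

data = load_iris()
model = RandomForestClassifier(random_state=0).fit(data.data, data.target)

# Global importance via permutation (labels are required to score the drop).
pfi = PFIExplainer(model, features=data.feature_names)
global_explanation = pfi.explain_global(data.data, true_labels=data.target)
print(global_explanation.get_feature_importance_dict())

# Local importance via the SHAP-based Tabular Explainer.
tab = TabularExplainer(model, data.data, features=data.feature_names)
local_explanation = tab.explain_local(data.data[:3])
print(local_explanation.get_ranked_local_names())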
Therefore, the combination that should NOT be selected for interpreting global and local feature importance is A: the PFI Explainer cannot produce local explanations, so it cannot serve as the local half of the pairing.