LIME

ExpyBox.lime()
Create the dialog for LIME.

- Returns
  None
Method parameters
- Number of features
  Maximum number of features present in the explanation
- Number of samples
  Size of the neighborhood used to learn the surrogate linear model
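To illustrate what the neighborhood size controls, here is a minimal numpy sketch of the kind of perturbation LIME performs around an instance (the variable names and the Gaussian sampling scheme are illustrative assumptions, not ExpyBox internals):

```python
import numpy as np

rng = np.random.default_rng(0)

instance = np.array([1.0, 2.0, 3.0])  # the instance being explained
num_samples = 500                     # the "Number of samples" dialog value

# Perturb the instance with Gaussian noise to build the neighborhood
# on which the surrogate linear model is then fitted.
neighborhood = instance + rng.normal(size=(num_samples, instance.size))

print(neighborhood.shape)  # (500, 3)
```

A larger value gives the surrogate model more (weighted) samples to fit, at the cost of more calls to the black-box predict function.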
- Kernel width
  Kernel width for the exponential kernel. The actual value used is the
  entered value * sqrt(train_data.shape[1]).
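For example, the scaling works like this (a minimal sketch; the exponential-kernel formula shown is the one used by the LIME package, stated here as an assumption):

```python
import numpy as np

train_data = np.zeros((100, 25))  # illustrative training data with 25 features

entered_value = 0.75              # the "Kernel width" dialog value
kernel_width = entered_value * np.sqrt(train_data.shape[1])  # 0.75 * 5 = 3.75

# LIME's exponential kernel maps a distance d to a sample weight in (0, 1]:
def exponential_kernel(d, kernel_width):
    return np.sqrt(np.exp(-(d ** 2) / kernel_width ** 2))

print(kernel_width)                           # 3.75
print(exponential_kernel(0.0, kernel_width))  # 1.0 (zero distance, full weight)
```

Smaller widths concentrate the weight on samples very close to the explained instance; larger widths make the surrogate model more global.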
- Feature selection
  Feature-selection method for choosing the best features for the surrogate
  model. The options are:
  - forward_selection: iteratively add features to the model (costly when the
    number of features is high)
  - highest_weights: select the features with the highest product of absolute
    weight * original data point when learning with all the features
  - lasso_path: choose features based on the lasso regularization path
  - none: use all features, ignoring the Number of features option
  - auto: use forward_selection if Number of features <= 6, and
    highest_weights otherwise
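As an illustration of the highest_weights option, a minimal sketch of the selection rule paraphrased from the description above (the variable names and values are assumptions):

```python
import numpy as np

weights = np.array([0.1, -2.0, 0.5, 0.0, 1.2])     # surrogate weights, all features
data_point = np.array([1.0, 1.0, -3.0, 4.0, 1.0])  # the instance being explained
num_features = 2                                   # "Number of features" value

# Rank features by |weight * feature value| and keep the top num_features.
scores = np.abs(weights * data_point)
selected = np.argsort(scores)[::-1][:num_features]

print(sorted(selected.tolist()))  # indices of the two highest-scoring features
```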
- Discretize continuous
Whether to discretize all non-categorical features
- Discretizer
  Which discretizer to use when discretizing continuous features. Only
  relevant when Discretize continuous is enabled.
- Distance metric
  The distance metric used to calculate weights of perturbed instances.
  It is passed as the distance_metric argument to
  sklearn.metrics.pairwise_distances(); see that function's documentation
  for the available metric options.
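Putting the distance metric and the kernel together, the weighting step looks roughly like this (a numpy-only sketch; the norm computed below matches pairwise_distances with metric="euclidean", and the kernel formula is assumed from the LIME package):

```python
import numpy as np

instance = np.array([[0.0, 0.0]])      # the instance being explained
neighborhood = np.array([[0.0, 0.0],
                         [3.0, 4.0]])  # perturbed samples

# Equivalent of pairwise_distances(neighborhood, instance, metric="euclidean")
distances = np.linalg.norm(neighborhood - instance, axis=1)

kernel_width = 1.0
sample_weights = np.sqrt(np.exp(-(distances ** 2) / kernel_width ** 2))

print(distances.tolist())  # [0.0, 5.0]
print(sample_weights[0])   # 1.0 (an identical point gets full weight)
```

These sample weights are what the surrogate linear model is fitted with, so the choice of metric directly shapes which perturbed samples matter most.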
- Variable with model regressor
  To use a regressor other than the default Ridge regressor, specify a
  variable holding the regressor in the provided kernel_globals dictionary.
  The regressor must expose model_regressor.coef_ and accept sample_weight
  as a parameter to model_regressor.fit().
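A minimal sketch of a regressor that satisfies this interface, using a hand-rolled weighted least-squares fit (in practice any sklearn-style regressor with a coef_ attribute and a sample_weight-aware fit(), such as sklearn's Lasso, would also qualify):

```python
import numpy as np

class WeightedLeastSquares:
    """Minimal regressor exposing coef_ and accepting sample_weight in fit()."""

    def fit(self, X, y, sample_weight=None):
        if sample_weight is None:
            sample_weight = np.ones(len(y))
        W = np.diag(sample_weight)
        # Solve the normal equations (X^T W X) coef = X^T W y
        self.coef_ = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        return self

    def predict(self, X):
        return X @ self.coef_

# Usage: y = 2 * x, so the fitted coefficient should be 2.
X = np.array([[1.0], [2.0], [3.0]])
y = np.array([2.0, 4.0, 6.0])
model = WeightedLeastSquares().fit(X, y, sample_weight=np.array([1.0, 1.0, 1.0]))
print(model.coef_)
```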