Interpretable machine learning models (imodels) 🔍
Python package for concise, transparent, and accurate predictive modeling. All sklearn-compatible and easily understandable. Pull requests very welcome!
Github • Popular imodels • Custom imodels • Demo notebooks
Popular interpretable models
Implementations of different interpretable models, all compatible with scikit-learn. The models can be easily installed and used:
from imodels import BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier
from imodels import SLIMRegressor, RuleFitRegressor
model = BayesianRuleListClassifier() # initialize a model
model.fit(X_train, y_train) # fit model
preds = model.predict(X_test) # discrete predictions: shape is (n_test, 1)
preds_proba = model.predict_proba(X_test) # predicted probabilities: shape is (n_test, n_classes)
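The regressors imported above follow the same interface. A minimal sketch, using a small synthetic dataset purely for illustration:

import numpy as np
from imodels import RuleFitRegressor

rng = np.random.default_rng(0)                               # toy regression data, for illustration only
X = rng.normal(size=(200, 5))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

reg = RuleFitRegressor()                                     # initialize a rule-based regressor
reg.fit(X, y)                                                # fit, just like any sklearn estimator
y_pred = reg.predict(X)                                      # continuous predictions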
Install with pip install imodels
(see here for help). The package contains the following models:
Model | Reference | Description |
---|---|---|
Rulefit | 🗂️, 🔗, 📄 | Extracts rules from decision trees, then builds a sparse linear model with them |
Skope rules | 🗂️, 🔗 | Extracts rules from gradient-boosted trees, deduplicates them, then forms a linear combination of them based on their OOB precision |
Bayesian rule list | 🗂️, 🔗, 📄 | Learns a compact rule list by sampling rule lists (rather than using a greedy heuristic) |
Greedy rule list | 🗂️, 🔗 | Uses CART to learn a list (only a single path), rather than a decision tree |
Iterative random forest | 🗂️, 🔗, 📄 | (In progress) Repeatedly fits a random forest, giving features with high importance a higher chance of being selected |
Optimal classification tree | 🗂️, 🔗, 📄 | (In progress) Learns succinct trees using global optimization rather than greedy heuristics |
Sparse integer linear model | 🗂️, 📄 | Forces coefficients to be integers |
Rule sets | | (Coming soon) Many popular rule sets, including SLIPPER, Lightweight Rule Induction, and MLRules |
Docs 🗂️, Reference code implementation 🔗, Research paper 📄
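Because these estimators expose the standard scikit-learn fit/predict interface, they drop into ordinary sklearn workflows. A minimal sketch (the dataset and the two models chosen here are only illustrative):

from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from imodels import GreedyRuleListClassifier, SkopeRulesClassifier

X, y = load_breast_cancer(return_X_y=True)                        # any tabular classification data works
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
for model in [GreedyRuleListClassifier(), SkopeRulesClassifier()]:
    model.fit(X_train, y_train)                                    # same fit/predict interface as above
    acc = accuracy_score(y_test, model.predict(X_test))            # held-out accuracy
    print(type(model).__name__, round(acc, 3))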
Custom interpretable models
The code here contains many useful and readable functions for a variety of rule-based models, located in the util folder. This includes functions and simple classes for rule deduplication, rule screening, and converting between trees, rule sets, and pytorch neural nets (a toy sketch of the rule-deduplication idea follows the table below). The final derived rules make it easy to extend any of the following general classes of models:
Rule set | Rule list | (Decision) Rule tree | Algebraic models |
---|---|---|---|
(diagram) | (diagram) | (diagram) | (diagram) |
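As an example of the kind of utility these helpers provide, here is a toy sketch of rule deduplication. The function below is hypothetical and is not part of the imodels util API; a rule is represented as a tuple of (feature, operator, threshold) conditions:

# NOTE: illustrative only -- deduplicate_rules is a hypothetical helper, not imodels.util code
def deduplicate_rules(rules):
    """Drop duplicate rules, treating each rule as an unordered set of conditions."""
    seen, unique = set(), []
    for conditions in rules:
        key = frozenset(conditions)              # ignore the order of conditions within a rule
        if key not in seen:
            seen.add(key)
            unique.append(conditions)
    return unique

rules = [
    (("age", ">", 60.0), ("bmi", "<=", 25.0)),
    (("bmi", "<=", 25.0), ("age", ">", 60.0)),   # same rule, different condition order
]
print(deduplicate_rules(rules))                  # keeps only the first of the two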
Demo notebooks
Demos are contained in the notebooks folder.
- model_based.ipynb - demos the imodels package. It shows how to fit, predict, and visualize with different interpretable models
- this notebook shows an example of using imodels for deriving a clinical decision rule
- we also include some demos of posthoc analysis, which occurs after fitting models:
- posthoc.ipynb - shows different simple analyses to interpret a trained model
- uncertainty.ipynb - basic code to get uncertainty estimates for a model
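As a taste of the posthoc analyses referenced above, the sketch below computes permutation importances for a fitted model using scikit-learn; it illustrates the general idea rather than reproducing the notebook:

from sklearn.datasets import load_breast_cancer
from sklearn.inspection import permutation_importance
from imodels import GreedyRuleListClassifier

X, y = load_breast_cancer(return_X_y=True)          # illustrative dataset
model = GreedyRuleListClassifier()
model.fit(X, y)
# permutation importance: how much does shuffling each feature hurt accuracy?
result = permutation_importance(model, X, y, scoring="accuracy", n_repeats=5, random_state=0)
print(result.importances_mean)                       # one mean importance per feature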
References
- Readings
- Reference implementations (also linked above): the code here heavily derives from the wonderful work of previous projects. We seek to extract, unify, and maintain key parts of these projects.
- sklearn-expertsys - by @tmadl and @kenben based on original code by Ben Letham
- rulefit - by @christophM
- skope-rules - by the skope-rules team (including @ngoix, @floriangardin, @datajms, Bibi Ndiaye, Ronan Gautier)
- Compatible packages
- Related packages
For updates, star the repo, see this related repo, or follow @csinva_. Please make sure to give authors of original methods / base implementations appropriate credit!
Source code
"""
.. include:: ../readme.md
"""
# Python `imodels` package for interpretable models compatible with scikit-learn.
# Github repo available [here](https://github.com/csinva/interpretability-implementations-demos).
from .rule_list.bayesian_rule_list.bayesian_rule_list import BayesianRuleListClassifier
from .rule_list.greedy_rule_list import GreedyRuleListClassifier
from .rule_set.rule_fit import RuleFitRegressor
from .rule_set.skope_rules import SkopeRulesClassifier
# from .tree.iterative_random_forest.iterative_random_forest import IRFClassifier
# from .tree.optimal_classification_tree import OptimalTreeModel
from .algebraic.slim import SLIMRegressor
# collections of the sklearn-compatible estimators exported by the package
CLASSIFIERS = BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier  # , IRFClassifier
REGRESSORS = RuleFitRegressor, SLIMRegressor
Sub-modules
- algebraic
- rule_list
- rule_set
- tree
- util - Shared utilities for implementing different interpretable models.