Python package for concise, transparent, and accurate predictive modeling. All sklearn-compatible and easily customizable.
Implementations of popular interpretable models that can be easily installed and used:
```python
from imodels import BayesianRuleListClassifier, GreedyRuleListClassifier, SkopeRulesClassifier
from imodels import SLIMRegressor, RuleFitRegressor

model = BayesianRuleListClassifier()          # initialize a model
model.fit(X_train, y_train)                   # fit model
preds = model.predict(X_test)                 # discrete predictions: shape is (n_test, 1)
preds_proba = model.predict_proba(X_test)     # predicted probabilities: shape is (n_test, n_classes)
```
Install with `pip install imodels` (see here for help). Contains the following models:
| Model | Reference | Description |
| :--- | :--- | :--- |
| Rulefit rule set | 🗂️, 🔗, 📄 | Extracts rules from a decision tree, then builds a sparse linear model with them |
| Skope rule set | 🗂️, 🔗 | Extracts rules from gradient-boosted trees, deduplicates them, then forms a linear combination of them based on their OOB precision |
| Boosted rule set | 🗂️, 🔗, 📄 | Uses Adaboost to sequentially learn a set of rules |
| Bayesian rule list | 🗂️, 🔗, 📄 | Learns a compact rule list by sampling rule lists (rather than using a greedy heuristic) |
| Greedy rule list | 🗂️, 🔗 | Uses CART to learn a list (only a single path), rather than a decision tree |
| OneR rule list | 🗂️, 📄 | Learns a rule list restricted to only one feature |
| Optimal rule tree | 🗂️, 🔗, 📄 | (In progress) Learns succinct trees using global optimization rather than greedy heuristics |
| Iterative random forest | 🗂️, 🔗, 📄 | (In progress) Repeatedly fits a random forest, giving features with high importance a higher chance of being selected |
| Sparse integer linear model | 🗂️, 📄 | Forces coefficients to be integers |
| Rule sets | ⌛ | (Coming soon) Many popular rule sets, including SLIPPER, Lightweight Rule Induction, and MLRules |
Docs 🗂️, Reference code implementation 🔗, Research paper 📄

More models coming soon!
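Because these models follow the sklearn fit/predict interface, they can be swapped in and evaluated interchangeably. A minimal sketch, assuming a built-in sklearn dataset and default hyperparameters (which may not be the best settings for every model):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from imodels import GreedyRuleListClassifier, SkopeRulesClassifier

X, y = load_breast_cancer(return_X_y=True)  # standard binary-classification dataset

# compare two rule-based models with 3-fold cross-validation
for model in [GreedyRuleListClassifier(), SkopeRulesClassifier()]:
    scores = cross_val_score(model, X, y, cv=3)
    print(type(model).__name__, scores.mean())
```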
Each of the above models ultimately takes one of the following forms, which aim to be simultaneously simple to understand and highly predictive:
Rule set · Rule list · Rule tree · Algebraic models
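For intuition, a learned rule list reads like a sequence of if/elif/else checks over feature thresholds, evaluated in order. A hypothetical example (the feature names and thresholds below are made up purely for illustration):

```python
# hypothetical learned rule list for a binary outcome (illustrative only)
def predict_risk(age, blood_pressure, bmi):
    if age > 60 and blood_pressure > 140:   # first rule checked
        return 1
    elif bmi > 35:                          # checked only if the first rule fails
        return 1
    else:                                   # default prediction
        return 0
```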
Different models and algorithms vary not only in their final form but also in the choices made during modeling. In particular, many models differ in the three steps listed below.
See the docs for individual models for further descriptions.
1. Rule candidate generation
2. Rule selection
3. Rule pruning / combination
The code here contains many useful and customizable functions for rule-based learning in the `util` folder, including functions and classes for rule deduplication, rule screening, and converting between trees, rule sets, and neural networks.
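As an illustration of the kind of utility involved, a rule-deduplication step might normalize each rule's conditions and drop exact duplicates. The sketch below is a standalone example, not the actual API of the `util` folder (the `Rule` representation and `deduplicate_rules` helper are hypothetical):

```python
# illustrative sketch of rule deduplication -- not imodels' actual util API
from typing import FrozenSet, List, Tuple

# a rule is represented as a set of (feature, operator, threshold) conditions
Rule = FrozenSet[Tuple[str, str, float]]

def deduplicate_rules(rules: List[Rule]) -> List[Rule]:
    """Drop rules whose conditions are identical, keeping the first occurrence."""
    seen = set()
    unique = []
    for rule in rules:
        if rule not in seen:
            seen.add(rule)
            unique.append(rule)
    return unique

rules = [
    frozenset({("age", ">", 60.0)}),
    frozenset({("age", ">", 60.0)}),                          # exact duplicate, removed
    frozenset({("age", ">", 60.0), ("bmi", "<=", 35.0)}),
]
print(len(deduplicate_rules(rules)))  # 2
```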
Demos are contained in the `notebooks` folder. One demo shows how to use imodels for deriving a clinical decision rule.
Different models support different machine-learning tasks. Current support is given below:
| Model | Binary classification | Multi-class classification | Regression |
| :--- | :---: | :---: | :---: |
| Rulefit rule set | ✔️ | | ✔️ |
| Skope rule set | ✔️ | | |
| Boosted rule set | ✔️ | | |
| Bayesian rule list | ✔️ | | |
| Greedy rule list | ✔️ | | |
| OneR rule list | ✔️ | | |
| Optimal rule tree | | | |
| Iterative random forest | | | |
| Sparse integer linear model | | | ✔️ |
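For the models with regression support, usage mirrors the classification snippet above. A minimal sketch with `RuleFitRegressor`, assuming the standard sklearn fit/predict interface and a built-in sklearn dataset:

```python
from sklearn.datasets import load_diabetes
from sklearn.model_selection import train_test_split
from imodels import RuleFitRegressor

X, y = load_diabetes(return_X_y=True)  # standard regression dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RuleFitRegressor()        # default hyperparameters
model.fit(X_train, y_train)       # extract rules, then fit the sparse linear model
print(model.predict(X_test)[:5])  # continuous predictions
```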