Alvin Wan, Lisa Dunlap*, Daniel Ho*, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
Our models, termed Neural-Backed Decision Trees, are as accurate as state-of-the-art neural networks on image classification and as interpretable as decision trees.
Provide our classification model with an image of your choice, or pick one of our suggested images. Unlike a run-of-the-mill neural network, our NBDT returns sequential decisions leading up to a prediction.
[Example decision path: Animal (97.2% probability) → Chordate (97.2% probability) → Carnivore (97.2% probability) → … (97.2% probability)]
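You can reproduce this behavior in Python. The sketch below follows the project README as we understand it: a pretrained WideResNet backbone is wrapped in SoftNBDT, and forward_with_decisions (which we assume returns both the logits and the per-node decisions) exposes the root-to-leaf path. The image path, preprocessing values, and exact keyword arguments are illustrative assumptions, not a definitive usage guide.

import torch
from PIL import Image
from torchvision import transforms

from nbdt.model import SoftNBDT           # wraps a standard network with decision rules
from nbdt.models import wrn28_10_cifar10  # WideResNet backbone for CIFAR10

# Wrap a pretrained backbone so its prediction is made by a sequence
# of decisions over an induced hierarchy instead of a single linear layer.
model = SoftNBDT(
    pretrained=True,
    dataset='CIFAR10',
    arch='wrn28_10_cifar10',
    model=wrn28_10_cifar10(),
).eval()

# Standard CIFAR10 preprocessing (illustrative values).
transform = transforms.Compose([
    transforms.Resize(32),
    transforms.CenterCrop(32),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
])

image = Image.open('cat.png').convert('RGB')  # any image of your choosing
x = transform(image)[None]                    # add a batch dimension

with torch.no_grad():
    # Assumed helper: returns logits plus the intermediate node decisions.
    outputs, decisions = model.forward_with_decisions(x)

print(outputs.argmax(dim=1))  # final class index
print(decisions)              # sequential decisions, root to leaf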
Machine learning applications such as finance and medicine demand accurate and justifiable predictions, barring most deep learning methods from use. In response, previous work combines decision trees with deep learning, creating a dilemma: the resulting models either sacrifice interpretability for accuracy or sacrifice accuracy for interpretability.
We forgo this dilemma by proposing Neural-Backed Decision Trees (NBDTs), modified hierarchical classifiers that use trees constructed in weight space. Our NBDTs achieve (1) interpretability and (2) neural network accuracy: we preserve interpretable properties -- e.g., leaf purity and a non-ensembled model -- and demonstrate interpretability of model predictions both qualitatively and quantitatively. Furthermore, NBDTs match state-of-the-art neural networks on CIFAR10, CIFAR100, TinyImageNet, and ImageNet to within 1-2%. This yields state-of-the-art interpretable models on ImageNet, with NBDTs besting all decision-tree-based methods by ~14% to attain 75.30% top-1 accuracy. Code and pretrained NBDTs are on GitHub.
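To make "trees constructed in weight space" concrete, here is a toy, self-contained sketch of the idea (not the library's implementation): each inner node's representative vector is the mean of its leaves' rows in the network's final linear layer, children are scored by inner product and normalized with a softmax, and a class's probability is the product of the child probabilities along its root-to-leaf path. The four-class hierarchy and random features below are made up for illustration.

import torch
import torch.nn.functional as F

# Toy hierarchy over 4 classes: root -> {animal, vehicle}, animal -> {cat, dog}, vehicle -> {car, truck}.
# Leaves are class indices; inner nodes map to their children.
HIERARCHY = {
    'root':    ['animal', 'vehicle'],
    'animal':  [0, 1],   # cat, dog
    'vehicle': [2, 3],   # car, truck
}

def leaves(node):
    """Leaf class indices under a node."""
    if isinstance(node, int):
        return [node]
    return [leaf for child in HIERARCHY[node] for leaf in leaves(child)]

def node_vector(node, W):
    """Representative vector: mean of the node's leaves' rows in the final linear layer W."""
    return W[leaves(node)].mean(dim=0)

def soft_leaf_probs(x, W, node='root', path_prob=1.0, out=None):
    """Soft inference: multiply softmax-normalized child probabilities along each root-to-leaf path."""
    if out is None:
        out = torch.zeros(W.shape[0])
    children = HIERARCHY[node]
    scores = torch.stack([x @ node_vector(child, W) for child in children])
    probs = F.softmax(scores, dim=0)
    for child, p in zip(children, probs):
        if isinstance(child, int):
            out[child] = path_prob * p       # reached a leaf (class)
        else:
            soft_leaf_probs(x, W, child, path_prob * p, out)
    return out

torch.manual_seed(0)
x = torch.randn(16)     # featurized image (penultimate-layer output), toy values
W = torch.randn(4, 16)  # final linear layer weights, one row per class
probs = soft_leaf_probs(x, W)
print(probs, probs.sum())   # class probabilities; sums to 1
print(probs.argmax())       # predicted class

Hard inference is the same traversal, but takes the argmax child at each node instead of keeping all paths.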
Our work culminates in three key contributions that you can take away for future research:
Installation is just one line.
pip install nbdt
Run on any image of your choosing.
nbdt https://bit.ly/3eiuCId
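To integrate NBDTs into your own PyTorch training loop, the README (as we recall it; treat the exact keyword arguments as assumptions) suggests two one-line changes: wrap your loss with a tree supervision loss during training, then wrap the trained model for hierarchical inference. A minimal sketch:

import torch.nn as nn
from torchvision.models import resnet18

# Assumed API, based on the project README: SoftTreeSupLoss adds the tree
# supervision term on top of a standard loss; SoftNBDT wraps the trained
# network so inference traverses the induced hierarchy.
from nbdt.loss import SoftTreeSupLoss
from nbdt.model import SoftNBDT

model = resnet18(num_classes=10)   # your existing network
criterion = nn.CrossEntropyLoss()  # your existing loss

# Training: add tree supervision to the original criterion.
criterion = SoftTreeSupLoss(dataset='CIFAR10', criterion=criterion)

# ... train `model` as usual, calling criterion(outputs, targets) ...

# Inference: replace the final linear layer's role with sequential decisions.
model = SoftNBDT(dataset='CIFAR10', model=model)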