Alvin Wan, Lisa Dunlap*, Daniel Ho*, Jihan Yin, Scott Lee, Henry Jin, Suzanne Petryk, Sarah Adel Bargal, Joseph E. Gonzalez
Our models, termed Neural-Backed Decision Trees, improve both accuracy and interpretability of modern neural networks on image classification.
Provide our classification model with an image of your choice, or pick one of our suggested images. Unlike a run-of-the-mill neural network, our NBDT returns sequential decisions leading up to a prediction.
Example decision path from the demo: Animal (97.2%) → Chordate (97.2%) → Carnivore (97.2%) → …
Machine learning applications in fields such as finance and medicine demand accurate and justifiable predictions, barring most deep learning methods from use. In response, previous work combines decision trees with deep learning, yielding models that either sacrifice accuracy for interpretability or sacrifice interpretability for accuracy.
We forgo this dilemma by proposing Neural-Backed Decision Trees (NBDTs). NBDTs replace a neural network’s final linear layer with a differentiable sequence of decisions and a surrogate loss. This forces the model to learn high-level concepts and lessens reliance on highly-uncertain decisions, yielding both improved accuracy and interpretable, sequential decisions.
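To make this concrete, below is a minimal sketch of soft, tree-structured inference on top of the final layer's class weights: each inner node is represented by the average of its leaves' class weights, each decision is a softmax over a node's children, and a leaf's probability is the product of the decisions along its path. The two-level hierarchy, class names, and weight averaging here are illustrative assumptions for a toy four-class problem, not the hierarchy or exact rules an actual NBDT induces.

import torch
import torch.nn.functional as F

# Toy four-class example. The hierarchy and class names are assumptions:
#   root ── animal  ── {cat, dog}
#        └─ vehicle ── {car, truck}
num_classes, feat_dim = 4, 512
W = torch.randn(num_classes, feat_dim)  # rows = class weights of the final linear layer
leaves = {"cat": 0, "dog": 1, "car": 2, "truck": 3}

# Each inner node is represented by the mean of its leaves' class weights.
w_animal = W[[0, 1]].mean(0)
w_vehicle = W[[2, 3]].mean(0)

def soft_tree_probs(x):
    """Per-class probabilities as products of per-node decision probabilities."""
    # Root decision: animal vs. vehicle.
    p_root = F.softmax(torch.stack([w_animal @ x, w_vehicle @ x]), dim=0)
    # Decisions among each inner node's children (here, the leaves).
    p_animal = F.softmax(torch.stack([W[0] @ x, W[1] @ x]), dim=0)
    p_vehicle = F.softmax(torch.stack([W[2] @ x, W[3] @ x]), dim=0)
    # A leaf's probability is the product of decisions from root to leaf.
    return torch.stack([
        p_root[0] * p_animal[0],   # cat
        p_root[0] * p_animal[1],   # dog
        p_root[1] * p_vehicle[0],  # car
        p_root[1] * p_vehicle[1],  # truck
    ])

x = torch.randn(feat_dim)  # feature vector produced by the network backbone
probs = soft_tree_probs(x)
print({name: round(probs[i].item(), 3) for name, i in leaves.items()})

Because every decision is a softmax, the full path from root to leaf stays differentiable, which is what allows the tree to be trained jointly with the backbone.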
Code and pretrained NBDTs are on GitHub.
Our work culminates in three key contributions that you can take away for future research:
Installation is just one line.
pip install nbdt
Run on any image of your choosing.
nbdt https://bit.ly/3eiuCId
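The same pretrained model can also be loaded from Python. The sketch below follows our reading of the repository README: the SoftNBDT wrapper, the wrn28_10_cifar10 constructor, and its argument names are assumptions to verify against the README, and the image path and CIFAR-10 preprocessing constants are placeholders.

import torch
from PIL import Image
from torchvision import transforms

# The imports below follow the repository README; treat them as assumptions
# and consult the README if the package layout has changed.
from nbdt.model import SoftNBDT
from nbdt.models import wrn28_10_cifar10

# Wrap a pretrained backbone with soft decision-tree inference.
model = SoftNBDT(
    pretrained=True,
    dataset="CIFAR10",
    arch="wrn28_10_cifar10",
    model=wrn28_10_cifar10(),
)
model.eval()

# Standard CIFAR-10 preprocessing (assumed; match whatever the README specifies).
preprocess = transforms.Compose([
    transforms.Resize(32),
    transforms.CenterCrop(32),
    transforms.ToTensor(),
    transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010)),
])

image = Image.open("cat.png").convert("RGB")  # hypothetical local image path
x = preprocess(image).unsqueeze(0)

with torch.no_grad():
    outputs = model(x)  # class probabilities induced by the decision tree
print(outputs.argmax(dim=1))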