Doctoral Thesis: Simple Linear Classifiers via Discrete Optimization: Learning Certifiably Optimal Scoring Systems for Decision-Making and Risk Assessment
Scoring systems are linear classification models that let users make quick predictions by adding, subtracting, and multiplying a few small numbers. These models are widely used in medicine and criminal justice because they are easy to understand and validate. In spite of extensive deployment, many scoring systems are built using ad hoc approaches that combine statistical techniques, heuristics, and expert judgement. Such approaches impose steep trade-offs with performance, making it difficult for practitioners to build scoring systems that will be used and accepted.
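To make the idea concrete, here is a hypothetical scoring system of the kind described above (the features, point values, and threshold are invented for illustration and do not come from the thesis): a user tallies a few small integer points and compares the total to a cutoff.

```python
# Hypothetical scoring system (illustration only, not a model from the thesis).
# Each risk factor contributes a small integer number of points; the user
# adds the points and compares the total to a fixed threshold.
def screen_patient(age_ge_60, hypertension, prior_stroke):
    score = 0
    score += 1 if age_ge_60 else 0      # +1 point if age >= 60
    score += 1 if hypertension else 0   # +1 point for hypertension
    score += 2 if prior_stroke else 0   # +2 points for a prior stroke
    return score >= 2                   # predict "high risk" at 2+ points

print(screen_patient(True, False, True))   # score = 1 + 0 + 2 = 3 -> True
print(screen_patient(False, True, False))  # score = 0 + 1 + 0 = 1 -> False
```

Because the arithmetic involves only a handful of small integers, a practitioner can apply and sanity-check such a model by hand, which is what makes scoring systems attractive in medicine and criminal justice.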
In this dissertation, we present two new machine learning methods to learn scoring systems from data:
1. SLIM (Supersparse Linear Integer Models) for decision-making applications;
2. RiskSLIM (Risk-calibrated Supersparse Linear Integer Models) for risk assessment applications.
Both SLIM and RiskSLIM solve hard discrete optimization problems to learn scoring systems that are fully optimized for feature selection, small integer coefficients, and operational constraints. We formulate these problems as integer programs and develop specialized algorithms to recover certifiably optimal solutions with an integer programming solver.
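The flavor of this discrete optimization problem can be sketched on a toy instance (this is a simplification, not the thesis's actual integer programming formulation): minimize the 0--1 classification loss plus a sparsity penalty over coefficients restricted to small integers. On a tiny dataset the certifiably optimal solution can be recovered by exhaustive search, whereas the thesis handles realistic problem sizes with an integer programming solver.

```python
from itertools import product

# Toy sketch of a SLIM-style objective (simplified; not the thesis's actual
# IP formulation): 0-1 loss plus an L0 penalty, with integer coefficients
# restricted to {-2, ..., 2}. The dataset below is invented for illustration.
X = [(1, 0, 1), (0, 1, 0), (1, 1, 1), (0, 0, 1)]  # binary feature vectors
y = [+1, -1, +1, -1]                               # labels in {-1, +1}

def objective(w, b, C0=0.1):
    # 0-1 loss: count examples the linear rule w.x + b misclassifies
    loss = sum(1 for xi, yi in zip(X, y)
               if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0)
    # L0 penalty: C0 per nonzero coefficient encourages feature selection
    return loss + C0 * sum(1 for wj in w if wj != 0)

# Exhaustive search over all small-integer models: certifiably optimal
# on this toy instance because every candidate is evaluated.
best = min(
    ((w, b) for w in product(range(-2, 3), repeat=3) for b in range(-2, 3)),
    key=lambda wb: objective(*wb),
)
print(best, objective(*best))  # a one-feature model with zero training error
```

Exhaustive enumeration is exponential in the number of features, which is why the thesis instead encodes the loss and penalty with integer programming and relies on a solver's optimality certificate.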
We show the benefits of this approach by building scoring systems for real-world problems such as recidivism prediction, sleep apnea screening, ICU seizure prediction, and adult ADHD diagnosis. The results show that a discrete optimization approach can learn simple models that perform on par with the state-of-the-art, yet are far easier to customize, understand, and validate.
Thesis Supervisor: Cynthia Rudin
Thesis Committee: Leslie Kaelbling, Stefanie Jegelka