TY - CONF
T1 - FastSHAP: Real-Time Shapley Value Estimation
T2 - 10th International Conference on Learning Representations, ICLR 2022
AU - Jethani, Neil
AU - Sudarshan, Mukund
AU - Covert, Ian
AU - Lee, Su-In
AU - Ranganath, Rajesh
N1 - Funding Information:
We thank the reviewers for their thoughtful feedback, and we thank the Lee Lab for helpful discussions. Neil Jethani was partially supported by NIH T32 GM136573. Mukund Sudarshan was partially supported by a PhRMA Foundation Predoctoral Fellowship. Mukund Sudarshan and Rajesh Ranganath were partly supported by NIH/NHLBI Award R01HL148248, and by NSF Award 1922658 NRT-HDR: FUTURE Foundations, Translation, and Responsibility for Data Science. Ian Covert and Su-In Lee were supported by the NSF Awards CAREER DBI-1552309 and DBI-1759487; the NIH Awards R35GM128638 and R01NIAAG061132; and the American Cancer Society Award 127332-RSG-15-097-01-TBG.
Publisher Copyright:
© 2022 ICLR 2022 - 10th International Conference on Learning Representations. All rights reserved.
PY - 2022
Y1 - 2022
N2 - Although Shapley values are theoretically appealing for explaining black-box models, they are costly to calculate and thus impractical in settings that involve large, high-dimensional models. To remedy this issue, we introduce FastSHAP, a new method for estimating Shapley values in a single forward pass using a learned explainer model. To enable efficient training without requiring ground truth Shapley values, we develop an approach to train FastSHAP via stochastic gradient descent using a weighted least squares objective function. In our experiments with tabular and image datasets, we compare FastSHAP to existing estimation approaches and find that it generates accurate explanations with an orders-of-magnitude speedup.
UR - http://www.scopus.com/inward/record.url?scp=85150391250&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85150391250&partnerID=8YFLogxK
M3 - Paper
AN - SCOPUS:85150391250
Y2 - 25 April 2022 through 29 April 2022
ER -