SUBPLEX: A Visual Analytics Approach to Understand Local Model Explanations At the Subpopulation Level

Jun Yuan, Gromit Yeuk Yin Chan, Brian Barr, Kyle Overton, Kim Rees, Luis Gustavo Nonato, Enrico Bertini, Claudio T. Silva

Research output: Contribution to journal › Article › peer-review

Abstract

Understanding the interpretation of machine learning (ML) models is of paramount importance when making decisions with societal impact, such as transport control, financial activities, and medical diagnosis. While local explanation techniques are popular methods for interpreting ML models on a single instance, they do not scale to understanding a model's behavior on a whole dataset. In this paper, we outline the challenges and needs of visually analyzing local explanations and propose SUBPLEX, a visual analytics approach that helps users understand local explanations through subpopulation-level visual analysis. SUBPLEX provides steerable clustering and projection visualization techniques that allow users to derive interpretable subpopulations of local explanations guided by their own expertise. We evaluate our approach through two use cases and expert feedback.

Original language: English (US)
Pages (from-to): 1-14
Number of pages: 14
Journal: IEEE Computer Graphics and Applications
State: Accepted/In press - 2022

Keywords

  • Analytical models
  • Computational modeling
  • Data models
  • Data visualization
  • Human in the loop
  • Task analysis
  • Visual analytics

ASJC Scopus subject areas

  • Software
  • Computer Graphics and Computer-Aided Design
