Empirical studies in information visualization: Seven scenarios

Heidi Lam, Enrico Bertini, Petra Isenberg, Catherine Plaisant, Sheelagh Carpendale

    Research output: Contribution to journal › Review article › peer-review

    Abstract

    We take a new, scenario-based look at evaluation in information visualization. Our seven scenarios (evaluating visual data analysis and reasoning, evaluating user performance, evaluating user experience, evaluating environments and work practices, evaluating communication through visualization, evaluating visualization algorithms, and evaluating collaborative data analysis) were derived through an extensive literature review of over 800 visualization publications. These scenarios distinguish different study goals and types of research questions, and are illustrated through example studies. Through this broad survey and the distillation of these scenarios, we make two contributions. First, we encapsulate the current practices of the information visualization research community; second, we provide a different approach to reaching decisions about what might be the most effective evaluation of a given information visualization. The scenarios can be used to choose appropriate research questions and goals, and the provided examples can be consulted for guidance on how to design one's own study.

    Original language: English (US)
    Article number: 6095544
    Pages (from-to): 1520-1536
    Number of pages: 17
    Journal: IEEE Transactions on Visualization and Computer Graphics
    Volume: 18
    Issue number: 9
    DOIs
    State: Published - 2012

    Keywords

    • Information visualization
    • evaluation

    ASJC Scopus subject areas

    • Software
    • Signal Processing
    • Computer Vision and Pattern Recognition
    • Computer Graphics and Computer-Aided Design
