User support systems: Lessons learned from implementing multiple interaction methods during testing

Will Altoff, Tyler Duke, Dylan Schouten, Casper Harteveld, Camillia Matuk, Gillian Smith, Steven C. Sutherland

Research output: Chapter in Book/Report/Conference proceeding › Chapter (peer-reviewed)

Abstract

To master the functions and tasks of a game, players must learn how to play the game. When conceptual learning outcomes are expected, additional skills are required to master those concepts. Methods such as the Wizard of Oz technique, which require users to interact with a computer support tool, have been used to improve the usability and learnability of products and interfaces; however, little attention has been given to how these approaches may support effective scaffolding in constructionist game design tools. In this study, students created research experiment games in StudyCrafter. We introduced a multiple-interaction technique in which students could request feedback by querying either the "system" or the instructor, and found that students typically initiated interactions with support tools to address technical issues and rarely asked for assistance with conceptual content. We suggest that this approach allows designers to better gauge how users interact with support, and we propose considerations for designing creativity support tools for educational content.

Original language: English (US)
Title of host publication: Proceedings of the Human Factors and Ergonomics Society Annual Meeting (HFES2020)
Pages: 2070-2074
Volume: 64
Edition: 1
DOIs
State: Published - 2020
