Evaluating the accessibility of crowdsourcing tasks on Amazon's Mechanical Turk

Rocío Calvo, Shaun K. Kane, Amy Hurst

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Crowd work web sites such as Amazon Mechanical Turk enable individuals to work from home, which may be useful for people with disabilities. However, the web sites for finding and performing crowd work tasks must be accessible if people with disabilities are to use them. We performed a heuristic analysis of one crowd work site, Amazon's Mechanical Turk, using the Web Content Accessibility Guidelines 2.0. This paper presents the accessibility problems identified in our analysis and offers suggestions for making crowd work platforms more accessible.

Original language: English (US)
Title of host publication: ASSETS14 - Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility
Publisher: Association for Computing Machinery
Pages: 257-258
Number of pages: 2
ISBN (Electronic): 9781450327206
DOIs
State: Published - Oct 20 2014
Event: 16th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2014 - Rochester, United States
Duration: Oct 20 2014 – Oct 22 2014

Publication series

Name: ASSETS14 - Proceedings of the 16th International ACM SIGACCESS Conference on Computers and Accessibility

Conference

Conference: 16th International ACM SIGACCESS Conference on Computers and Accessibility, ASSETS 2014
Country/Territory: United States
City: Rochester
Period: 10/20/14 – 10/22/14

Keywords

  • Accessibility
  • Crowdsourcing
  • Evaluation
  • Mechanical Turk

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Human-Computer Interaction
  • Software
  • Hardware and Architecture
