ThePlantGame: Actively training human annotators for domain-specific crowdsourcing

Maximilien Servajean, Alexis Joly, Dennis Shasha, Julien Champ, Esther Pacitti

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In a typical citizen science/crowdsourcing environment, contributors label items. When there are few labels, it is straightforward to train contributors and to judge the quality of their labels by giving a few examples with known answers. Neither is true when there are thousands of domain-specific labels and annotators with heterogeneous skills. This demo paper presents an Active User Training framework implemented as a serious game called ThePlantGame. It is based on a set of data-driven algorithms that (i) actively train annotators and (ii) evaluate the quality of contributors' answers on new test items in order to optimize predictions.
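The demo paper does not include code, but as a rough illustration of component (ii), here is a minimal Python sketch of accuracy-weighted label aggregation: each contributor's skill is first estimated on seed items with known labels, then used to weight that contributor's votes on new test items. All names (annotator_accuracies, aggregate_label, the sample data) are hypothetical, and this generic scheme is only a stand-in for ThePlantGame's actual data-driven algorithms.

```python
from collections import defaultdict


def annotator_accuracies(seed_answers, gold_labels):
    """Estimate each contributor's accuracy from seed items with known labels.

    seed_answers: {contributor: {item_id: label}}
    gold_labels:  {item_id: true_label}
    """
    accuracies = {}
    for contributor, labels in seed_answers.items():
        graded = [(item, lab) for item, lab in labels.items() if item in gold_labels]
        if graded:
            correct = sum(lab == gold_labels[item] for item, lab in graded)
            accuracies[contributor] = correct / len(graded)
        else:
            accuracies[contributor] = 0.5  # uninformed prior: no graded answers yet
    return accuracies


def aggregate_label(item_answers, accuracies):
    """Accuracy-weighted vote over the labels proposed for one test item."""
    scores = defaultdict(float)
    for contributor, label in item_answers.items():
        scores[label] += accuracies.get(contributor, 0.5)
    return max(scores, key=scores.get)


if __name__ == "__main__":
    gold = {"img1": "Quercus robur", "img2": "Acer campestre"}
    seed = {
        "alice": {"img1": "Quercus robur", "img2": "Acer campestre"},  # 2/2 correct
        "bob": {"img1": "Acer campestre", "img2": "Acer campestre"},   # 1/2 correct
    }
    acc = annotator_accuracies(seed, gold)
    # A new, unlabeled test item annotated by both contributors:
    print(aggregate_label({"alice": "Quercus robur", "bob": "Acer campestre"}, acc))
    # -> "Quercus robur": alice's vote (weight 1.0) outweighs bob's (weight 0.5).
```

The sketch deliberately omits component (i), the active part of the framework: adaptively choosing which training items to show each annotator based on their estimated skills.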

Original language: English (US)
Title of host publication: MM 2016 - Proceedings of the 2016 ACM Multimedia Conference
Publisher: Association for Computing Machinery, Inc
Pages: 720-721
Number of pages: 2
ISBN (Electronic): 9781450336031
DOIs
State: Published - Oct 1 2016
Event: 24th ACM Multimedia Conference, MM 2016 - Amsterdam, Netherlands
Duration: Oct 15 2016 – Oct 19 2016

Publication series

Name: MM 2016 - Proceedings of the 2016 ACM Multimedia Conference

Other

Other: 24th ACM Multimedia Conference, MM 2016
Country/Territory: Netherlands
City: Amsterdam
Period: 10/15/16 – 10/19/16

ASJC Scopus subject areas

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Software
