ABE: An agent-based software architecture for a multimodal emotion recognition framework

Javier Gonzalez-Sanchez, Maria Elena Chavez-Echeagaray, Robert Atkinson, Winslow Burleson

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

The ability of computers to recognize human emotional states from physiological signals is gaining popularity as a means of creating empathetic systems such as learning environments, health care systems, and videogames. Despite this, there are few frameworks, libraries, architectures, or software tools that allow system developers to easily integrate emotion recognition into their software projects. The work reported here offers a first step toward filling this gap by addressing: (a) the modeling of an agent-driven, component-based architecture for multimodal emotion recognition, called ABE, and (b) the use of ABE to implement a multimodal emotion recognition framework that supports third-party systems in becoming empathetic systems.
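To make the idea of an agent-driven, component-based recognition pipeline concrete, the sketch below shows one plausible shape such a design could take. This is a hypothetical illustration, not ABE's actual API: the class names (`SensorAgent`, `FusionAgent`), the emotion label set, and the stubbed classifier are all assumptions introduced here for demonstration; a real agent would run a trained model over raw physiological signals.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical emotion labels; the paper does not specify a label set.
EMOTIONS = ("engaged", "bored", "frustrated")

@dataclass
class Estimate:
    """One agent's probability distribution over emotional states."""
    modality: str
    probs: Dict[str, float]

class SensorAgent:
    """An agent wrapping a single physiological channel (e.g. EEG or
    skin conductance). The classification step is stubbed out here."""
    def __init__(self, modality: str):
        self.modality = modality

    def classify(self, signal: List[float]) -> Estimate:
        # Toy heuristic in place of a real classifier: a higher mean
        # signal level shifts mass toward "frustrated".
        level = min(max(sum(signal) / len(signal), 0.0), 1.0)
        probs = {
            "engaged": (1.0 - level) * 0.7,
            "bored": (1.0 - level) * 0.3,
            "frustrated": level,
        }
        return Estimate(self.modality, probs)

class FusionAgent:
    """Combines per-modality estimates by summing probabilities, one
    simple form of decision-level multimodal fusion."""
    def fuse(self, estimates: List[Estimate]) -> str:
        totals = {e: 0.0 for e in EMOTIONS}
        for est in estimates:
            for emotion, p in est.probs.items():
                totals[emotion] += p
        return max(totals, key=totals.get)

# One agent per modality; a third-party system would only consume the
# fused label, keeping emotion recognition behind a single interface.
agents = [SensorAgent("eeg"), SensorAgent("skin_conductance")]
readings = {"eeg": [0.9, 0.8, 0.95], "skin_conductance": [0.7, 0.85, 0.8]}
estimates = [a.classify(readings[a.modality]) for a in agents]
state = FusionAgent().fuse(estimates)
print(state)  # → frustrated
```

The design point this sketch illustrates is the separation of concerns the abstract implies: each modality lives in its own agent, and fusion is itself an agent, so a host application integrates emotion recognition without knowing about individual sensors.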

Original language: English (US)
Title of host publication: Proceedings - 9th Working IEEE/IFIP Conference on Software Architecture, WICSA 2011
Pages: 187-193
Number of pages: 7
State: Published - 2011
Event: 9th Working IEEE/IFIP Conference on Software Architecture, WICSA 2011 - Boulder, CO, United States
Duration: Jun 20, 2011 - Jun 24, 2011

Publication series

Name: Proceedings - 9th Working IEEE/IFIP Conference on Software Architecture, WICSA 2011


Keywords

  • Affective computing
  • Agent-based
  • Architecture
  • Emotion recognition
  • Empathetic systems
  • Framework
  • Multimodal

ASJC Scopus subject areas

  • Software
