Controlling aural and visual particle systems through human movement

Carlos Guedes, Kirk Woolford

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

This paper describes the methods used to construct an interactive installation in which human motion animates an aural and a visual particle system in sync. It outlines the rotoscoping, meta-motion processing, and visual particle system software, then explains in detail the audio software developed for the project.

Original language: English (US)
Title of host publication: Proceedings of the 4th Sound and Music Computing Conference, SMC 2007
Publisher: Sound and Music Computing Network
Pages: 200-203
Number of pages: 4
ISBN (Print): 9789606608759
State: Published - 2007
Event: 4th Sound and Music Computing Conference, SMC 2007 - Lefkada, Greece
Duration: Jul 11 2007 - Jul 13 2007

Publication series

Name: Proceedings of the 4th Sound and Music Computing Conference, SMC 2007

Other

Other: 4th Sound and Music Computing Conference, SMC 2007
Country/Territory: Greece
City: Lefkada
Period: 7/11/07 - 7/13/07

Keywords

  • Interactive installations
  • Motion capture
  • Movement analysis
  • Music and dance interaction

ASJC Scopus subject areas

  • General Computer Science
