Controlling aural and visual particle systems through human movement

Carlos Guedes, Kirk Woolford

Research output: Contribution to conference › Paper › peer-review

Abstract

This paper describes the methods used to construct an interactive installation in which human motion animates an aural and a visual particle system in sync. It outlines the rotoscoping, meta-motion processing, and visual particle system software, and then explains in detail the audio software developed for the project.
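The abstract only summarizes the pipeline, so as a rough illustration of the general idea rather than the authors' implementation, the following Python sketch shows one way a tracked motion velocity could drive a simple 2-D particle system. The `drive_particles` function, the `motion_vx`/`motion_vy` inputs, and all constants are hypothetical assumptions, not details from the paper.

```python
# Hypothetical sketch: driving a 2-D particle swarm from a tracked motion velocity.
# This is NOT the authors' software; the motion input and mapping are assumptions.
import random


class Particle:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.vx, self.vy = 0.0, 0.0

    def apply_force(self, fx, fy, dt):
        # Accelerate the particle by the supplied force (unit mass assumed).
        self.vx += fx * dt
        self.vy += fy * dt

    def step(self, dt, damping=0.98):
        # Integrate position and damp velocity so motion decays between frames.
        self.x += self.vx * dt
        self.y += self.vy * dt
        self.vx *= damping
        self.vy *= damping


def drive_particles(particles, motion_vx, motion_vy, dt=1 / 30, gain=5.0):
    """Nudge every particle by a (hypothetical) tracked body velocity.

    motion_vx / motion_vy could come from frame differencing or rotoscoped
    silhouette tracking; here they are simply numbers supplied by the caller.
    """
    for p in particles:
        # Per-particle jitter keeps the swarm from moving as a rigid block.
        jitter = random.uniform(0.8, 1.2)
        p.apply_force(gain * motion_vx * jitter, gain * motion_vy * jitter, dt)
        p.step(dt)


if __name__ == "__main__":
    swarm = [Particle(random.random(), random.random()) for _ in range(100)]
    # Pretend the tracked performer moved right and slightly up this frame.
    drive_particles(swarm, motion_vx=0.4, motion_vy=-0.1)
    print(swarm[0].x, swarm[0].y)
```

The same per-frame velocity could just as plausibly be mapped to audio parameters (e.g., grain density or amplitude) to keep the aural and visual systems synchronized, though the paper's actual mappings are not given in this record.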

Original language: English (US)
Pages: 180-183
Number of pages: 4
State: Published - 2007
Event: International Computer Music Conference, ICMC 2007 - Copenhagen, Denmark
Duration: Aug 27 2007 - Aug 31 2007

Other

Other: International Computer Music Conference, ICMC 2007
Country: Denmark
City: Copenhagen
Period: 8/27/07 - 8/31/07

ASJC Scopus subject areas

  • Media Technology
  • Computer Science Applications
  • Music

