How musical are images? From sound representation to image sonification: An eco systemic approach

Jean Baptiste Thiebaut, Juan Pablo Bello, Diemo Schwarz

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Although sound visualization and image sonification have been extensively used for scientific and artistic purposes, their combined effect is rarely considered. In this paper, we propose the use of an iterative visualization/sonification approach as a sound generation mechanism. In particular, we visualize sounds using a textural self-similarity representation, which is then analysed to generate control data for a granular synthesizer. Following an eco-systemic approach, the output of the synthesizer is then fed back into the system, thus generating a loop designed for the creation of novel time-evolving sounds. The entire process runs in real time, implemented in Max/MSP using the FTM library. A qualitative analysis of the approach is presented, complemented by a discussion of visualization and sonification issues in the context of sound design.
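The abstract describes a pipeline of self-similarity visualization, control-data extraction, and feedback into granular synthesis. The following minimal sketch illustrates one pass of such a loop; it is not the authors' Max/MSP + FTM patch, and the feature choice (magnitude-spectrum frames), the mapping to grain controls, and all function and parameter names are illustrative assumptions.

```python
# Illustrative sketch of one visualization/sonification pass, not the
# authors' implementation. Assumes a mono audio array as input.
import numpy as np

def self_similarity(audio, frame_len=1024, hop=512):
    """Frame the signal, take magnitude spectra, and build a cosine
    self-similarity matrix: the 'image' of the sound's texture."""
    n_frames = 1 + (len(audio) - frame_len) // hop
    window = np.hanning(frame_len)
    frames = np.stack([audio[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    spectra = np.abs(np.fft.rfft(frames, axis=1))
    unit = spectra / (np.linalg.norm(spectra, axis=1, keepdims=True) + 1e-12)
    return unit @ unit.T        # shape (n_frames, n_frames), values in [0, 1]

def grain_controls(ssm, n_grains=64):
    """Read the similarity image row by row and map it to hypothetical
    granular-synthesis controls (grain position and amplitude)."""
    rows = np.linspace(0, ssm.shape[0] - 1, n_grains).astype(int)
    positions = ssm[rows].argmax(axis=1) / ssm.shape[1]   # most-similar frame index
    amplitudes = ssm[rows].mean(axis=1)                   # average similarity as loudness
    return positions, amplitudes

# Eco-systemic loop (conceptually): synthesize grains from these controls,
# then feed the resulting audio back into self_similarity() so each new
# image reflects the previous output.
```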

Original language: English (US)
Title of host publication: International Computer Music Conference, ICMC 2007
Publisher: International Computer Music Association
Pages: 183-187
Number of pages: 5
State: Published - 2007
Event: International Computer Music Conference, ICMC 2007 - Copenhagen, Denmark
Duration: Aug 27 2007 - Aug 31 2007

Other

Other: International Computer Music Conference, ICMC 2007
Country/Territory: Denmark
City: Copenhagen
Period: 8/27/07 - 8/31/07

ASJC Scopus subject areas

  • Media Technology
  • Computer Science Applications
  • Music
