Title:
Data-to-music API: Real-time data-agnostic sonification with musical structure models
Author(s)
Tsuchiya, Takahiko
Freeman, Jason
Lerner, Lee W.
Abstract
In sonification methodologies that aim to represent the underlying data accurately, musical or artistic approaches are often dismissed as not transparent, likely to distort the data, not generalizable, or not reusable for different data types. Scientific applications of sonification have therefore been hesitant to use approaches guided by artistic aesthetics and musical expressivity. All sonifications, however, may have musical effects on listeners, as our ears, trained by daily exposure to music, tend to naturally distinguish musical and non-musical sound relationships, such as harmony, rhythmic stability, or timbral balance. This study proposes to take advantage of the musical effects of sonification in a systematic manner: data may be mapped to high-level musical parameters rather than one-to-one to low-level audio parameters. An approach to creating models that encapsulate modulatable musical structures is proposed in the context of the new Data-To-Music JavaScript API. The API provides an environment for the rapid development of data-agnostic sonification applications in a web browser, built on a model-based, modular musical structure system. The proposed model system is compared to existing sonification frameworks as well as to music theory and composition models. Issues regarding the distortion of the original data, transparency, and the reusability of musical models are also discussed.
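To make the abstract's central idea concrete, the sketch below illustrates, in TypeScript, what mapping data to high-level musical parameters through a modulatable musical structure model could look like. It is a minimal, hypothetical example only: the names RhythmHarmonyModel and sonifyFrame, and all of the mapping choices, are assumptions for illustration and are not part of the Data-To-Music API described in the paper.

```typescript
// Hypothetical sketch (not the actual Data-To-Music API): data values drive
// high-level musical parameters, and a "musical structure model" translates
// those into low-level note events.

type NoteEvent = { time: number; pitch: number; velocity: number };

// A minimal "musical structure model": exposes modulatable high-level
// parameters (rhythmic density, harmonic tension) and renders one bar
// of note events from them.
class RhythmHarmonyModel {
  density = 0.5; // 0..1 -> how many onsets per bar
  tension = 0.0; // 0..1 -> consonant triad vs. added dissonance

  render(barLength = 2.0): NoteEvent[] {
    const consonant = [0, 4, 7];     // major triad (semitones above root)
    const tense = [0, 4, 7, 10, 13]; // triad with dissonant extensions
    const scale = this.tension < 0.5 ? consonant : tense;
    const onsets = 1 + Math.round(this.density * 7); // 1..8 onsets per bar
    const events: NoteEvent[] = [];
    for (let i = 0; i < onsets; i++) {
      events.push({
        time: (i / onsets) * barLength,
        pitch: 60 + scale[i % scale.length], // around middle C
        velocity: 0.5 + 0.5 * this.tension,
      });
    }
    return events;
  }
}

// Map an arbitrary frame of data to the model's high-level parameters
// rather than one-to-one to audio parameters.
function sonifyFrame(model: RhythmHarmonyModel, data: number[]): NoteEvent[] {
  const min = Math.min(...data);
  const max = Math.max(...data);
  const mean = data.reduce((a, b) => a + b, 0) / data.length;
  const spread = max - min || 1;
  model.density = (mean - min) / spread;          // central tendency -> rhythmic density
  model.tension = Math.min(1, (max - min) / 10);  // variability -> harmonic tension
  return model.render();
}

// Example: one frame of incoming sensor values.
console.log(sonifyFrame(new RhythmHarmonyModel(), [3, 8, 5, 12, 7]));
```

Because the data only modulate the model's high-level parameters, the same incoming stream could drive a different structure model (for example, one governing timbral balance) without changing the data handling, which is the reusability argument the abstract makes.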
Date Issued
2015-07
Resource Type
Text
Resource Subtype
Proceedings
Rights Statement
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.