2016-04,
Tsuchiya, Takahiko,
Freeman, Jason,
Lerner, Lee W.
Creating interactive audio applications for web browsers
often involves challenges such as time synchronization
between non-audio and audio events within thread
constraints, and format-dependent mapping of data to
synthesis parameters. In this paper, we describe a unique
approach for these issues with a data-driven symbolic
music application programming interface (API) for rapid
and interactive development. We introduce the DataToMusic
(DTM) API, a data-sonification tool set for web browsers
that utilizes the Web Audio API as the primary means of
audio rendering. The paper demonstrates the possibility of
processing and sequencing audio events at the
audio-sample level by combining various features of the
Web Audio API, without relying on the
ScriptProcessorNode, which is currently under redesign.
We implemented an audio event system in the clock and
synthesizer classes in the DTM API, in addition to a
modular audio effect structure and a flexible
data-to-parameter mapping interface. For complex
real-time configuration and sequencing, we also present a
model system for creating reusable functions with a
data-agnostic interface and symbolic musical
transformations. Using these tools, we aim to create a
seamless connection between high-level (musical structure)
and low-level (sample rate) processing in the context of
real-time data sonification.
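
To illustrate the kind of sample-accurate event scheduling the abstract refers to, the sketch below (not code from the DTM API; the DataEvent shape and the frequency mapping are hypothetical) schedules data-driven synthesis events using only standard Web Audio API node scheduling and parameter automation, with no ScriptProcessorNode.

// Illustrative sketch: sample-accurate, data-driven event scheduling
// using standard Web Audio API features only (no ScriptProcessorNode).
interface DataEvent {
  timeOffset: number; // seconds relative to the start of the sequence
  value: number;      // normalized data value in [0, 1] (hypothetical input)
}

function sonify(ctx: AudioContext, events: DataEvent[]): void {
  const startTime = ctx.currentTime + 0.1; // small lookahead for safe scheduling

  for (const ev of events) {
    const when = startTime + ev.timeOffset;

    // Example data-to-parameter mapping: value -> pitch in 200-1000 Hz.
    const osc = ctx.createOscillator();
    osc.frequency.setValueAtTime(200 + ev.value * 800, when);

    // Short amplitude envelope via parameter automation, resolved in the
    // audio thread rather than by a per-sample JavaScript callback.
    const gain = ctx.createGain();
    gain.gain.setValueAtTime(0, when);
    gain.gain.linearRampToValueAtTime(0.5, when + 0.01);
    gain.gain.linearRampToValueAtTime(0, when + 0.2);

    osc.connect(gain).connect(ctx.destination);
    osc.start(when);
    osc.stop(when + 0.25);
  }
}

// Example usage: sonify three data points spaced 0.5 s apart.
// sonify(new AudioContext(), [
//   { timeOffset: 0.0, value: 0.2 },
//   { timeOffset: 0.5, value: 0.6 },
//   { timeOffset: 1.0, value: 0.9 },
// ]);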