Publication Search Results
Item: Directed Evolution in Live Coding Music Performance (Georgia Institute of Technology, 2020-10-24)
Authors: Dasari, Sandeep; Freeman, Jason
Affiliations: College of Design; School of Music
Abstract: Genetic algorithms are extensively used to understand, simulate, and create works of art and music. This paper takes a similar approach, applying basic evolutionary algorithms to performing music live with code. Often considered an improvisational or experimental practice, live coding music comes with its own set of challenges, and genetic algorithms offer the potential to address these long-standing problems. Traditional evolutionary applications in music have focused on novelty search to create new sounds, sequences of notes or chords, and effects. In contrast, this paper focuses on live performance, using evolution to create directed, evolving musical pieces. The paper also details key design decisions, the implementation, and the usage of a novel genetic algorithm API created for a popular live coding language.
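The abstract names neither the live coding language nor the API's functions, so the following is only a minimal sketch of the underlying idea: a performer steers ("directs") evolution by supplying a target melodic contour as the fitness reference. All identifiers here (Phrase, evolve, mutate, crossover, target) are hypothetical, not the paper's API.

```typescript
// Hypothetical sketch: directed evolution over note sequences.
// A phrase is an array of MIDI pitches; fitness rewards closeness
// to a performer-chosen target contour.

type Phrase = number[];

const randomPhrase = (len: number): Phrase =>
  Array.from({ length: len }, () => 48 + Math.floor(Math.random() * 24));

// Higher is better: negative distance to the target contour.
const fitness = (p: Phrase, target: Phrase): number =>
  -p.reduce((sum, note, i) => sum + Math.abs(note - target[i]), 0);

// Nudge some notes up or down by a small random interval.
const mutate = (p: Phrase, rate = 0.1): Phrase =>
  p.map(n => (Math.random() < rate ? n + Math.floor(Math.random() * 5) - 2 : n));

// Uniform crossover: each note comes from one of the two parents.
const crossover = (a: Phrase, b: Phrase): Phrase =>
  a.map((n, i) => (Math.random() < 0.5 ? n : b[i]));

// One generation: keep the fitter half, refill by breeding neighbors.
function evolve(pop: Phrase[], target: Phrase): Phrase[] {
  const ranked = [...pop].sort((a, b) => fitness(b, target) - fitness(a, target));
  const parents = ranked.slice(0, Math.ceil(pop.length / 2));
  const children = parents.map((p, i) =>
    mutate(crossover(p, parents[(i + 1) % parents.length])));
  return [...parents, ...children].slice(0, pop.length);
}

let population = Array.from({ length: 8 }, () => randomPhrase(8));
const target: Phrase = [60, 62, 64, 65, 67, 65, 64, 62]; // rise-and-fall contour
for (let gen = 0; gen < 20; gen++) population = evolve(population, target);
console.log('best phrase:', population[0]);
```

In a performance setting, the evolve step would presumably run once per musical cycle, with the fittest phrase handed to the live coding environment for playback while the performer edits the fitness target on the fly.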
Item: Composer, Performer, Listener (Georgia Institute of Technology, 2008-03-04)
Authors: Freeman, Jason
Affiliations: Library; Georgia Institute of Technology, Music Department
Abstract: Even as social networking, multi-player gaming, and collaborative content creation become increasingly important in our lives, concert music performance continues to follow a model in which the audience remains passive, with little connection to the composer, to the performers, or to each other. Freeman, an assistant professor in the Music Department, will explore how technology can transform the concert experience by inviting the audience to shape the music as it is performed, or by engaging audiences in personalized musical experiences online.
Item: Multi-Modal Web-Based Dashboards for Geo-Located Real-Time Monitoring (Georgia Institute of Technology, 2016-04)
Authors: Winters, R. Michael; Tsuchiya, Takahiko; Lerner, Lee W.; Freeman, Jason
Affiliations: School of Music; College of Design
Abstract: This paper describes ongoing research in the presentation of geo-located, real-time data using web-based audio and visualization technologies. Due to both the growing number of devices and the diversity of information being accumulated in real time, there is a need for cohesive techniques for rendering this information in a usable and functional way for a variety of audiences. We situate web-sonification (sonification of web-based information using web-based technologies) as a particularly valuable avenue for display. When combined with visualizations, it can increase engagement and allow users to profit from the additional affordances of human hearing. This theme is developed in the description of two multi-modal dashboards designed for data in the context of the Internet of Things (IoT) and Smart Cities. In both cases, Web Audio provided the back end for sonification, but a new API called DataToMusic (DTM) was used to make common sonification operations easier to implement. DTM provides a valuable framework for web-sonification, and we highlight its use in the two dashboards. Following our description of the implementations, the dashboards are compared and evaluated, contributing to general conclusions on the use of Web Audio for sonification and suggestions for future dashboards.
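The abstract does not spell out DTM's operations, so the following is a minimal sketch of the web-sonification idea using the standard Web Audio API directly: one geo-located reading is rendered as a short blip whose pitch and loudness track the data. The payload shape ({ lat, value }) and the mapping ranges are assumptions for illustration.

```typescript
// Hedged sketch: map an incoming geo-located reading to pitch and
// loudness with plain Web Audio calls (DTM's own API is not shown here).

interface Reading { lat: number; value: number } // hypothetical IoT payload

const ctx = new AudioContext();

function sonify({ lat, value }: Reading): void {
  const osc = ctx.createOscillator();
  const gain = ctx.createGain();

  // Map latitude (-90..90) to pitch (220..880 Hz) and the sensor value
  // (assumed normalized 0..1) to loudness.
  osc.frequency.value = 220 + ((lat + 90) / 180) * 660;
  gain.gain.value = 0.05 + 0.25 * value;

  osc.connect(gain).connect(ctx.destination);
  osc.start();
  osc.stop(ctx.currentTime + 0.2); // short blip per reading
}

// e.g. wired to a WebSocket feeding the dashboard:
// socket.onmessage = e => sonify(JSON.parse(e.data));
```

In a dashboard, a handler like this would typically be attached to the same WebSocket or polling feed that drives the visualization, so the sonic and visual displays stay synchronized.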
Item: Technology to Broaden Education (2015-08-28)
Authors: Freeman, Jason; Guzdial, Mark; Hoffman, Michael; Moon, Nathan
Affiliations: School of Literature, Media, and Communication; Ivan Allen College of Liberal Arts; Georgia Institute of Technology, College of Architecture; Georgia Institute of Technology, College of Computing
Item: Storage in Collaborative Networked Art (Georgia Institute of Technology, 2009)
Authors: Freeman, Jason
Affiliations: College of Design; School of Music
Abstract: This chapter outlines some of the challenges and opportunities associated with storage in networked art. Using comparative analyses of collaborative networked music as a starting point, this chapter explores how networked storage can transform the relationship between composition and improvisation; how it can influence network designs focused on shared material or shared control; how it can actively and autonomously manipulate its own contents; how it can circumvent problems of network latency and facilitate asynchronous collaboration; and how it can exist as a core component of a work’s design without being at the core of every user’s experience.
Item: Data-Driven Live Coding with DataToMusic API (Georgia Institute of Technology, 2016-04)
Authors: Tsuchiya, Takahiko; Freeman, Jason; Lerner, Lee W.
Affiliations: School of Music; College of Design
Abstract: Creating interactive audio applications for web browsers often involves challenges such as time synchronization between non-audio and audio events within thread constraints, and format-dependent mapping of data to synthesis parameters. In this paper, we describe a unique approach to these issues with a data-driven symbolic music application programming interface (API) for rapid and interactive development. We introduce the DataToMusic (DTM) API, a data-sonification tool set for web browsers that utilizes the Web Audio API as the primary means of audio rendering. The paper demonstrates the possibility of processing and sequencing audio events at the audio-sample level by combining various features of the Web Audio API, without relying on the ScriptProcessorNode, which is currently under a redesign. We implemented an audio event system in the clock and synthesizer classes of the DTM API, in addition to a modular audio effect structure and a flexible data-to-parameter mapping interface. For complex real-time configuration and sequencing, we also present a model system for creating reusable functions with a data-agnostic interface and symbolic musical transformations. Using these tools, we aim to create a seamless connection between high-level (musical structure) and low-level (sample rate) processing in the context of real-time data sonification.
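DTM's clock and synthesizer classes are not shown in the abstract, so the sketch below illustrates the general technique it alludes to: scheduling audio events precisely without ScriptProcessorNode by queuing them ahead of time on the audio clock from a coarse JavaScript timer. This is the widely used look-ahead scheduling pattern, not DTM's actual implementation.

```typescript
// Look-ahead scheduling: a coarse JS timer queues events slightly in
// advance, and each event is placed on the AudioContext's own clock.

const ctx = new AudioContext();
const LOOKAHEAD_MS = 25;      // how often the JS timer wakes up
const SCHEDULE_AHEAD = 0.1;   // how far ahead (s) to queue audio events
const INTERVAL = 0.25;        // seconds between notes
let nextNoteTime = ctx.currentTime;

function playNote(time: number): void {
  const osc = ctx.createOscillator();
  osc.frequency.value = 440;
  osc.connect(ctx.destination);
  osc.start(time);            // scheduled on the audio clock, not the JS clock
  osc.stop(time + 0.1);
}

function scheduler(): void {
  // Queue every note that falls inside the look-ahead window.
  while (nextNoteTime < ctx.currentTime + SCHEDULE_AHEAD) {
    playNote(nextNoteTime);
    nextNoteTime += INTERVAL;
  }
  setTimeout(scheduler, LOOKAHEAD_MS);
}
scheduler();
```

Because osc.start(time) places each event on the AudioContext clock, timing precision is that of the audio hardware; the JavaScript timer only needs to wake often enough to keep the queue filled.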