Title:
Analyzing and Learning Movement Through Human-Computer Co-Creative Improvisation and Data Visualization

Author(s)
Liu, Lucas
Advisor(s)
Magerko, Brian
Abstract
Recent years have seen a dramatic rise in the availability of household motion and video capture technologies, ranging from the humble webcam to the relatively sophisticated Kinect sensor. Naturally, this has led to an increase in both the quantity and quality of motion capture data available online. This wealth of data has sparked new interest in motion data classification, the task of having a model label and sort different clips of human motion. However, there is comparatively little work on motion data clustering, an unsupervised approach that may prove more useful in the future because it allows agents to recognize "categories" of motion without the need for user input or labeled data. Systems that cluster motion data ask "what type of motion is this, and what is it similar to?" rather than "which specific motion is this?" The LuminAI project, as described in this paper, is a practical application of motion data clustering: it allows the system to respond to a user's dance moves with a similar but distinct gesture. To analyze the efficacy and properties of this motion data clustering pipeline, we also propose a novel data visualization tool and discuss the design considerations involved in its development.
Date Issued
2020-12
Resource Type
Text
Resource Subtype
Undergraduate Thesis