Abstract:
This thesis presents a framework for querying a music database with rhythmic motion signals. Instead of the existing approach, which extracts a motion signal's underlying rhythm by marking salient frames, this thesis proposes a novel approach that converts the rhythmic motion signal to MIDI-format music and extracts its beat sequence as the rhythmic information of that motion. We extract "motion events" from the motion data based on characteristics such as changes in movement direction, the root's y-coordinate, and angular velocity. These events are converted to music notes in order to generate an audio representation of the motion. Both this motion-generated music and the existing audio library are analyzed by a beat tracking algorithm, and music retrieval is performed on the extracted beat sequences.
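As a minimal illustration of the motion-to-note conversion, the sketch below detects direction changes in a one-dimensional joint trajectory and treats each as a note onset. The trajectory, frame rate, and fixed pitch are hypothetical placeholders, not the thesis's actual feature set, which also includes the root's y-coordinate and angular velocity.

```python
import numpy as np

def motion_events_to_notes(y, fps=120, pitch=60):
    """Detect direction changes in a 1-D trajectory and map each
    to an (onset_seconds, midi_pitch) note.

    Hypothetical sketch: only one feature (directional change) is
    used here; the full framework combines several motion features.
    """
    dy = np.diff(y)                       # frame-to-frame velocity
    sign_flip = np.diff(np.sign(dy))      # nonzero where direction reverses
    event_frames = np.nonzero(sign_flip)[0] + 1
    return [(frame / fps, pitch) for frame in event_frames]

# Toy usage: a sinusoidal trajectory yields evenly spaced note onsets.
t = np.linspace(0, 4, 480)
notes = motion_events_to_notes(np.sin(2 * np.pi * t))
```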
We evaluate three approaches to retrieving music with motion queries: a mutual-information-based approach, a two-sample Kolmogorov-Smirnov (KS) test, and a rhythmic comparison algorithm. The feasibility of the framework is evaluated with pre-recorded music and motion recordings.
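As one example of the retrieval step, a two-sample KS test can compare the inter-beat-interval distributions of the motion-generated music and a library track. The sketch below uses scipy.stats.ks_2samp on hypothetical beat times; it is a plausible stand-in for, not the thesis's exact, scoring function.

```python
import numpy as np
from scipy.stats import ks_2samp

def rhythm_distance(motion_beats, music_beats):
    """Compare two beat sequences via their inter-beat intervals.

    A small KS statistic suggests similar rhythmic feel; ranking
    library tracks by this score is one plausible retrieval rule
    (a hypothetical stand-in for the thesis's actual comparison).
    """
    statistic, p_value = ks_2samp(np.diff(motion_beats), np.diff(music_beats))
    return statistic

# Toy usage: beat times in seconds, as produced by a beat tracker.
motion = np.array([0.0, 0.52, 1.01, 1.49, 2.02, 2.50])
track = np.array([0.0, 0.50, 1.00, 1.50, 2.00, 2.50])
print(rhythm_distance(motion, track))   # near 0 => rhythmically similar
```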