The Musical Gestures Toolbox for Python is a collection of high-level modules targeted at researchers working with video recordings. It includes visualization techniques such as motion videos, motion history images, and motiongrams, which in different ways allow for looking at video recordings from different temporal and spatial perspectives. It also includes basic computer vision analysis, such as extracting the quantity and centroid of motion and using such features in further analysis.
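
As a rough illustration, here is how such an analysis might look in code. This is a minimal sketch, assuming the package is installed from PyPI as `musicalgestures` and exposes an `MgVideo` class with a `motion()` method; these names are illustrative assumptions, so consult the example notebook below for the actual API.

```python
# Minimal sketch (assumed names, not a verified API): load a video and
# run the motion analysis, which typically yields a motion video,
# horizontal and vertical motiongrams, and per-frame quantity and
# centroid of motion for further analysis.
import musicalgestures

mg = musicalgestures.MgVideo('dance.avi')
mg.motion()
```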

The toolbox was originally developed to analyze music-related body motion (of musicians, dancers, and perceivers), but it is equally useful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.

## Functions

The Musical Gestures Toolbox contains functions to analyze and visualize video, audio, and motion capture data. There are three categories of functions (a usage sketch follows the list):

- Preprocessing functions (time-period extraction, time dilation, color management, etc.)
- Processing functions (motion, history, optical flow, pose estimation, etc.)
- Visualization functions (video playback, image display, plotting)
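
To give a sense of how the categories combine, here is a hedged sketch of a simple pipeline, continuing the illustrative `MgVideo` assumption from above; the `starttime`, `endtime`, and `color` parameters and the `history()` and `show()` methods are likewise assumptions, not a verified API.

```python
import musicalgestures

# Preprocessing: extract a ten-second excerpt and convert it to
# grayscale (parameter names are assumed for illustration).
mg = musicalgestures.MgVideo('dance.avi', starttime=5, endtime=15, color=False)

# Processing: motion analysis and a motion history video.
mg.motion()
mg.history()

# Visualization: play back the resulting motion video.
mg.show()
```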

Have a look at MusicalGesturesToolbox.ipynb to get an idea of how to use the toolbox. (Note that it might not render on GitHub due to its large size.) You can also run the notebook in Colab.

## Problems

These are some of the known issues. Please help improve the toolbox by reporting bugs and submitting feature requests in the issues section.