This is a collection of tools for analysing and visualising video in realtime. It was developed in and for the graphical programming environment Max/MSP/Jitter.
The toolbox is probably most useful for people who are already experienced Max programmers. If you are looking for similar functionality without Max, check out some of the standalone applications we have built on top of the toolbox.
This was the original Musical Gestures Toolbox. The patches were developed between 2004 and 2007 and have not been updated since. They should probably still work, although there may be some dependency issues.
From 2006, the patches were embedded within the Jamoma framework. Versions of MGT for Matlab and Python were developed later.
The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
The Musical Gestures Toolbox was first described in:
- Jensenius, A. R., Godøy, R. I., & Wanderley, M. M. (2005). Developing tools for studying musical gestures within the Max/MSP/Jitter environment. Proceedings of the International Computer Music Conference, 282–285.
The implementation and usage are presented in more detail in this PhD dissertation:
- Jensenius, A. R. (2007). Action–Sound: Developing Methods and Tools to Study Music-Related Body Movement. PhD thesis, University of Oslo.
Main developer: Alexander Refsum Jensenius.
This software is open source and is shared under the GNU General Public License v3.0.