The Musical Gestures Toolbox for Terminal is made for doing various video-related tricks in the terminal. It builds mainly on the power of FFmpeg and ImageMagick.
The toolbox was primarily developed for music research, with a particular focus on studying the motion of musicians and dancers. But it can be used for any type of motion-related analysis based on video recordings.
Create a grid of images from a video file by sampling frames evenly throughout the video. Example:
./mggrid.sh dance.mp4
By default, the source frame size is used and a 3x3 grid is created:
You can specify the height of each frame, the grid dimensions (columns x rows), and the output file name like this:
./mggrid.sh dance.mp4 300 5 3 dance_grid_5x3.jpg
This will result in a grid image where each frame is 300 pixels tall, and with 5 columns and 3 rows:
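Under the hood, a grid like this can be made with FFmpeg's select, scale, and tile filters. The sketch below only prints the command it would run (a dry run); the sampling step and the exact filter chain are illustrative assumptions, not necessarily mggrid.sh's internals.

```shell
#!/bin/sh
# Dry-run sketch: print an FFmpeg command that keeps every $step-th
# frame, scales each to $height px tall, and tiles them cols x rows.
# The filter chain is an assumption, not mggrid.sh's exact code.
mggrid_sketch() {
  in=$1 height=$2 cols=$3 rows=$4 out=$5 step=$6
  echo ffmpeg -i "$in" \
    -vf "select='not(mod(n,$step))',scale=-1:$height,tile=${cols}x${rows}" \
    -frames:v 1 "$out"
}

mggrid_sketch dance.mp4 300 5 3 dance_grid_5x3.jpg 100
```

The select expression keeps one frame per $step frames, and tile needs -frames:v 1 so the whole grid is written as a single output image.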
This function is similar to mggrid, but creates the grid image from a set of image files instead of a video. It is best used together with mgkeyframes.
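A grid of still images can also be put together directly with ImageMagick's montage tool. This dry-run sketch prints a plausible command; the filename pattern and options are assumptions, not the function's exact code.

```shell
#!/bin/sh
# Dry-run sketch: print an ImageMagick montage command that tiles a
# set of images into a cols x rows grid with no gaps between tiles.
imagegrid_sketch() {
  pattern=$1 cols=$2 rows=$3 out=$4
  echo montage "$pattern" -tile "${cols}x${rows}" -geometry +0+0 "$out"
}

imagegrid_sketch 'dance_*.tiff' 5 3 dance_grid.jpg
```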
This function extracts all the keyframes from a .mp4 file. Keyframes are used by MPEG compression to save storage space: each keyframe contains the whole image, while subsequent frames only store what changed since the keyframe. If the function is used on a video file that does not contain keyframes (such as an .avi file with MJPEG compression), it will export all the frames of the file. Example usage:
./mgkeyframes.sh dance.mp4
This will generate one .tiff file for each keyframe:
These files can be used for further processing by other functions.
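Keyframe extraction like this is typically done with FFmpeg's select filter on the picture type. The sketch below is a dry run that prints such a command; the output naming and options are assumptions, not mgkeyframes' exact internals.

```shell
#!/bin/sh
# Dry-run sketch: print an FFmpeg command that keeps only intra-coded
# (key)frames and writes each one as a numbered .tiff file.
keyframes_sketch() {
  in=$1 stem=${1%.*}
  echo ffmpeg -i "$in" -vf "select='eq(pict_type,I)'" -vsync vfr \
    "${stem}_%04d.tiff"
}

keyframes_sketch dance.mp4
```

The -vsync vfr option drops the timestamps of the discarded frames so the kept frames are numbered consecutively.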
Create videograms (horizontal and vertical) from a video file by averaging over columns and rows respectively. Example:
./mgvideogram.sh dance.mp4
This will create two image files, the horizontal videogram looks like this:
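Conceptually, a horizontal videogram collapses every frame into a single one-pixel-wide column and places the columns left to right along the time axis. One way to do that is the two-step pipeline sketched below (a dry run; the intermediate filenames and exact commands are assumptions, not mgvideogram.sh's internals).

```shell
#!/bin/sh
# Dry-run sketch: collapse each frame to a 1-pixel-wide column with
# FFmpeg (averaging across each row), then append the columns
# left-to-right with ImageMagick to form the horizontal videogram.
videogram_sketch() {
  in=$1 stem=${1%.*}
  echo ffmpeg -i "$in" -vf "scale=1:ih" "${stem}_col_%05d.png"
  echo convert "${stem}_col_*.png" +append "${stem}_videogram_h.png"
}

videogram_sketch dance.mp4
```

The vertical videogram is the same idea with the axes swapped: scale=iw:1 per frame, then -append (vertical) instead of +append.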
This will create an image consisting of an array of pixels based on the average colour of each frame in a video file. This makes most sense for long video recordings that contain scene changes. Example:
./mgpixelarray.sh dance.mp4
For a short video with similar colours throughout, you will just get something like this:
Yes, that is mainly a blue line, because the average colour of the video is blue throughout. The function works better on longer videos with scene changes. For example, here is a pixel array image of Bergensbanen, a 7-hour TV production of the train ride between Oslo and Bergen. The end result looks like this (1920 pixels wide):
As you can see, not much changes, but that also reflects the slowness of the train ride.
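The average colour of a frame can be obtained by scaling it down to a single pixel. The dry-run sketch below prints one plausible pipeline (the intermediate filenames are assumptions): one 1x1 pixel per frame, appended into a strip.

```shell
#!/bin/sh
# Dry-run sketch: collapse every frame to a single 1x1 pixel (its
# average colour), then append the pixels into one horizontal strip.
pixelarray_sketch() {
  in=$1 stem=${1%.*}
  echo ffmpeg -i "$in" -vf "scale=1:1" "${stem}_px_%06d.png"
  echo convert "${stem}_px_*.png" +append "${stem}_pixelarray.png"
}

pixelarray_sketch dance.mp4
```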
Create a motion video by performing "frame differencing" on the video. This means subtracting subsequent frames from each other, which leaves only information about the pixels that changed between frames. Example:
./mgmotion.sh dance.mp4
This will create a motion video. Here is a screenshot from such a video:
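Frame differencing can be done with FFmpeg's tblend filter, which blends each frame with the previous one. The sketch below is a dry run that prints such a command; the output naming is an assumption, not mgmotion.sh's exact code.

```shell
#!/bin/sh
# Dry-run sketch: frame differencing with FFmpeg's tblend filter;
# mode "difference" subtracts consecutive frames, leaving only the
# pixels that changed between them.
motion_sketch() {
  in=$1 out=${1%.*}_motion.mp4
  echo ffmpeg -i "$in" -vf "tblend=all_mode=difference" "$out"
}

motion_sketch dance.mp4
```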
Create a motion history video from a video file. Example:
./mgmotionhistory.sh dance.mp4
This will generate a motion history video with a blur over the last 10 frames:
The number of frames to include in the history can be specified:
./mgmotionhistory.sh dance.mp4 30
This will blur over 30 frames.
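One way to get this trailing effect is to combine frame differencing with FFmpeg's tmix filter, which averages the last N frames. The dry-run sketch below prints such a command; the filter combination is an assumption, not necessarily the script's exact internals.

```shell
#!/bin/sh
# Dry-run sketch: frame differencing (tblend) followed by tmix, which
# averages the last N frames, producing a "motion history" trail.
motionhistory_sketch() {
  in=$1 frames=${2:-10} out=${1%.*}_history.mp4
  echo ffmpeg -i "$in" \
    -vf "tblend=all_mode=difference,tmix=frames=$frames" "$out"
}

motionhistory_sketch dance.mp4 30
```

With no second argument the sketch falls back to a 10-frame history, mirroring the default described above.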
This is a useful function for long videos that you want to "speed up" before running other processes. Example:
./mgresample.sh dance.mp4 4
This will speed up the video 4 times, making later processing quicker.
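Speeding up a video can be done by compressing its presentation timestamps with FFmpeg's setpts filter. The sketch below prints such a command as a dry run; dropping the audio with -an is a simplifying assumption, not necessarily what mgresample.sh does.

```shell
#!/bin/sh
# Dry-run sketch: speed a video up by a given factor by dividing its
# presentation timestamps (audio dropped with -an for simplicity).
resample_sketch() {
  in=$1 factor=$2 out=${1%.*}_${2}x.mp4
  echo ffmpeg -i "$in" -vf "setpts=PTS/$factor" -an "$out"
}

resample_sketch dance.mp4 4
```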
This function can be used to "flatten" Ricoh Theta 360-degree videos. These cameras contain two fisheye lenses, capturing two 180-degree videos next to each other. This results in video files like this:
These files are not very useful to watch or work with, so we need to somehow "flatten" them into a more meaningful video file. This can be done in the Ricoh mobile phone app, but it may be easier to do on a computer. The FFmpeg developers are working on native support for various 360-degree video formats, implemented in the v360 filter. Meanwhile, [this project](https://github.com/96fps/ThetaS-video-remap) shows how to do the flattening based on two PGM files that contain information about how the video should be mapped:
./mgricohflatten.sh ricoh-file.mp4
The result is a flattened video file, as shown below:
Read more in this blog post.
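Both approaches mentioned above can be sketched as FFmpeg commands. The dry run below prints the two variants: the remap filter fed with x/y PGM map files (the map filenames here are assumptions), and the native v360 filter converting dual fisheye to equirectangular.

```shell
#!/bin/sh
# Dry-run sketch: two ways to flatten a dual-fisheye 360 video.
# 1) remap with x/y PGM map files (map filenames are assumptions).
# 2) FFmpeg's v360 filter: dual fisheye -> equirectangular.
ricohflatten_sketch() {
  in=$1 out=${1%.*}_flat.mp4
  echo ffmpeg -i "$in" -i xmap.pgm -i ymap.pgm -filter_complex remap "$out"
  echo ffmpeg -i "$in" -vf "v360=input=dfisheye:output=equirect" "$out"
}

ricohflatten_sketch ricoh-file.mp4
```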
Create a simple waveform from the audio of a video file. Example:
./mgwaveform.sh dance.mp4
This will create a waveform display like this:
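A waveform image like this can be rendered with FFmpeg's showwavespic filter. The sketch below prints such a command as a dry run; the image size and output name are assumptions, not mgwaveform.sh's exact settings.

```shell
#!/bin/sh
# Dry-run sketch: render the audio track of a video as a single
# waveform image with FFmpeg's showwavespic filter.
waveform_sketch() {
  in=$1 out=${1%.*}_waveform.png
  echo ffmpeg -i "$in" -filter_complex "showwavespic=s=1920x540" \
    -frames:v 1 "$out"
}

waveform_sketch dance.mp4
```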
Create a spectrogram from the audio of a video file. Example:
./mgspectrogram.sh dance.mp4
This will create a 1920x1080 spectrogram using FFmpeg's default settings:
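The spectrogram image can be rendered with FFmpeg's showspectrumpic filter. The dry-run sketch below prints such a command; the output name is an assumption, while the 1920x1080 size matches the description above.

```shell
#!/bin/sh
# Dry-run sketch: render the audio track of a video as a 1920x1080
# spectrogram image with FFmpeg's showspectrumpic filter.
spectrogram_sketch() {
  in=$1 out=${1%.*}_spectrogram.png
  echo ffmpeg -i "$in" -filter_complex "showspectrumpic=s=1920x1080" "$out"
}

spectrogram_sketch dance.mp4
```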
A project from the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, Department of Musicology, University of Oslo.