Gesture-Controlled Drumkit

This project was completed as part of the MSc Applied Machine Learning programme at Imperial College London's Department of Electrical Engineering.

This project develops a data glove with flex sensors and an inertial measurement unit (IMU) that produces different MIDI sounds for different gestures. Machine learning is used to classify each gesture as a part of the drum kit (kick, snare, hi-hat, etc.), allowing the user to "play drums" with the data gloves.

Functions

The model classifies five gestures into five main parts of a standard acoustic drum kit: the kick, hi-hat, snare, tom and crash.

| Sound   | Kick | Hihat    | Snare     | Tom       | Crash     |
|---------|------|----------|-----------|-----------|-----------|
| Gesture | Fist | 1 finger | 2 fingers | 3 fingers | Open palm |
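
For reference, the mapping above can be written as a small lookup table. The sketch below is illustrative only; the class indices follow the numbering used in Data Collection below, while the General MIDI note numbers are an assumption rather than what predict.py necessarily sends.

```python
# Illustrative gesture-to-sound lookup. Class indices match the Data Collection
# numbering below; the General MIDI percussion notes are assumed values.
GESTURES = {
    0: ("kick",  "fist",      36),  # GM 36: Bass Drum 1
    1: ("hihat", "1 finger",  42),  # GM 42: Closed Hi-Hat
    2: ("snare", "2 fingers", 38),  # GM 38: Acoustic Snare
    3: ("tom",   "3 fingers", 45),  # GM 45: Low Tom
    4: ("crash", "open palm", 49),  # GM 49: Crash Cymbal 1
}
```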

The user chooses which drum sound to produce through the gesture, and when to produce the sound by performing a quick downward motion with their hand.

Beat motion: a quick, downward movement to produce a beat.
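
One simple way such a trigger could work is to threshold the vertical acceleration reported by the IMU and debounce it so that one swing fires one beat. This is a hedged sketch, not the repository's actual detection logic; the threshold, debounce window, and is_beat helper are all assumptions.

```python
import time

ACC_THRESHOLD = 2.0   # assumed downward-acceleration threshold, in g
DEBOUNCE_S = 0.15     # assumed minimum gap between beats, in seconds

_last_beat = 0.0

def is_beat(az: float) -> bool:
    """Return True when a quick downward movement is detected.

    `az` is the vertical acceleration from the IMU; a sharp spike past the
    threshold counts as one beat, and the debounce keeps a single swing
    from triggering twice.
    """
    global _last_beat
    now = time.monotonic()
    if az > ACC_THRESHOLD and (now - _last_beat) > DEBOUNCE_S:
        _last_beat = now
        return True
    return False
```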

Using the data gloves, the user can "play drums" in real time, either with manually downloaded drum samples or with MIDI sounds produced through a DAW.
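
Both output paths can be sketched in a few lines of Python. The snippet below is an assumption about how this could be wired up, using pygame for local samples and mido for MIDI; the sample path, MIDI port, and note number are placeholders rather than the project's actual choices.

```python
import pygame.mixer
import mido

pygame.mixer.init()
kick_sample = pygame.mixer.Sound("samples/kick.wav")  # hypothetical sample path

def play_sample():
    """Option 1: play a manually downloaded drum sample."""
    kick_sample.play()

def send_midi(note=36, port_name=None):
    """Option 2: send a MIDI note to a DAW (e.g. Logic Pro) over a MIDI port."""
    with mido.open_output(port_name) as out:  # None opens the default output
        out.send(mido.Message("note_on", note=note, velocity=100, channel=9))
        out.send(mido.Message("note_off", note=note, channel=9))
```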

Demo with DAW: using one data glove to play drums in Logic Pro.

Data Collection

  1. Upload arduino/code.ino to your Nano.
  2. In collect_data.py, define your paths. On Windows, SERIAL_PATH will be something like "COM10"; on Unix systems, something like "/dev/ttyUSB0". FIGURES_PATH is optional (the directory does not have to exist).
  3. Run collect_data.py. When prompted for the gesture, type in the gesture transition for that trial in the format <first_gesture_num><second_gesture_num> (e.g. 13 for hihat → tom). Gestures are:
    • (0) Kick - fist
    • (1) Hihat - 1 finger
    • (2) Snare - 2 fingers
    • (3) Tom - 3 fingers
    • (4) Crash - open palm
  4. Once Reading... is printed, start doing your gestures. Collect 100 gestures.
  5. Once 100 gestures have been performed, press Ctrl-C to stop data collection. When prompted for your CSV filename, enter your desired filename, or just press Enter to use the gesture string you entered earlier. (A minimal sketch of this collection loop follows the list.)
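
As referenced in step 5, here is a minimal sketch of what such a collection loop looks like, assuming the Nano streams one comma-separated line of sensor readings per sample over serial (read with pyserial). The baud rate, line format, and file handling are assumptions; the authoritative behaviour is in collect_data.py.

```python
import csv
import serial  # pyserial

SERIAL_PATH = "/dev/ttyUSB0"   # or e.g. "COM10" on Windows
BAUD_RATE = 115200             # assumed; must match the Arduino sketch

gesture = input("Gesture (e.g. 13): ").strip()
rows = []
with serial.Serial(SERIAL_PATH, BAUD_RATE, timeout=1) as ser:
    print("Reading...")
    try:
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            if line:
                rows.append(line.split(","))  # flex + IMU values for one sample
    except KeyboardInterrupt:                 # Ctrl-C ends the trial
        pass

filename = input("CSV filename: ").strip() or gesture
with open(f"{filename}.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```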

Files

  • collect_data.py: Python script for collecting data
  • create_dataset.ipynb: Jupyter notebook for creating a combined dataset from the individual CSVs of trials for each gesture transition
  • predict.py: Python script for real-time gesture prediction. Run with python3 predict.py [--dev <usb_device_path>] [--hand <l/r>] [--sound <k/p for keyboard or playing>]; a sketch of this command-line interface follows.
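
The command-line interface of predict.py could be parsed along these lines; the defaults and help text below are assumptions that mirror the usage string above rather than the script's actual implementation.

```python
import argparse

parser = argparse.ArgumentParser(description="Realtime gesture prediction")
parser.add_argument("--dev", default="/dev/ttyUSB0",
                    help="USB device path of the glove (e.g. COM10 on Windows)")
parser.add_argument("--hand", choices=["l", "r"], default="r",
                    help="which hand the glove is worn on")
parser.add_argument("--sound", choices=["k", "p"], default="p",
                    help="'k' to trigger keyboard presses, 'p' to play sounds")
args = parser.parse_args()
```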
