Demo video: `demo.mp4`
- Operating System: Ubuntu 20.04
- ROS Version: Noetic
```bash
pip install empy==3.3.4 catkin_pkg pyyaml rospkg
pip install dynamixel_sdk numpy
```
- Install the dependencies above and navigate to your ROS workspace.
- Create a new workspace:
```bash
mkdir -p ~/leap_hand_ws/src
cd ~/leap_hand_ws/src
```
- Clone Leap-Hand-Robotics inside the `~/leap_hand_ws/src` directory:
```bash
git clone https://github.com/Demolus13/Leap-Hand-Robotics.git leap_hand --recursive
```
- Make the scripts executable inside the `~/leap_hand_ws/src/leap_hand` directory:
```bash
chmod +x leaphand_node.py
chmod +x rock_paper_scissors.py
```
- Build your workspace:
```bash
cd ~/leap_hand_ws
catkin_make
```
- Source your workspace:
```bash
source devel/setup.bash
```
- Connect 5 V power to the hand (the Dynamixels should light up during boot).
- Connect the micro USB cable to the hand (do not use too many USB extensions).
- Find the USB port using Dynamixel Wizard.
- Terminal 1:
```bash
roslaunch leap_hand example.launch
```
- Terminal 2:
```bash
rosrun leap_hand rock_paper_scissors.py
```
Finally, enjoy playing the game!
Note: The gestures for rock, paper, and scissors are already saved in `rock_paper_scissors.py`:
```python
# Define the joint positions for rock, paper, and scissors
self.states = {
    "rock": np.array([3.1416, 4.1888, 4.5553, 4.4157, 3.1416, 4.1190, 5.1487, 4.2412, 3.1416, 4.2237, 4.7124, 4.4506, 2.6005, 1.5184, 4.6775, 4.4157]),
    "paper": np.array([3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416]),
    "scissors": np.array([3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 3.1416, 4.2237, 4.7124, 4.4506, 2.6005, 1.5184, 4.6775, 4.4157])
}
```
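For reference, a stored gesture can be sent to the hand with a small ROS publisher like the sketch below. This is a hedged illustration: the topic name `/leaphand_node/cmd_leap` and the `sensor_msgs/JointState` message type are assumptions, so check `leaphand_node.py` for the interface your build actually exposes.

```python
#!/usr/bin/env python
# Hypothetical sketch: command a stored gesture over ROS.
# The topic name below is an assumption -- check leaphand_node.py
# for the topic and message type your node actually subscribes to.
import rospy
import numpy as np
from sensor_msgs.msg import JointState

rospy.init_node("gesture_demo")
pub = rospy.Publisher("/leaphand_node/cmd_leap", JointState, queue_size=1)
rospy.sleep(1.0)  # give the connection time to establish

msg = JointState()
msg.position = np.full(16, 3.1416).tolist()  # "paper": all 16 joints at pi
pub.publish(msg)
rospy.sleep(0.5)  # let the message go out before the node exits
```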
- Capture images for training using the script `capture_images.py` (a minimal sketch of the capture loop follows this list).
- On executing the Python file, the webcam will open and you can save your hand gesture by pressing the Enter key.
- Captured images will be stored in the `data/raw` directory.
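The sketch below shows a minimal version of such a capture loop using OpenCV. The per-gesture folder layout (`data/raw/rock`) and the key bindings are illustrative assumptions; `capture_images.py` is the authoritative implementation.

```python
# Hypothetical sketch of the capture loop -- see capture_images.py
# for the project's actual implementation.
import os
import cv2

os.makedirs("data/raw/rock", exist_ok=True)  # one folder per gesture (assumed layout)
cap = cv2.VideoCapture(0)
count = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("capture", frame)
    key = cv2.waitKey(1) & 0xFF
    if key == 13:  # Enter: save the current frame
        cv2.imwrite("data/raw/rock/img_%03d.png" % count, frame)
        count += 1
    elif key == 27:  # Esc: quit
        break
cap.release()
cv2.destroyAllWindows()
```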
- Execute preprocessing using the script `preprocess_data.py`.
- The preprocessed `.csv` files will be created in the `data/processed` directory.
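As an illustration of one common approach, the hedged sketch below converts the raw images into landmark rows using MediaPipe. The use of MediaPipe, the folder layout, and the output filename are all assumptions; `preprocess_data.py` may compute different features.

```python
# Hypothetical preprocessing sketch using MediaPipe hand landmarks.
# The feature choice and file layout are assumptions -- see
# preprocess_data.py for what the project actually computes.
import csv
import glob
import cv2
import mediapipe as mp

hands = mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1)
with open("data/processed/gestures.csv", "w", newline="") as f:
    writer = csv.writer(f)
    for label in ("rock", "paper", "scissors"):
        for path in glob.glob("data/raw/%s/*.png" % label):
            image = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2RGB)
            result = hands.process(image)
            if result.multi_hand_landmarks:
                points = result.multi_hand_landmarks[0].landmark
                row = [v for p in points for v in (p.x, p.y, p.z)]  # 21 x 3 features
                writer.writerow(row + [label])
```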
- Train the model using the script `train_model.py`.
- The trained model and the label encodings will be stored in the `models` directory.
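A minimal training sketch, assuming scikit-learn and a CSV whose last column is the gesture label; the actual classifier, features, and file names used by `train_model.py` may differ:

```python
# Hypothetical training sketch -- the real train_model.py may use a
# different classifier, feature set, and serialization format.
import joblib
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.preprocessing import LabelEncoder

data = pd.read_csv("data/processed/gestures.csv", header=None)
X = data.iloc[:, :-1]  # landmark features
y = data.iloc[:, -1]   # gesture labels (rock/paper/scissors)

encoder = LabelEncoder()
model = RandomForestClassifier(n_estimators=100)
model.fit(X, encoder.fit_transform(y))

# Persist the model and the label encodings to the models directory
joblib.dump(model, "models/gesture_model.joblib")
joblib.dump(encoder, "models/label_encoder.joblib")
```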
- Evaluate the model using the script `evaluate_model.py`.
- The webcam will open and you can perform the stored gestures to check the accuracy of the model.
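A hedged sketch of live evaluation, reusing the assumed landmark features and model files from the sketches above:

```python
# Hypothetical live-evaluation sketch building on the assumptions above.
import cv2
import joblib
import mediapipe as mp

model = joblib.load("models/gesture_model.joblib")
encoder = joblib.load("models/label_encoder.joblib")
hands = mp.solutions.hands.Hands(max_num_hands=1)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        points = result.multi_hand_landmarks[0].landmark
        row = [[v for p in points for v in (p.x, p.y, p.z)]]
        label = encoder.inverse_transform(model.predict(row))[0]
        cv2.putText(frame, label, (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("evaluate", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```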
Follow the steps above to run the gesture-recognition pipeline on the Leap Hand hardware.