# Final Repository Cleanup & Tutorials

14 changed files, with 314 additions and 994 deletions.
`README.md`:

# MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network
This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot. For more details, see our publication "[MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception](https://arxiv.org/abs/2409.11146)" and our [project page](https://lunarlab-gatech.github.io/Morphology-Informed-HGNN/).

Additionally, it can be applied to a variety of robot structures and datasets, as our software can convert compatible robot URDF files to graph format and provides a template for implementing custom datasets. See [#Applying-to-your-Robot/Dataset](#applying-to-your-robotdataset) for more information.



## Setup
---

### Installation
To get started, set up a Conda Python environment with Python=3.11, then install the library:
```
conda create -n mi-hgnn python=3.11
conda activate mi-hgnn
pip install .
```

Note, if you have any issues with setup, refer to `environment_files/README.md` so you can install the exact libraries we used.

### URDF Download
The necessary URDF files are part of git submodules in this repository, so run the following commands to download them:
```
git submodule init
git submodule update
```

## Usage
---

### Replicating Paper Experiments

We provide code for replicating the exact experiments in our paper, along with full model weights for every model referenced in it. See `paper/README.md` for more information.

<img src="paper/website_images/figure5.png" alt="Parameter sizes and Ablation study" width="600">

### Applying to your Robot/Dataset

Although our paper's scope was limited to applying the MI-HGNN to quadruped robots for contact perception, it can easily be applied to other multi-body dynamical systems and to other tasks/datasets by following the steps below:

<img src="paper/website_images/MI-HGNN Potential Applications.png" alt="MI-HGNN Potential Applications" width="800">

1. Add new URDF files for your robots by following the instructions in `urdf_files/README.md`. Our software will automatically convert the URDF into a graph compatible for learning with the MI-HGNN.
2. Incorporate your custom dataset using our `FlexibleDataset` class and starter `CustomDatasetTemplate.py` file by following the instructions in `src/mi_hgnn/datasets_py/README.md`.
3. After making your changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't broken critical functionality, run the test cases with the command `python -m unittest discover tests/ -v`.
4. Using the files in the `research` directory as an example, call our `train_model` and `evaluate_model` functions provided in `src/mi_hgnn/lightning_py/gnnLightning.py` with defined train, validation, and test sequences, as sketched below.
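
As a minimal sketch of what step 4 looks like (the argument lists here are illustrative, not the exact signatures; see the `research/` scripts for real calls):

```
# Hypothetical sketch; see the research/ scripts for actual usage.
from mi_hgnn.lightning_py.gnnLightning import train_model, evaluate_model

# train_dataset, val_dataset, and test_dataset would be instances of your
# FlexibleDataset subclass (see src/mi_hgnn/datasets_py/README.md);
# constructor arguments are omitted here.
checkpoint_path = train_model(train_dataset, val_dataset, test_dataset)
evaluate_model(checkpoint_path, test_dataset)
```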

We've designed the library to be easily applicable to a variety of datasets and robots, and have provided a range of customization options in training, dataset creation, and logging. We're excited to see everything you can do with the MI-HGNN!

### Simulated A1 Dataset

To evaluate the performance of our model on GRF estimation, we generated our own simulated GRF dataset, which we now contribute to the community as well. We recorded proprioceptive sensor data and the corresponding ground-truth GRFs by operating an A1 robot in the [Quad-SDK](https://github.com/lunarlab-gatech/quad_sdk_fork) simulator. In total, our dataset comprises 530,779 synchronized data samples with a variety of frictions, terrains, and speeds. All of the different sequences are outlined in the table below:

<img src="paper/grf_dataset_sequences.png" alt="GRF Dataset Planned Control" width="700">

A visualization of the various data collection environments can be seen below.



If you'd like to use this dataset, the recorded sequences can be found on [Dropbox](https://www.dropbox.com/scl/fo/4iz1oobx71qoceu2jenie/AJPggD4yIAFXf5508wBz-hY?rlkey=4miys9ap0iaozgdelntms8lxb&st=0oz7kgyq&dl=0). See `paper/README.md` and Section V-B of our publication for specific details on this dataset and how to use it.

## Other Info
---

### Contributing

We encourage you to extend the library for your own applications. If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request. Reach out to us if you have any questions.

### Citation

If you find our repository or our work useful, please cite the relevant publication:

```
@article{butterfield2024mi,
  title={{MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception}},
  author={Butterfield, Daniel and Garimella, Sandilya Sai and Cheng, Nai-Jen and Gan, Lu},
  journal={arXiv preprint arXiv:2409.11146},
  year={2024}
}
```

### Contact / Issues

For any issues with this repository, feel free to open an issue on GitHub. For other inquiries, please contact Daniel Butterfield ([email protected]) or the Lunar Lab (https://sites.gatech.edu/lunarlab/).

`src/mi_hgnn/datasets_py/CustomDatasetTemplate.py`:

```
from .flexibleDataset import FlexibleDataset
import scipy.io as sio
from pathlib import Path
import numpy as np


class CustomDataset(FlexibleDataset):

    # ========================= DOWNLOADING ==========================
    def get_downloaded_dataset_file_name(self):
        """
        Type the file extension of your dataset sequence files here!
        """
        return "data.<YOUR_EXTENSION_HERE>"

    # ========================= PROCESSING ===========================
    def process(self):
        # Load the path to the downloaded file
        path_to_file = Path(self.root, 'raw', 'data.<YOUR_EXTENSION_HERE>')

        # TODO: Convert into a MATLAB data dictionary format here!
        mat_data = None

        # Make sure to save it at this location
        sio.savemat(Path(self.root, 'processed', 'data.mat'), mat_data)

        # TODO: Get the number of dataset entries in the file
        dataset_entries = None

        # Write a txt file to save the dataset length and the file ID
        with open(str(Path(self.processed_dir, "info.txt")), "w") as f:
            file_id, loc = self.get_file_id_and_loc()
            f.write(str(dataset_entries) + " " + file_id)

    # ============= DATA SORTING ORDER AND MAPPINGS ==================
    def get_urdf_name_to_dataset_array_index(self) -> dict:
        """
        Implement this function to tell `FlexibleDataset` how the data
        returned by `load_data_at_dataset_seq()` corresponds to the joints
        in the robot URDF file.

        Traditionally a robot only has one base node, so it should get a value
        of 0. Next, type the name of each leg joint in the URDF file, and add
        the index of its value in the corresponding joint arrays returned by
        load_data_at_dataset_seq(). Do the same for the joints in the URDF
        representing a fixed foot, with the indices of their values in the
        foot position and foot velocity arrays.
        """

        return {
            '<URDF_BASE_NODE>': 0,

            '<URDF_JOINT_NODE>': 2,
            '<URDF_JOINT_NODE2>': 0,
            '<URDF_JOINT_NODE3>': 1,

            '<URDF_FOOT_NODE>': 1,
            '<URDF_FOOT_NODE2>': 0,
        }

    # ===================== DATASET PROPERTIES =======================
    def get_expected_urdf_name(self):
        return "<EXPECTED_URDF_NAME_HERE>"

    # ======================== DATA LOADING ==========================
    def load_data_at_dataset_seq(self, seq_num: int):
        """
        When this function is called, the .mat file data saved in process()
        is available at self.mat_data.

        For information on the expected format of these variables, see the
        load_data_at_dataset_seq() function definition in flexibleDataset.py.
        """

        # TODO: Load the data as numpy arrays, and don't forget to incorporate
        # self.history_length to load a history of measurements.
        lin_acc = None
        ang_vel = None
        j_p = None
        j_v = None
        j_T = None
        f_p = None
        f_v = None
        contact_labels = None
        r_p = None
        r_o = None
        timestamps = None
        # Note, if you don't have data for a specific return value, just
        # return None, and `FlexibleDataset` will know not to use it if it
        # is not required.

        return lin_acc, ang_vel, j_p, j_v, j_T, f_p, f_v, contact_labels, r_p, r_o, timestamps


# ================================================================
# ===================== DATASET SEQUENCES ========================
# ================================================================

class CustomDataset_sequence1(CustomDataset):
    """
    To load a dataset sequence from Google, first upload the corresponding
    file on Google Drive, set "General Access" to "Anyone with the link",
    and then copy the link. Paste the link, and then extract the string
    between the text of '/file/d/' and '/view?usp=sharing'. Take this
    string, and paste it as the first return argument below.
    """
    def get_file_id_and_loc(self):
        return "<Your_String_Here>", "Google"


class CustomDataset_sequence2(CustomDataset):
    """
    To load a dataset sequence from Dropbox, first you'll need to upload the
    corresponding file on Dropbox and generate a link for viewing. Make sure
    that access is given to anyone with the link, and that this permission
    won't expire, doesn't require a password, and allows for downloading.
    Finally, copy and paste the link as the first return argument below, but
    change the last number from 0 to 1 (this tells Dropbox to send the raw
    file, instead of a webpage).
    """
    def get_file_id_and_loc(self):
        return "<Your_Link_Here>", "Dropbox"


"""
Create classes for each of your sequences...
"""
```

`src/mi_hgnn/datasets_py/README.md`:

# Implementing Custom Datasets

We hope that many people use our MI-HGNN on a variety of datasets. We provide the `FlexibleDataset` class, which offers many convenient features and can be inherited for use with custom datasets. Below is a short summary of its features:
- Automatic download of relevant datasets from the Internet (from Google Drive or Dropbox).
- Data sorting to match the order of joint, foot, and base nodes in the robot graph.
- A wrapper for the `RobotGraph` class that generates the graph from the robot URDF file.
- Easy customization with custom history lengths and a normalization parameter.
- Custom `get()` function returns for training both an MLP and the MI-HGNN.
- An option for easy evaluation with a floating-base dynamics model, though our current implementation is specific to the simulated A1 robot in our paper, meaning changes will be necessary for proper results on your robot.

However, `FlexibleDataset` currently only supports the following input data:
- lin_acc (np.array) - IMU linear acceleration
- ang_vel (np.array) - IMU angular velocity
- j_p (np.array) - Joint positions
- j_v (np.array) - Joint velocities
- j_T (np.array) - Joint torques
- f_p (np.array) - Foot position
- f_v (np.array) - Foot velocity
- labels (np.array) - The dataset labels (either Z-direction GRFs or contact states)
- r_p (np.array) - Robot position (GT)
- r_o (np.array) - Robot orientation (GT) as a quaternion, in the order (x, y, z, w)
- timestamps (np.array) - Array containing the timestamps of the data

Also note that not all of these are used; which ones are depends on the applied model (MLP vs. MI-HGNN vs. floating-base dynamics).

If `FlexibleDataset` supports your input data, then you can easily use it by writing a simple dataset class that inherits from `FlexibleDataset`, similar to `LinTzuYaunDataset` or `QuadSDKDataset`. We've provided a template for you in the `CustomDatasetTemplate.py` file, which you can use to start.

## Using the Custom Dataset Template

This section will explain how to edit the `CustomDatasetTemplate.py` file for use with your own dataset to take advantage of the features of the `FlexibleDataset` class.

First, open the file and rename the class to your liking.

### Adding Dataset Sequences
Next, scroll down to the bottom of the file where it says `DATASET SEQUENCES`. Add every sequence of your dataset as its own class, which will require you to upload the data either to Dropbox or Google Drive. See `CustomDatasetTemplate.py` for details.

This is a clean way to handle data loading, as it allows the user to later combine different sequences as they'd like with the `torch.utils.data.ConcatDataset` class (see `research/train_classification_sample_eff.py` for an example, and the sketch below). Defining these classes also means that training an MI-HGNN model on a different computer doesn't require the user to manually download any datasets, as `FlexibleDataset` will do it for you.
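
As a minimal sketch (assuming `seq1` and `seq2` are instantiated sequence objects; their constructor arguments depend on your `FlexibleDataset` setup and are omitted here):

```
from torch.utils.data import ConcatDataset

# seq1 and seq2 would be instances of your sequence classes,
# e.g. CustomDataset_sequence1(...) and CustomDataset_sequence2(...)
train_dataset = ConcatDataset([seq1, seq2])
```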

Also, when the files are downloaded, they will be renamed to the value provided by `get_downloaded_dataset_file_name()`. Overwrite this function so that the file extension is correct (`.mat` for a MATLAB file, `.bag` for a ROSbag file, etc.), as in the example below.
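
For example, if your raw sequences were (hypothetically) ROSbag files, the override would be:

```
# Inside your CustomDataset subclass; "data" is the base
# file name used by the template.
def get_downloaded_dataset_file_name(self):
    return "data.bag"
```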

### Implementing Data Processing
Now that you can load your dataset files, you need to implement processing. This step should be implemented in `process()`, and should convert the file from whatever format it is currently in into a `.mat` file for fast training speeds. You'll also need to provide code for extracting the number of dataset entries in this sequence, which will be saved into a .txt file for future use.

Implement this function; a sketch is given below, and you can see `quadSDKDataset.py` for an example of converting a ROSbag file into a .mat file.
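
As a minimal sketch, assuming (hypothetically) that the raw file is a `.npz` archive of numpy arrays keyed like the inputs listed above:

```
import numpy as np
import scipy.io as sio
from pathlib import Path

# Inside your CustomDataset subclass
def process(self):
    # Load the raw file downloaded by FlexibleDataset (hypothetical format)
    raw = np.load(Path(self.root, 'raw', 'data.npz'))

    # Convert to a MATLAB-style dictionary and save it where
    # load_data_at_dataset_seq() expects to find it
    mat_data = {key: raw[key] for key in raw.files}
    sio.savemat(Path(self.root, 'processed', 'data.mat'), mat_data)

    # One dataset entry per row of the joint position array (assumption)
    dataset_entries = mat_data['j_p'].shape[0]

    # Save the dataset length and file ID, as in CustomDatasetTemplate.py
    with open(str(Path(self.processed_dir, "info.txt")), "w") as f:
        file_id, loc = self.get_file_id_and_loc()
        f.write(str(dataset_entries) + " " + file_id)
```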

### Implementing Data Loading
Now that data is loaded and processed, you can implement the function for opening the .mat file and extracting the relevant dataset sequence.
This should be done in `load_data_at_dataset_seq()`. The .mat file you saved in the last step will now be available at `self.mat_data` for easy access.
Note that this function will also need to use the `self.history_length` parameter to support training with a history of measurements; a sketch is given below. See `CustomDatasetTemplate.py` for details, and see `LinTzuYaunDataset.py` for a proper implementation of this function.
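
As an illustration of the history window (the `.mat` keys are assumed to match the input names above, and boundary handling at the start of a sequence is omitted):

```
import numpy as np

# Inside your CustomDataset subclass
def load_data_at_dataset_seq(self, seq_num: int):
    # Return a window of self.history_length measurements ending at seq_num
    start = seq_num - self.history_length + 1
    j_p = np.array(self.mat_data['j_p'][start:seq_num + 1])
    j_v = np.array(self.mat_data['j_v'][start:seq_num + 1])
    # ... load the remaining arrays the same way, returning None
    # for any input your dataset doesn't have ...
```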

### Setting the proper URDF file
Since it's easy for the user to provide the wrong URDF file for a dataset sequence, `FlexibleDataset` checks that the URDF file provided by the user matches what the dataset expects. You can tell `FlexibleDataset` which URDF file should be used with this dataset by going to the URDF file and copying the name found at the top of the file, like pictured below:

```
<robot name="miniCheetah">
```

This name should be pasted into `get_expected_urdf_name()`.
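
For the URDF above, the override would simply be:

```
# Inside your CustomDataset subclass
def get_expected_urdf_name(self):
    return "miniCheetah"
```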

### Facilitating Data Sorting
Finally, the last step is to tell `FlexibleDataset` what order your dataset data is in. For example, which index in the joint position array corresponds to a specific joint in the URDF file? To do this, you'll implement `get_urdf_name_to_dataset_array_index()`. See `CustomDatasetTemplate.py` for more details, and the hypothetical example below.
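
For instance, with hypothetical URDF node names (yours will differ), the mapping for one leg might look like:

```
# Inside your CustomDataset subclass; node names are hypothetical
def get_urdf_name_to_dataset_array_index(self) -> dict:
    return {
        'base': 0,              # the single base node
        'FL_hip_joint': 0,      # indices into the joint arrays
        'FL_thigh_joint': 1,
        'FL_knee_joint': 2,
        'FL_foot_fixed': 0,     # index into the foot arrays
    }
```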

After doing this, your dataset will work with our current codebase for training MLP and MI-HGNN models! You can now instantiate your dataset and use it like in the examples in the `research` directory. Happy Training!