diff --git a/.gitmodules b/.gitmodules
index 6df22de..efdfd5d 100644
--- a/.gitmodules
+++ b/.gitmodules
@@ -2,9 +2,6 @@
[submodule "urdf_files/A1/unitree_ros"]
path = urdf_files/A1/unitree_ros
url = https://github.com/unitreerobotics/unitree_ros.git
-[submodule "urdf_files/HyQ/hyq-description"]
- path = urdf_files/HyQ/hyq-description
- url = https://github.com/iit-DLSLab/hyq-description.git
[submodule "urdf_files/Go1/unitree_ros"]
path = urdf_files/Go1/unitree_ros
url = https://github.com/unitreerobotics/unitree_ros.git
diff --git a/README.md b/README.md
index 71f4df1..8c175a0 100644
--- a/README.md
+++ b/README.md
@@ -1,14 +1,14 @@
-# MI-HGNN for contact estimation/classification on various robots
-This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot.
+# MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network
+This repository implements a Morphology-Informed Heterogeneous Graph Neural Network (MI-HGNN) for estimating contact information on the feet of a quadruped robot. For more details, see our publication "[MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception](https://arxiv.org/abs/2409.11146)" and our [project page](https://lunarlab-gatech.github.io/Morphology-Informed-HGNN/).
-Additionally, by providing a compatible URDF file, this software can convert a variety of robot structures to graph format for learning with the MI-HGNN. See [#Applying-MI-HGNN-to-your-own-robot
-](#applying-mi-hgnn-to-your-own-robot) for more information.
+Additionally, it can be applied to a variety of robot structures and datasets, as our software can convert compatible robot URDF files to graph format and provides a template for implementing custom datasets. See [#Applying-MI-HGNN-to-your-own-robot](#applying-mi-hgnn-to-your-own-robot) for more information.

-For information on our method, see our [project page](https://lunarlab-gatech.github.io/Morphology-Informed-HGNN/) and [paper](https://arxiv.org/abs/2409.11146).
+## Setup
+---
-## Installation
+### Installation
To get started, setup a Conda Python environment with Python=3.11:
```
conda create -n mi-hgnn python=3.11
@@ -22,37 +22,62 @@ pip install .
Note, if you have any issues with setup, refer to `environment_files/README.md` so you can install the exact libraries we used.
-## URDF Download
+### URDF Download
The necessary URDF files are part of git submodules in this repository, so run the following commands to download them:
```
git submodule init
git submodule update
```
-## Replicating Paper Experiments
+## Usage
+---
-To replicate the experiments referenced in our paper or access our trained model weights, see `paper/README.md`.
+### Replicating Paper Experiments
-## Applying MI-HGNN to your own robot
+We provide code for replicating the exact experiments in our paper, along with full model weights for every model referenced in it. See `paper/README.md` for more information.
-Although in our paper, we only applied the MI-HGNN on quadruped robots for contact perception, it can also be applied to other multi-body dynamical systems. New URDF files can be added by following the instructions in `urdf_files/README.md`, and our software will automatically convert the URDF into a graph compatible for learning with the MI-HGNN.
+
-## Editing and Contributing
+### Applying to your Robot/Dataset
-Datasets can be found in the `src/mi_hgnn/datasets_py` directory, and model definitions and training code can be found in the `src/mi_hgnn/lightning_py` directory. We encourage you to extend the library for your own applications. Please reference [#Replicating-Paper-Experiments](#replicating-paper-experiments) for examples on how to train and evaluate models with our repository.
+Although our paper only applied the MI-HGNN to quadruped robots for contact perception, it can easily be applied to other multi-body dynamical systems and to other tasks/datasets by following the steps below:
-After making changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't
-broken critical functionality, run the test cases found in the `tests` directory.
+
-If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request.
+1. Add new URDF files for your robots by following the instructions in `urdf_files/README.md`. Our software will automatically convert the URDF into a graph compatible for learning with the MI-HGNN.
+2. Incorporate your custom dataset using our `FlexibleDataset` class and starter `CustomDatasetTemplate.py` file by following the instructions at `src/mi_hgnn/datasets_py/README.md`.
+3. After making your changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't
+broken critical functionality, run the test cases with the command `python -m unittest discover tests/ -v`.
+4. Using the files in the `research` directory as an example, call our `train_model` and `evaluate_model` functions provided in `src/mi_hgnn/lightning_py/gnnLightning.py` with defined train, validation, and test sequences.
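The split in step 4 can be sketched as follows. This is a minimal sketch only: the real `train_model`/`evaluate_model` signatures live in `src/mi_hgnn/lightning_py/gnnLightning.py`, and every sequence name below is a placeholder, not part of the library.

```python
# Hypothetical sketch of defining train/validation/test splits before calling
# train_model/evaluate_model (see the `research` directory for real examples).
# All sequence names here are made up for illustration.
all_sequences = ["alpha", "bravo", "charlie", "delta", "echo"]

# Hold out the last sequence for testing and the one before it for validation.
train_seqs = all_sequences[:-2]   # ['alpha', 'bravo', 'charlie']
val_seqs = all_sequences[-2:-1]   # ['delta']
test_seqs = all_sequences[-1:]    # ['echo']

# Build datasets for each split, then, roughly:
#   trained_model_path = train_model(train_dataset, val_dataset, ...)
#   evaluate_model(trained_model_path, test_dataset)
print(train_seqs, val_seqs, test_seqs)
```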
-## Citation
+We've designed the library to be easily applicable to a variety of datasets and robots, and have provided many customization options for training, dataset creation, and logging. We're excited to see everything you can do with the MI-HGNN!
+
+
+### Simulated A1 Dataset
+
+To evaluate the performance of our model on GRF estimation, we generated our own simulated GRF dataset, which we now contribute to the community. We recorded proprioceptive sensor data and the corresponding ground truth GRFs by operating an A1 robot in the [Quad-SDK](https://github.com/lunarlab-gatech/quad_sdk_fork) simulator. In total, our dataset comprises 530,779 synchronized data samples spanning a variety of frictions, terrains, and speeds. All of the different sequences are outlined in the table below:
+
+
+
+A visualization of the various data collection environments can be seen below.
+
+
+
+If you'd like to use this dataset, the recorded sequences can be found on [Dropbox](https://www.dropbox.com/scl/fo/4iz1oobx71qoceu2jenie/AJPggD4yIAFXf5508wBz-hY?rlkey=4miys9ap0iaozgdelntms8lxb&st=0oz7kgyq&dl=0). See `paper/README.md` and Section V-B of our publication for specific details on this dataset and how to use it.
+
+## Other Info
+---
+### Contributing
+
+We encourage you to extend the library for your own applications. If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request. Reach out to us if you have any questions.
+
+### Citation
If you find our repository or our work useful, please cite the relevant publication:
```
@article{butterfield2024mi,
- title={MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception},
+ title={{MI-HGNN: Morphology-Informed Heterogeneous Graph Neural Network for Legged Robot Contact Perception}},
author={Butterfield, Daniel and Garimella, Sandilya Sai and Cheng, Nai-Jen and Gan, Lu},
journal={arXiv preprint arXiv:2409.11146},
year={2024},
@@ -61,6 +86,6 @@ If you find our repository or our work useful, please cite the relevant publicat
}
```
-## Contact / Issues
+### Contact / Issues
For any issues with this repository, feel free to open an issue on GitHub. For other inquiries, please contact Daniel Butterfield (dbutterfield3@gatech.edu) or the Lunar Lab (https://sites.gatech.edu/lunarlab/).
diff --git a/paper/README.md b/paper/README.md
index d4c77d4..06a4457 100644
--- a/paper/README.md
+++ b/paper/README.md
@@ -99,7 +99,9 @@ Finally, Figure 5 is generated by running the `create_regression_plots.py` file,
### GRF Quad-SDK Dataset
-For this experiment, we used a dataset that we generated ourselves using Quad-SDK and Gazebo. Our modified fork that we used can be found here: [quad-sdk-fork](https://github.com/lunarlab-gatech/quad_sdk_fork). We generated a total of 21 sequences. The following table relates the dataset sequence name (in code) to the corresponding parameters used for that sequence:
+For this experiment, we generated a dataset using Quad-SDK and Gazebo. The dataset consists of synchronized proprioceptive sensor measurements for a simulated A1 robot at a maximum of 500 Hz, including joint angle, joint angular velocity, and joint torque from 12 joint encoders, base linear acceleration and base angular velocity from an IMU, and GRFs for each leg in the Z direction. It also includes the ground truth robot pose, represented as a translation and a quaternion.
+
+We generated a total of 21 sequences. The following table relates the dataset sequence name (in code) to the corresponding parameters used for that sequence:

@@ -107,3 +109,11 @@ In each sequence, the operator loosely followed the high-level control instructi
+The dataset files can be found on [Dropbox](https://www.dropbox.com/scl/fo/4iz1oobx71qoceu2jenie/AJPggD4yIAFXf5508wBz-hY?rlkey=4miys9ap0iaozgdelntms8lxb&st=0oz7kgyq&dl=0), and the code that we used to generate this dataset can be found here: [quad-sdk-fork](https://github.com/lunarlab-gatech/quad_sdk_fork). To find the file corresponding to a sequence in the table above, note that the files are named with the following convention:
+
+```
+robot_1_a1____trial1_.bag
+```
+
+The file at `src/mi_hgnn/datasets_py/quadSDKDataset.py` also contains sequence name to file mappings.
+
diff --git a/paper/website_images/MI-HGNN Potential Applications.png b/paper/website_images/MI-HGNN Potential Applications.png
new file mode 100644
index 0000000..de34ebd
Binary files /dev/null and b/paper/website_images/MI-HGNN Potential Applications.png differ
diff --git a/paper/website_images/figure4.png b/paper/website_images/figure4.png
new file mode 100644
index 0000000..864ae6d
Binary files /dev/null and b/paper/website_images/figure4.png differ
diff --git a/src/mi_hgnn/datasets_py/CustomDatasetTemplate.py b/src/mi_hgnn/datasets_py/CustomDatasetTemplate.py
new file mode 100644
index 0000000..8e5a017
--- /dev/null
+++ b/src/mi_hgnn/datasets_py/CustomDatasetTemplate.py
@@ -0,0 +1,118 @@
+from .flexibleDataset import FlexibleDataset
+import scipy.io as sio
+from pathlib import Path
+import numpy as np
+
+class CustomDataset(FlexibleDataset):
+
+
+ # ========================= DOWNLOADING ==========================
+ def get_downloaded_dataset_file_name(self):
+ """
+ Fill in the proper file extension for your dataset sequence files here!
+ """
+ return "data."
+
+ # ========================= PROCESSING ===========================
+ def process(self):
+ # Build the path to the downloaded file
+ path_to_file = Path(self.root, 'raw', 'data.')
+
+ # TODO: Convert into a MATLAB data dictionary format here!
+ mat_data = None
+
+ # Make sure to save it at this location
+ sio.savemat(Path(self.root, 'processed', 'data.mat'), mat_data)
+
+ # TODO: Get the number of dataset entries in the file
+ dataset_entries = None
+
+ # Write a txt file to save the dataset length and the file id
+ with open(str(Path(self.processed_dir, "info.txt")), "w") as f:
+ file_id, loc = self.get_file_id_and_loc()
+ f.write(str(dataset_entries) + " " + file_id)
+
+ # ============= DATA SORTING ORDER AND MAPPINGS ==================
+ def get_urdf_name_to_dataset_array_index(self) -> dict:
+ """
+ Implement this function to tell `FlexibleDataset` how
+ the data returned by `load_data_at_dataset_seq()` corresponds
+ to the joints in the robot URDF file.
+
+ Traditionally a robot only has one base node, so it should get a value
+ of 0. Next, type the name of each leg joint in the URDF file, and add
+ the index of its value in the corresponding joint arrays returned by
+ load_data_at_dataset_seq(). Do the same for the joints in the URDF
+ representing a fixed foot, with the indices of their values in the foot
+ position and foot velocity arrays.
+ """
+
+ return {
+ '': 0,
+
+ '': 2,
+ '': 0,
+ '': 1,
+
+ '': 1,
+ '': 0,
+ }
+
+ # ===================== DATASET PROPERTIES =======================
+ def get_expected_urdf_name(self):
+ return ""
+
+ # ======================== DATA LOADING ==========================
+ def load_data_at_dataset_seq(self, seq_num: int):
+ """
+ When this function is called, the .mat file data saved in process()
+ is available at self.mat_data.
+
+ For information on the expected format of these variables, see the
+ load_data_at_dataset_seq() function definition in flexibleDataset.py.
+ """
+
+ # TODO: Load the data as numpy arrays, and don't forget to incorporate self.history_length
+ # to load a history of measurements.
+ lin_acc = None
+ ang_vel = None
+ j_p = None
+ j_v = None
+ j_T = None
+ f_p = None
+ f_v = None
+ contact_labels = None
+ r_p = None
+ r_o = None
+ timestamps = None
+ # Note, if you don't have data for a specific return value, just return None,
+ # and `FlexibleDataset` will know not to use it if it is not required.
+
+ return lin_acc, ang_vel, j_p, j_v, j_T, f_p, f_v, contact_labels, r_p, r_o, timestamps
+
+# ================================================================
+# ===================== DATASET SEQUENCES ========================
+# ================================================================
+
+class CustomDataset_sequence1(CustomDataset):
+ """
+ To load a dataset sequence from Google Drive, first upload the corresponding file to Google Drive, set "General Access"
+ to "Anyone with the link", and then copy the link. From the link, extract the string between
+ '/file/d/' and '/view?usp=sharing', and paste it as the first return argument below.
+ """
+ def get_file_id_and_loc(self):
+ return "", "Google"
+
+class CustomDataset_sequence2(CustomDataset):
+ """
+ To load a dataset sequence from Dropbox, first upload the corresponding file to Dropbox and
+ generate a link for viewing. Make sure that access is given to anyone with the link, and that this permission won't
+ expire, doesn't require a password, and allows for downloading. Finally, copy and paste the link as the first return
+ argument below, but change the last number from 0 to 1 (this tells Dropbox to send the raw file, instead of a webpage).
+ """
+ def get_file_id_and_loc(self):
+ return "", "Dropbox"
+
+"""
+Create classes for each of your sequences...
+"""
\ No newline at end of file
diff --git a/src/mi_hgnn/datasets_py/README.md b/src/mi_hgnn/datasets_py/README.md
new file mode 100644
index 0000000..99a4b2f
--- /dev/null
+++ b/src/mi_hgnn/datasets_py/README.md
@@ -0,0 +1,63 @@
+# Implementing Custom Datasets
+
+We hope that many people use our MI-HGNN on a variety of datasets. To support this, we provide the `FlexibleDataset` class, which offers many convenient features and can be inherited for use with custom datasets. Below is a short summary of its features:
+- Automatic download of relevant datasets from the Internet (from Google Drive or Dropbox).
+- Data sorting to match the order of joint, foot, and base nodes in the robot graph.
+- Wrapper for the `RobotGraph` class that generates the graph from the robot URDF file.
+- Easy customization with custom history lengths and a normalization parameter.
+- Custom get() return values for training both an MLP and the MI-HGNN.
+- Option for easy evaluation with a floating-base dynamics model, though our current implementation is specific to the simulated A1 robot in our paper, so changes will be necessary for proper results on your robot.
+
+However, `FlexibleDataset` currently only supports the following input data:
+- lin_acc (np.array) - IMU linear acceleration
+- ang_vel (np.array) - IMU angular velocity
+- j_p (np.array) - Joint positions
+- j_v (np.array) - Joint velocities
+- j_T (np.array) - Joint Torques
+- f_p (np.array) - Foot position
+- f_v (np.array) - Foot velocity
+- labels (np.array) - The dataset labels (either Z-direction GRFs or contact states)
+- r_p (np.array) - Robot position (GT)
+- r_o (np.array) - Robot orientation (GT) as a quaternion, in the order (x, y, z, w)
+- timestamps (np.array) - Array containing the timestamps of the data
+
+Note that not all of these inputs are used; which ones are required depends on the applied model (MLP vs. MI-HGNN vs. floating-base dynamics).
+
+If `FlexibleDataset` supports your input data, you can easily use it by writing a simple dataset class that inherits from `FlexibleDataset`, similar to `LinTzuYaunDataset` or `QuadSDKDataset`. We've provided a template in the `CustomDatasetTemplate.py` file to get you started.
+
+## Using the Custom Dataset Template
+
+This section will explain how to edit the `CustomDatasetTemplate.py` file for use with your own dataset to take advantage of the features of the `FlexibleDataset` class.
+
+First, open the file and rename the class to your liking.
+
+### Adding Dataset Sequences
+Next, scroll down to the bottom of the file where it says `DATASET SEQUENCES`. Add every sequence of your dataset as its own class, which will require you to upload the data either to Dropbox or Google. See `CustomDatasetTemplate.py` for details.
+
+This keeps data loading clean, as it allows the user to later combine different sequences as they'd like with the `torch.utils.data.ConcatDataset` class (see `research/train_classification_sample_eff.py` for an example). Defining these classes also means that training an MI-HGNN model on a different computer doesn't require the user to manually download any datasets, as `FlexibleDataset` will do it for you.
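Conceptually, `torch.utils.data.ConcatDataset` just chains per-sequence datasets end-to-end. A pure-Python sketch of that behavior (not the torch implementation; toy lists stand in for real sequence classes):

```python
from bisect import bisect_right
from itertools import accumulate

class ConcatSketch:
    """Toy stand-in for torch.utils.data.ConcatDataset: one index space over many datasets."""
    def __init__(self, datasets):
        self.datasets = list(datasets)
        # Cumulative lengths map a global index to (dataset, local index).
        self.cumulative = list(accumulate(len(d) for d in self.datasets))

    def __len__(self):
        return self.cumulative[-1]

    def __getitem__(self, idx):
        ds_idx = bisect_right(self.cumulative, idx)
        local = idx if ds_idx == 0 else idx - self.cumulative[ds_idx - 1]
        return self.datasets[ds_idx][local]

# Two toy "sequences" combined into one dataset for training:
combined = ConcatSketch([[10, 11, 12], [20, 21]])
print(len(combined))  # 5
print(combined[3])    # 20 (first entry of the second sequence)
```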
+
+Also, when the files are downloaded, they will be renamed to the value provided by `get_downloaded_dataset_file_name()`. Override this function so that the file extension is correct (`.mat` for a MATLAB file, `.bag` for a ROS bag file, etc.).
+
+### Implementing Data Processing
+Now that you can load your dataset files, you need to implement processing. This step should be implemented in `process()`, and should convert the file from its current format into a `.mat` file for fast training speeds. You'll also need to provide code for extracting the number of dataset entries in this sequence, which will be saved into a .txt file for future use.
+
+See `quadSDKDataset.py` for an example of converting a ROS bag file into a `.mat` file.
+
+### Implementing Data Loading
+Now that your data is downloaded and processed, you can implement the function for opening the .mat file and extracting the relevant dataset sequence.
+This should be done in `load_data_at_dataset_seq()`. The .mat file you saved in the last step will now be available at `self.mat_data` for easy access.
+Note that this function will also need to use the `self.history_length` parameter to support training with a history of measurements. See `CustomDatasetTemplate.py` for details, and see `LinTzuYaunDataset.py` for a proper implementation of this function.
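The history handling can be sketched as below. This assumes (as an illustration only, not the actual `FlexibleDataset` contract) that `seq_num` indexes the newest sample and `self.history_length` consecutive samples ending there should be returned; real implementations slice stacked per-sensor numpy arrays instead of lists.

```python
def window_with_history(data, seq_num, history_length):
    """Return `history_length` consecutive samples ending at index seq_num.
    Minimal sketch of the indexing load_data_at_dataset_seq() must do."""
    start = seq_num - history_length + 1
    if start < 0:
        raise IndexError("not enough history before this sample")
    return data[start:seq_num + 1]

joint_positions = [0.0, 0.1, 0.2, 0.3, 0.4]
print(window_with_history(joint_positions, seq_num=3, history_length=2))  # [0.2, 0.3]
```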
+
+### Setting the proper URDF file
+Since it's easy for the user to provide the wrong URDF file for a dataset sequence, `FlexibleDataset` checks that the URDF file provided by the user matches what the dataset expects. You can tell `FlexibleDataset` which URDF file should be used with this dataset by going to the URDF file and copying the robot name found at the top of the file, as shown below:
+
+```
+
+```
+
+This name should be pasted into `get_expected_urdf_name()`.
+
+### Facilitating Data Sorting
+Finally, the last step is to tell `FlexibleDataset` what order your dataset data is in. For example, which index in the joint position array corresponds to a specific joint in the URDF file? To do this, you'll implement `get_urdf_name_to_dataset_array_index()`. See `CustomDatasetTemplate.py` for more details.
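The reordering this mapping enables can be sketched as follows. The joint names and indices here are made up for illustration; they are not the real mapping from any supported robot.

```python
def sort_to_urdf_order(joint_array, urdf_joint_names, urdf_name_to_index):
    """Reorder a dataset's joint array so entry i corresponds to urdf_joint_names[i].
    Sketch of what FlexibleDataset does with get_urdf_name_to_dataset_array_index()."""
    return [joint_array[urdf_name_to_index[name]] for name in urdf_joint_names]

# Suppose the dataset stores joints as [knee, hip, ankle], but the URDF
# lists them as hip, knee, ankle (hypothetical names):
mapping = {"hip_joint": 1, "knee_joint": 0, "ankle_joint": 2}
sorted_joints = sort_to_urdf_order(
    [0.5, 0.1, 0.9], ["hip_joint", "knee_joint", "ankle_joint"], mapping)
print(sorted_joints)  # [0.1, 0.5, 0.9]
```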
+
+After doing this, your dataset will work with our current codebase for training MLP and MI-HGNN models! You can now instantiate your dataset and use it like in the examples in the `research` directory. Happy Training!
diff --git a/src/mi_hgnn/datasets_py/quadSDKDataset.py b/src/mi_hgnn/datasets_py/quadSDKDataset.py
index 4bd5f81..b017906 100644
--- a/src/mi_hgnn/datasets_py/quadSDKDataset.py
+++ b/src/mi_hgnn/datasets_py/quadSDKDataset.py
@@ -451,15 +451,15 @@ def load_data_at_dataset_seq(self, seq_num: int):
# A1_DEPRECATED
class QuadSDKDataset_A1Speed0_5_DEPRECATED(QuadSDKDataset_A1_DEPRECATED):
def get_file_id_and_loc(self):
- return "17tvm0bmipTpueehUNQ-hJ8w5arc79q0M", "Google"
+ return "https://www.dropbox.com/scl/fi/q8t67zc6yeyb78nqngqu2/QuadSDK-A1Speed0.5-OnlyForTestCases.bag?rlkey=38ihtx8mxkvh4cspxsm8xrudj&st=ftw2k75k&dl=1", "Dropbox"
class QuadSDKDataset_A1Speed1_0_DEPRECATED(QuadSDKDataset_A1_DEPRECATED):
def get_file_id_and_loc(self):
- return "1qSdm8Rm6UazwhzCV5DfMHF0AoyKNrthf", "Google"
+ return "https://www.dropbox.com/scl/fi/8o8mkw5079yers8i0rhhs/QuadSDK-A1Speed1.0-OnlyForTestCases.bag?rlkey=mkigp3t3253py9ih5tskl3w8c&st=m7o3n6yg&dl=1", "Dropbox"
class QuadSDKDataset_A1Speed1_5FlippedOver_DEPRECATED(QuadSDKDataset_A1_DEPRECATED):
def get_file_id_and_loc(self):
- return "1h5CN-IIJlLnMvWp0sk5Ho-hiJq2NMqCT", "Google"
+ return "https://www.dropbox.com/scl/fi/irxyiafwpn4rflrtfrb1g/QuadSDK-A1Speed1.5FlippedOver-OnlyForTestCases.bag?rlkey=0ldqimfn2jogwax6tdkq6vvqu&st=xrni7lp8&dl=1", "Dropbox"
# A1
class QuadSDKDataset_A1_Alpha(QuadSDKDataset_A1):
diff --git a/src/mi_hgnn/graphParser.py b/src/mi_hgnn/graphParser.py
index 61a5f59..3ee90af 100644
--- a/src/mi_hgnn/graphParser.py
+++ b/src/mi_hgnn/graphParser.py
@@ -70,8 +70,7 @@ def __init__(self,
Constructor for RobotGraph class.
Args:
- urdf_path (Path): The absolute path from this file (graphParser.py)
- to the desired urdf file to load.
+ urdf_path (Path): The absolute path to the desired urdf file to load.
ros_builtin_path (str): The path ROS uses in the urdf file to navigate
to the urdf description directory. An example looks like this:
"package://a1_description/". You can find this by manually looking
diff --git a/tests/testGraphParser.py b/tests/testGraphParser.py
index 99c8558..2982857 100644
--- a/tests/testGraphParser.py
+++ b/tests/testGraphParser.py
@@ -13,12 +13,11 @@
class TestNormalRobotGraph(unittest.TestCase):
def setUp(self):
- self.hyq_path = Path(
- Path(__file__).cwd(), 'urdf_files', 'HyQ', 'hyq.urdf').absolute()
+ self.mini_cheetah_path = Path(
+ Path(__file__).cwd(), 'urdf_files', 'MiniCheetah', 'miniCheetah.urdf').absolute()
- self.HyQ_URDF = NormalRobotGraph(self.hyq_path,
- 'package://hyq_description/',
- 'hyq-description')
+ self.mini_cheetah_URDF = NormalRobotGraph(self.mini_cheetah_path,
+ 'package://yobotics_description/', 'mini-cheetah-gazebo-urdf/yobo_model/yobotics_description')
def test_constructor(self):
"""
@@ -26,15 +25,15 @@ def test_constructor(self):
"""
joint_names = [
- 'floating_base', 'lf_haa_joint', 'lf_hfe_joint', 'lf_kfe_joint',
- 'lf_foot_joint', 'rf_haa_joint', 'rf_hfe_joint', 'rf_kfe_joint',
- 'rf_foot_joint', 'lh_haa_joint', 'lh_hfe_joint', 'lh_kfe_joint',
- 'lh_foot_joint', 'rh_haa_joint', 'rh_hfe_joint', 'rh_kfe_joint',
- 'rh_foot_joint'
+ 'floating_base',
+ 'FL_hip_joint', 'FL_thigh_joint', 'FL_calf_joint', 'FL_foot_fixed',
+ 'FR_hip_joint', 'FR_thigh_joint', 'FR_calf_joint', 'FR_foot_fixed',
+ 'RL_hip_joint', 'RL_thigh_joint', 'RL_calf_joint', 'RL_foot_fixed',
+ 'RR_hip_joint', 'RR_thigh_joint', 'RR_calf_joint', 'RR_foot_fixed',
]
edge_names_copy = copy.deepcopy(joint_names)
- for i, node in enumerate(self.HyQ_URDF.nodes):
+ for i, node in enumerate(self.mini_cheetah_URDF.nodes):
self.assertTrue(node.name in edge_names_copy)
edge_names_copy.remove(node.name)
self.assertEqual(0, len(edge_names_copy))
@@ -44,28 +43,28 @@ def test_constructor(self):
# Additionally, links with multiple children joints get one
# edge for each child.
desired_edges = [
- RobotGraph.Edge('trunk_to_lf_haa_joint', "floating_base",
- "lf_haa_joint", None),
- RobotGraph.Edge('trunk_to_lh_haa_joint', "floating_base",
- "lh_haa_joint", None),
- RobotGraph.Edge('trunk_to_rf_haa_joint', "floating_base",
- "rf_haa_joint", None),
- RobotGraph.Edge('trunk_to_rh_haa_joint', "floating_base",
- "rh_haa_joint", None),
- RobotGraph.Edge('lf_hipassembly', "lf_haa_joint", "lf_hfe_joint", None),
- RobotGraph.Edge('lf_upperleg', "lf_hfe_joint", "lf_kfe_joint", None),
- RobotGraph.Edge('lf_lowerleg', "lf_kfe_joint", "lf_foot_joint", None),
- RobotGraph.Edge('rf_hipassembly', "rf_haa_joint", "rf_hfe_joint", None),
- RobotGraph.Edge('rf_upperleg', "rf_hfe_joint", "rf_kfe_joint", None),
- RobotGraph.Edge('rf_lowerleg', "rf_kfe_joint", "rf_foot_joint", None),
- RobotGraph.Edge('lh_hipassembly', "lh_haa_joint", "lh_hfe_joint", None),
- RobotGraph.Edge('lh_upperleg', "lh_hfe_joint", "lh_kfe_joint", None),
- RobotGraph.Edge('lh_lowerleg', "lh_kfe_joint", "lh_foot_joint", None),
- RobotGraph.Edge('rh_hipassembly', "rh_haa_joint", "rh_hfe_joint", None),
- RobotGraph.Edge('rh_upperleg', "rh_hfe_joint", "rh_kfe_joint", None),
- RobotGraph.Edge('rh_lowerleg', "rh_kfe_joint", "rh_foot_joint", None)
+ RobotGraph.Edge('trunk_to_FL_hip_joint', "floating_base",
+ "FL_hip_joint", None),
+ RobotGraph.Edge('trunk_to_FR_hip_joint', "floating_base",
+ "FR_hip_joint", None),
+ RobotGraph.Edge('trunk_to_RL_hip_joint', "floating_base",
+ "RL_hip_joint", None),
+ RobotGraph.Edge('trunk_to_RR_hip_joint', "floating_base",
+ "RR_hip_joint", None),
+ RobotGraph.Edge('FL_hip', "FL_hip_joint", "FL_thigh_joint", None),
+ RobotGraph.Edge('FL_thigh', "FL_thigh_joint", "FL_calf_joint", None),
+ RobotGraph.Edge('FL_calf', "FL_calf_joint", "FL_foot_fixed", None),
+ RobotGraph.Edge('FR_hip', "FR_hip_joint", "FR_thigh_joint", None),
+ RobotGraph.Edge('FR_thigh', "FR_thigh_joint", "FR_calf_joint", None),
+ RobotGraph.Edge('FR_calf', "FR_calf_joint", "FR_foot_fixed", None),
+ RobotGraph.Edge('RL_hip', "RL_hip_joint", "RL_thigh_joint", None),
+ RobotGraph.Edge('RL_thigh', "RL_thigh_joint", "RL_calf_joint", None),
+ RobotGraph.Edge('RL_calf', "RL_calf_joint", "RL_foot_fixed", None),
+ RobotGraph.Edge('RR_hip', "RR_hip_joint", "RR_thigh_joint", None),
+ RobotGraph.Edge('RR_thigh', "RR_thigh_joint", "RR_calf_joint", None),
+ RobotGraph.Edge('RR_calf', "RR_calf_joint", "RR_foot_fixed", None)
]
- for i, edge in enumerate(self.HyQ_URDF.edges):
+ for i, edge in enumerate(self.mini_cheetah_URDF.edges):
match_found = False
for j, desired_edge in enumerate(desired_edges):
if edge.name == desired_edge.name:
@@ -89,7 +88,7 @@ def test_constructor(self):
]
edge_names_copy = copy.deepcopy(joint_names)
num_matches = 0
- for i, node in enumerate(self.HyQ_URDF.nodes):
+ for i, node in enumerate(self.mini_cheetah_URDF.nodes):
for j, node_des in enumerate(edge_names_copy):
if (node.name == node_des):
self.assertEqual(node.get_node_type(), des_node_type[j])
@@ -103,15 +102,15 @@ def test_constructor(self):
# is stored for one of the Nodes.
# ==================
node_found = False
- for i, node in enumerate(self.HyQ_URDF.nodes):
- if node.name == "rh_kfe_joint":
+ for i, node in enumerate(self.mini_cheetah_URDF.nodes):
+ if node.name == "RL_hip_joint":
# Test name information
joint: urchin.Joint= node.joint
- self.assertEqual("rh_kfe_joint", joint.name)
+ self.assertEqual("RL_hip_joint", joint.name)
# Test joint information
- np.testing.assert_array_equal(np.array([[1.0, 0.0, 0.0, 0.35], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]), joint.origin)
- self.assertEqual(0.1, joint.dynamics.damping)
+ np.testing.assert_array_equal(np.array([[1.0, 0.0, 0.0, -0.196], [0.0, 1.0, 0.0, 0.049664], [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]), joint.origin)
+ self.assertEqual(0.0, joint.dynamics.damping)
self.assertEqual(0.0, joint.dynamics.friction)
node_found = True
@@ -120,16 +119,16 @@ def test_constructor(self):
self.assertTrue(node_found)
edge_found = False
- for i, edge in enumerate(self.HyQ_URDF.edges):
- if edge.name == "lh_hipassembly":
+ for i, edge in enumerate(self.mini_cheetah_URDF.edges):
+ if edge.name == "FL_calf":
# Test name information
link: urchin.Link = edge.link
- self.assertEqual("lh_hipassembly", link.name)
+ self.assertEqual("FL_calf", link.name)
# Test inertial information
- np.testing.assert_array_equal(np.array([[1.0, 0.0, 0.0, 0.04263], [0.0, 1.0, 0.0, -0.0], [0.0, 0.0, 1.0, -0.16931], [0.0, 0.0, 0.0, 1.0]]), link.inertial.origin)
- self.assertEqual(2.93, link.inertial.mass)
- np.testing.assert_array_equal(np.array([[0.05071, 4e-05, 0.00159], [4e-05, 0.05486, -5e-05], [0.00159, -5e-05, 0.00571]]), link.inertial.inertia)
+ np.testing.assert_array_equal(np.array([[1.0, 0.0, 0.0, 0.0], [0.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]]), link.inertial.origin)
+ self.assertEqual(0.064, link.inertial.mass)
+ np.testing.assert_array_equal(np.array([[0.000214698666667, 0.0, 0.0], [0.0, 0.000214698666667, 0.0], [0.0, 0.0, 2.73066666667e-06]]), link.inertial.inertia)
edge_found = True
break
@@ -141,21 +140,21 @@ def test_get_connections_to_link(self):
Check that we can properly find the connections to the links in the library.
"""
- edge_parent, edge_children = self.HyQ_URDF.get_connections_to_link(
+ edge_parent, edge_children = self.mini_cheetah_URDF.get_connections_to_link(
urchin.Link("base_link", None, None, None))
self.assertEqual(edge_parent, None)
self.assertSequenceEqual(edge_children, ["floating_base"])
- edge_parent, edge_children = self.HyQ_URDF.get_connections_to_link(
+ edge_parent, edge_children = self.mini_cheetah_URDF.get_connections_to_link(
urchin.Link("trunk", None, None, None))
self.assertEqual(edge_parent, "floating_base")
self.assertSequenceEqual(
edge_children,
- ["lf_haa_joint", "rf_haa_joint", "lh_haa_joint", "rh_haa_joint"])
+ ["RL_hip_joint", "FL_hip_joint", "RR_hip_joint", "FR_hip_joint"])
- edge_parent, edge_children = self.HyQ_URDF.get_connections_to_link(
- urchin.Link("lf_foot", None, None, None))
- self.assertEqual(edge_parent, "lf_foot_joint")
+ edge_parent, edge_children = self.mini_cheetah_URDF.get_connections_to_link(
+ urchin.Link("FL_foot", None, None, None))
+ self.assertEqual(edge_parent, "FL_foot_fixed")
self.assertSequenceEqual(edge_children, [])
def test_create_updated_urdf_file(self):
@@ -165,14 +164,14 @@ def test_create_updated_urdf_file(self):
"""
# Delete the urdf file
- hyq_path_updated = self.hyq_path.parent / "hyq_updated.urdf"
- os.remove(str(hyq_path_updated))
- self.assertFalse(os.path.exists(hyq_path_updated))
+ mini_cheetah_path_updated = self.mini_cheetah_path.parent / "miniCheetah_updated.urdf"
+ os.remove(str(mini_cheetah_path_updated))
+ self.assertFalse(os.path.exists(mini_cheetah_path_updated))
# Rebuild it
- RobotGraph(self.hyq_path, 'package://hyq_description/',
- 'hyq-description')
- self.assertTrue(os.path.exists(hyq_path_updated))
+ RobotGraph(self.mini_cheetah_path, 'package://yobotics_description/',
+ 'mini-cheetah-gazebo-urdf/yobo_model/yobotics_description')
+ self.assertTrue(os.path.exists(mini_cheetah_path_updated))
def test_get_node_name_to_index_dict(self):
"""
@@ -180,11 +179,11 @@ def test_get_node_name_to_index_dict(self):
are unique.
"""
- key = list(self.HyQ_URDF.get_node_name_to_index_dict())
+ key = list(self.mini_cheetah_URDF.get_node_name_to_index_dict())
get_nodes_index = []
for key in key:
- index = self.HyQ_URDF.get_node_name_to_index_dict()[key]
+ index = self.mini_cheetah_URDF.get_node_name_to_index_dict()[key]
get_nodes_index.append(index)
self.assertTrue(pd.Index(get_nodes_index).is_unique)
@@ -195,12 +194,12 @@ def test_get_node_index_to_name_dict(self):
index_to_name dict and the name_to_index dict are consistent.
"""
- index_to_name = list(self.HyQ_URDF.get_node_index_to_name_dict())
- name_to_index = list(self.HyQ_URDF.get_node_name_to_index_dict())
+ index_to_name = list(self.mini_cheetah_URDF.get_node_index_to_name_dict())
+ name_to_index = list(self.mini_cheetah_URDF.get_node_name_to_index_dict())
get_nodes_index = []
for key in name_to_index:
- index = self.HyQ_URDF.get_node_name_to_index_dict()[key]
+ index = self.mini_cheetah_URDF.get_node_name_to_index_dict()[key]
get_nodes_index.append(index)
self.assertEqual(index_to_name, get_nodes_index)
@@ -210,7 +209,7 @@ def test_get_edge_index_matrix(self):
Check the dimensionality of the edge matrix.
"""
- edge_matrix = self.HyQ_URDF.get_edge_index_matrix()
+ edge_matrix = self.mini_cheetah_URDF.get_edge_index_matrix()
self.assertEqual(edge_matrix.shape[0], 2)
self.assertEqual(edge_matrix.shape[1], 32)
@@ -220,7 +219,7 @@ def test_get_num_nodes(self):
Check that the number of nodes are correct.
"""
- self.assertEqual(self.HyQ_URDF.get_num_nodes(), 17)
+ self.assertEqual(self.mini_cheetah_URDF.get_num_nodes(), 17)
def test_get_edge_connections_to_name_dict(self):
"""
@@ -230,13 +229,13 @@ def test_get_edge_connections_to_name_dict(self):
"""
connections_to_name = list(
- self.HyQ_URDF.get_edge_connections_to_name_dict())
+ self.mini_cheetah_URDF.get_edge_connections_to_name_dict())
name_to_connections = list(
- self.HyQ_URDF.get_edge_name_to_connections_dict())
+ self.mini_cheetah_URDF.get_edge_name_to_connections_dict())
result = []
for key in name_to_connections:
- connections = self.HyQ_URDF.get_edge_name_to_connections_dict(
+ connections = self.mini_cheetah_URDF.get_edge_name_to_connections_dict(
)[key]
for i in range(connections.shape[1]):
real_reshaped = np.squeeze(connections[:, i].reshape(1, -1))
@@ -252,12 +251,12 @@ def test_get_edge_name_to_connections_dict(self):
"""
name_to_connections = list(
- self.HyQ_URDF.get_edge_name_to_connections_dict())
+ self.mini_cheetah_URDF.get_edge_name_to_connections_dict())
all_connections = []
# Get all connections from dictionary
for key in name_to_connections:
- connections = self.HyQ_URDF.get_edge_name_to_connections_dict(
+ connections = self.mini_cheetah_URDF.get_edge_name_to_connections_dict(
)[key]
for i in range(connections.shape[1]):
real_reshaped = np.squeeze(connections[:, i].reshape(1, -1))
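The dictionary tests in the hunk above all exercise one invariant: node indices must be unique, and the name-to-index and index-to-name dicts must be mutual inverses. A minimal standalone sketch of that invariant, using plain dicts in place of the `RobotGraph` accessors (the node names below are illustrative, not taken from the actual graph):

```python
def check_graph_dicts(name_to_index, index_to_name):
    """Return True iff indices are unique and the two dicts are inverses."""
    indices = list(name_to_index.values())
    # Mirrors the pd.Index(...).is_unique assertion in the tests.
    if len(indices) != len(set(indices)):
        return False
    # Round-tripping through both dicts must recover the original name.
    return all(index_to_name.get(i) == n for n, i in name_to_index.items())

name_to_index = {"base": 0, "FL_hip": 1, "FL_thigh": 2, "FL_calf": 3, "FL_foot": 4}
index_to_name = {i: n for n, i in name_to_index.items()}
```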
diff --git a/urdf_files/HyQ/hyq-description b/urdf_files/HyQ/hyq-description
deleted file mode 160000
index f95d651..0000000
--- a/urdf_files/HyQ/hyq-description
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit f95d651be787ea5369b7fa354f3e896eae741067
diff --git a/urdf_files/HyQ/hyq.pdf b/urdf_files/HyQ/hyq.pdf
deleted file mode 100644
index 07da419..0000000
Binary files a/urdf_files/HyQ/hyq.pdf and /dev/null differ
diff --git a/urdf_files/HyQ/hyq.urdf b/urdf_files/HyQ/hyq.urdf
deleted file mode 100644
index 95feec8..0000000
--- a/urdf_files/HyQ/hyq.urdf
+++ /dev/null
@@ -1,893 +0,0 @@
-[893 lines of the deleted HyQ URDF elided: the XML markup was stripped during extraction, leaving only element values (Gazebo ros_control and ground-truth odometry plugins, per-leg SimpleTransmission/EffortJointInterface transmissions, and shin/foot contact bumpers for the lf, rf, lh, and rh legs).]
diff --git a/urdf_files/README.md b/urdf_files/README.md
index e79a223..3a2ef08 100644
--- a/urdf_files/README.md
+++ b/urdf_files/README.md
@@ -1,25 +1,28 @@
# URDF files
-Each robot URDF file with its corresponding repository can be found in a folder with its name.
+Each robot URDF file, with its corresponding repository, can be found in a folder with its name. As robots can have multiple URDF files (for example, with different naming conventions, frame definitions, etc.), double-check any URDF in this directory before you use it to make sure it has the data you expect. For example, "A1" refers to the URDF file from [unitreerobotics](https://github.com/unitreerobotics/unitree_ros/tree/master/robots/a1_description), whereas "A1-Quad" refers to the URDF file from our fork of [quad-sdk](https://github.com/lunarlab-gatech/quad_sdk_fork/tree/a1).
Note that the ```urdfParser.py``` file will take the ```*.urdf``` file and generate a ```*_updated.urdf``` file in the same directory, which contains updated paths based on your current system. For this reason, these files are not committed to the GitHub repo, as they should vary per device.
## Adding a new URDF file
-Before you add a new URDF file, you need to make sure that it has all of it's non-kinematic nodes pruned. If it doesn't you'll have to generate a new URDF file without them. See [Generating new URDF files](#generating-new-urdf-files) for more information.
+Before you add a new URDF file, you need to make sure that it has all of its non-kinematic nodes pruned. If it doesn't, you'll have to generate a new URDF file without them. See [Generating new URDF files](#generating-new-urdf-files) for more information.
-Once you have a new URDF file, make sure to create a new folder for it and add the URDF. Add the repository that the URDF depends on to the folder as a git submodule. Finally, if you want to, add the PDF following the instructions in [Generating PDF](#generating-pdf)
+Once you have a new URDF file, make sure to create a new folder for it and add the URDF. Add the repository that the URDF depends on to the folder as a git submodule. Finally, if you desire, you can add the PDF for the URDF following the instructions in [Generating PDF](#generating-pdf).
### Generating new URDF files
First, make sure to install ROS and the corresponding repository that comes along with the URDF file (which I'll call the URDF repository). Build the repository using ROS, and make sure it builds without errors. You may need to install more dependencies, depending on what the URDF repository says.
-Next, find the xacro file used to create the URDF. This will most likely be found in the URDF repository. Manually comment out any non-kinematic structures. Then run the following command to generate the new URDF file:
+Next, find the xacro file used to create the URDF. This will most likely be found in the URDF repository. Manually comment out any non-kinematic structures. Why is this necessary? Our MI-HGNN relies upon the input graph being morphology-informed, which we define as graph nodes representing kinematic joints and graph edges representing kinematic links. Thus, it assumes that the input graph is composed of all of the robot's kinematic joints, and that no other fixed structures are included (like cameras or IMUs). Failing to remove any non-kinematic structures will cause unexpected effects, likely decreasing model performance. An easy way to make sure you have done this properly is by [generating a PDF](#generating-pdf) of the URDF file after you create it.
+
+Run the following command to generate the new URDF file:
```
rosrun xacro xacro <input>.xacro > <output>.urdf
```
+
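The pruning step described above can be sanity-checked programmatically. The helper below is hypothetical (not part of this repo) and only flags fixed joints whose child-link names contain common sensor keywords; the keyword list is an assumption for illustration, not exhaustive:

```python
import xml.etree.ElementTree as ET

# Keywords that commonly mark non-kinematic sensor-mount links (assumed list).
SENSOR_HINTS = ("camera", "imu", "lidar", "depth")

def find_non_kinematic_joints(urdf_text):
    """Return names of fixed joints whose child link looks like a sensor mount."""
    root = ET.fromstring(urdf_text)
    suspects = []
    for joint in root.iter("joint"):
        child = joint.find("child")
        if joint.get("type") == "fixed" and child is not None:
            link_name = child.get("link", "").lower()
            if any(hint in link_name for hint in SENSOR_HINTS):
                suspects.append(joint.get("name"))
    return suspects

# Minimal URDF fragment: one legitimate fixed foot joint, one camera mount.
urdf = """<robot name="demo">
  <joint name="FL_foot_fixed" type="fixed"><child link="FL_foot"/></joint>
  <joint name="camera_joint" type="fixed"><child link="camera_link"/></joint>
</robot>"""
```

Anything this check flags (here, only `camera_joint`) is a candidate to comment out of the xacro file before regenerating the URDF.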
### Generating PDF
To generate the PDF file that corresponds to the URDF file, install ROS 1, and run the following command: