**README.md**
### Replicating Paper Experiments
We provide code for replicating the exact experiments in our paper, along with full model weights for every model referenced. See `paper/README.md` for more information.
<img src="paper/website_images/figure5.png" alt="Parameter sizes and Ablation study" width="600">
### Applying to your Robot/Dataset
Although our paper's scope was limited to applying the MI-HGNN to quadruped robots for contact perception, it can easily be applied to other multi-body dynamical systems and other tasks/datasets by following the steps below:
1. Add new URDF files for your robots by following the instructions in `urdf_files/README.md`.
2. Incorporate your custom dataset using our `FlexibleDataset` class and starter `CustomDatasetTemplate.py` file by following the instructions at `src/mi_hgnn/datasets_py/README.md`.
3. After making your changes, rebuild the library following the instructions in [#Installation](#installation). To make sure that your changes haven't broken critical functionality, run the test cases with the command `python -m unittest discover tests/ -v`.
4. Using the files in the `research` directory as an example, call our `train_model` and `evaluate_model` functions provided in `src/mi_hgnn/lightning_py/gnnLightning.py` with defined train, validation, and test sequences.
We've designed the library to be easily applicable to a variety of datasets and robots, and have provided a variety of customization options in training, dataset creation, and logging. We're excited to see everything you can do with the MI-HGNN!
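As a rough sketch of steps 2–4 (the sequence class names and exact function signatures here are assumptions — see the scripts in the `research` directory for real usage):

```
# Sketch only: dataset sequence classes and signatures are assumptions.
from mi_hgnn.lightning_py.gnnLightning import train_model, evaluate_model

train_seq = MyDatasetSequence1(...)   # hypothetical dataset sequence classes
val_seq   = MyDatasetSequence2(...)
test_seq  = MyDatasetSequence3(...)

checkpoint_path = train_model(train_seq, val_seq, test_seq)
evaluate_model(checkpoint_path, test_seq)
```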
### Simulated A1 Dataset
---
### Contributing
We encourage you to extend the library for your own applications. If you'd like to contribute to the repository, write sufficient and necessary test cases for your additions in the `tests` directory, and then open a pull request. Reach out to us if you have any questions.

**src/mi_hgnn/datasets_py/README.md**

We hope that many people use our MI-HGNN on a variety of datasets. We provide the `FlexibleDataset` class, which offers many convenient features and can be inherited for use with custom datasets. Below is a short summary of its features:
- Automatic download of relevant datasets from the Internet (from Google Drive or Dropbox).
- Data sorting to match the order of joint, foot, and base nodes in the robot graph.
- A wrapper for the `RobotGraph` class that generates the graph from the robot URDF file.
- Easy customization with custom history lengths and a normalization parameter.
- Custom `get()` return values for training both an MLP and the MI-HGNN.
- An option for easy evaluation against a floating-base dynamics model (our current implementation is specific to the simulated A1 robot in our paper, so changes will be necessary for proper results on your robot).

However, `FlexibleDataset` currently only supports the following input data:
- `lin_acc` (np.array) - IMU linear acceleration
- `ang_vel` (np.array) - IMU angular velocity
- `j_p` (np.array) - Joint positions
- `j_v` (np.array) - Joint velocities
- `j_T` (np.array) - Joint torques
- `f_p` (np.array) - Foot position
- `f_v` (np.array) - Foot velocity
- `labels` (np.array) - The dataset labels (either Z-direction GRFs or contact states)
- `r_p` (np.array) - Robot position (ground truth)
- `r_o` (np.array) - Robot orientation (ground truth) as a quaternion, in the order (x, y, z, w)
- `timestamps` (np.array) - Array containing the timestamps of the data

Note that not all of these are used; which ones depends on the applied model (MLP vs. MI-HGNN vs. floating-base dynamics).
If `FlexibleDataset` supports your input data, you can use it by writing a simple dataset class that inherits from `FlexibleDataset`, similar to `LinTzuYaunDataset` or `QuadSDKDataset`. We've provided a template in the `CustomDatasetTemplate.py` file to get you started.
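As a rough outline, a custom dataset class overrides the methods described in the sections below (sketch only — see `CustomDatasetTemplate.py` for the real template):

```
class MyRobotDataset(FlexibleDataset):
    def get_downloaded_dataset_file_name(self):
        return "data.mat"                  # extension must match your raw file

    def process(self):
        ...                                # convert the raw file into a .mat file

    def load_data_at_dataset_seq(self, seq_index):
        ...                                # read from self.mat_data with self.history_length

    def get_expected_urdf_name(self):
        return "myRobot"                   # must match <robot name="..."> in the URDF

    def get_urdf_name_to_dataset_array_index(self):
        ...                                # URDF joint name -> dataset array index
```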
## Using the Custom Dataset Template
This section will explain how to edit the `CustomDatasetTemplate.py` file for use with your own dataset to take advantage of the features of the `FlexibleDataset` class.
First, open the file and rename the class to your liking.
### Adding Dataset Sequences
Next, scroll down to the bottom of the file where it says `DATASET SEQUENCES`. Add every sequence of your dataset as its own class, which will require you to upload the data to either Dropbox or Google Drive. See `CustomDatasetTemplate.py` for details.
This is a clean way to handle data loading, as it allows the user to later combine different sequences as they'd like with the `torch.utils.data.ConcatDataset` class (see `research/train_classification_sample_eff.py` for an example). Defining these classes also means that training an MI-HGNN model on a different computer doesn't require the user to manually download any datasets, as `FlexibleDataset` will do it for you.
Also, when the files are downloaded, they will be renamed to the value provided by `get_downloaded_dataset_file_name()`. Override this function so that the file extension is correct: `.mat` for a MATLAB file, `.bag` for a rosbag file, etc.
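Combining sequences for training might look like the following sketch (the sequence class names are hypothetical):

```
from torch.utils.data import ConcatDataset

# Concatenate whichever recorded sequences you want to train on.
train_dataset = ConcatDataset([GrassSequence(...), AsphaltSequence(...)])
```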
### Implementing Data Processing
Now that you can load your dataset files, you need to implement processing. This step should be implemented in `process()`, and should convert the file from whatever format it is currently in into a `.mat` file for fast training speeds. You'll also need to provide code for extracting the number of dataset entries in this sequence, which will be saved into a .txt file for future use.
Implement this function. You can see `quadSDKDataset.py` for an example of converting a ROSbag file into a .mat file.
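A sketch of `process()` (the helper and exact file paths are assumptions — see `quadSDKDataset.py` for a real implementation):

```
def process(self):
    # Sketch only: convert the raw file into a .mat file, and save the
    # number of dataset entries into a .txt file for future use.
    data = read_raw_sequence(...)       # hypothetical helper for your raw format
    scipy.io.savemat(..., data)
    with open(..., "w") as f:
        f.write(str(num_entries))
```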
### Loading data for use with FlexibleDataset
Now that data is loaded and processed, you can implement the function for opening the .mat file and extracting the relevant dataset sequence.
This should be done in `load_data_at_dataset_seq()`. The .mat file you saved in the last step will now be available at `self.mat_data` for easy access.
Note that this function will also need to use the `self.history_length` parameter to support training with a history of measurements. See `CustomDatasetTemplate.py` for details, and see `LinTzuYaunDataset.py` for a proper implementation of this function.
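The history handling can be sketched as a pure-Python toy (the exact windowing convention used by `FlexibleDataset` is an assumption — check `LinTzuYaunDataset.py` for the real one):

```python
def load_window(rows, seq_index, history_length):
    # Return `history_length` consecutive measurements starting at
    # `seq_index`, forming the history for one dataset entry.
    return rows[seq_index : seq_index + history_length]

j_p = [[0.0, 0.1], [0.2, 0.3], [0.4, 0.5], [0.6, 0.7]]  # toy joint positions
window = load_window(j_p, 1, 2)  # rows 1 and 2
```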
### Setting the proper URDF file
Since it's easy for the user to provide the wrong URDF file for a dataset sequence, `FlexibleDataset` checks that the URDF file provided by the user matches what the dataset expects. You can tell `FlexibleDataset` which URDF file should be used with this dataset by copying the robot name found at the top of the URDF file, like pictured below:
```
<robot name="miniCheetah">
```
This name should be pasted into `get_expected_urdf_name()`.
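For the example above, the override simply returns that name (shown here as a standalone function; in the template it is a method):

```python
def get_expected_urdf_name():
    # Must match the <robot name="..."> attribute at the top of the URDF.
    return "miniCheetah"
```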
### Facilitating Data Sorting
Finally, tell `FlexibleDataset` what order your dataset arrays are in. For example, which index in the joint position array corresponds to a specific joint in the URDF file? To do this, you'll implement `get_urdf_name_to_dataset_array_index()`.
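A toy override (the joint names and indices here are hypothetical — use your own URDF's joint names and your dataset's actual array layout):

```python
def get_urdf_name_to_dataset_array_index():
    # Maps each URDF joint name to the column of the dataset arrays
    # that holds that joint's data (hypothetical names and indices).
    return {
        "FL_hip_joint": 0,
        "FL_thigh_joint": 1,
        "FL_calf_joint": 2,
        "FR_hip_joint": 3,
    }
```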
After doing this, your dataset will work with our current codebase for training MLP and MI-HGNN models! You can now instantiate your dataset and use it in a similar manner to the datasets in the `research` directory. Happy Training!

**urdf_files/README.md**
## Adding a new URDF file
Before you add a new URDF file, you need to make sure that it has all of its non-kinematic nodes pruned. If it doesn't, you'll have to generate a new URDF file without them. See [Generating new URDF files](#generating-new-urdf-files) for more information.
Once you have a new URDF file, make sure to create a new folder for it and add the URDF. Add the repository that the URDF depends on to the folder as a git submodule. Finally, if you desire, you can add the PDF for the URDF following the instructions in [Generating PDF](#generating-pdf).
### Generating new URDF files
First, install ROS and the repository that the URDF file comes from (which we'll call the URDF repository). Build the repository with ROS and make sure it builds without errors; you may need to install additional dependencies, depending on the URDF repository's instructions.
Next, find the xacro file used to create the URDF. This will most likely be found in the URDF repository. Manually comment out any non-kinematic structures. Why is this necessary? Our MI-HGNN relies on the input graph being morphology-informed, which we define as graph nodes representing kinematic joints and graph edges representing kinematic links. Thus, it assumes that the input graph is composed of all of the robot's kinematic joints, and that no other fixed structures (like cameras or IMUs) are included. Failing to remove non-kinematic structures will cause unexpected effects, likely decreasing model performance. An easy way to make sure you have done this properly is by [generating a PDF](#generating-pdf) of the URDF file after you create it.
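For example, a sensor macro in the xacro file could be disabled with an XML comment before regenerating the URDF (the macro name here is hypothetical):

```
<!-- Non-kinematic structure, pruned for MI-HGNN:
<xacro:camera_mount parent="base_link"/>
-->
```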
Run the following command to generate the new URDF file:
```
rosrun xacro xacro <xacro_path> > <new_urdf_path>
```
### Generating PDF
To generate the PDF file that corresponds to the URDF file, install ROS 1 and run the following command: