# Movement Primitives

> Dynamical movement primitives (DMPs), probabilistic movement primitives
> (ProMPs), and spatially coupled bimanual DMPs for imitation learning.

Movement primitives are a common group of policy representations in robotics.
There are many types and variations. This repository focuses mainly on
imitation learning, generalization, and adaptation of movement primitives for
Cartesian motions of robots. It provides implementations in Python and Cython
and can be installed directly from
[PyPI](https://pypi.org/project/movement-primitives/).

## Content

* [Features](#features)
* [API Documentation](#api-documentation)
* [Install Library](#install-library)
* [Examples](#examples)
* [Build API Documentation](#build-api-documentation)
* [Test](#test)
* [Contributing](#contributing)
* [Non-public Extensions](#non-public-extensions)
* [Related Publications](#related-publications)
* [Funding](#funding)

## Features

* Propagation of DMP weight distribution to state space distribution
* Probabilistic Movement Primitives (ProMPs)

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dual_cart_dmp_rh5_with_panel.gif" height="200px" /><img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/dmp_ur5_minimum_jerk.gif" height="200px" />

Left: Example of a dual Cartesian DMP with [RH5 Manus](https://robotik.dfki-bremen.de/en/research/robot-systems/rh5-manus/).
Right: Example of a joint space DMP with a UR5.

## API Documentation

The API documentation is available
[here](https://dfki-ric.github.io/movement_primitives/).

## Install Library

Install the library with

```bash
python setup.py install
```

## Examples

You will find many examples in the subfolder
[`examples/`](https://github.com/dfki-ric/movement_primitives/tree/main/examples).
Here are just a few highlights to showcase the library.

### Conditional ProMPs

<img src="https://raw.githubusercontent.com/dfki-ric/movement_primitives/main/doc/source/_static/conditional_promps.png" width="800px" />

Probabilistic Movement Primitives (ProMPs) define distributions over
trajectories that can be conditioned on viapoints. In this example, we
plot the resulting posterior distribution after conditioning on varying
start positions.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_conditional_promp.py)

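Conditioning works through Gaussian inference on the ProMP's weight distribution. The following is a minimal NumPy sketch of that step — an illustrative reimplementation with made-up basis functions and a hypothetical prior, not the library's API:

```python
import numpy as np

def rbf_features(t, n_weights=10, width=0.02):
    """Normalized radial basis functions over the phase t in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_weights)
    phi = np.exp(-(t - centers) ** 2 / (2.0 * width))
    return phi / phi.sum()

def condition_on_viapoint(mu_w, Sigma_w, t_star, y_star, noise=1e-6):
    """Condition the weight distribution N(mu_w, Sigma_w) on y(t_star) = y_star."""
    phi = rbf_features(t_star, n_weights=len(mu_w))
    # Kalman-style gain for the scalar observation y = phi^T w + noise
    k = Sigma_w @ phi / (phi @ Sigma_w @ phi + noise)
    mu_new = mu_w + k * (y_star - phi @ mu_w)
    Sigma_new = Sigma_w - np.outer(k, phi @ Sigma_w)
    return mu_new, Sigma_new

mu_w = np.zeros(10)   # hypothetical prior mean over weights (e.g., from imitation)
Sigma_w = np.eye(10)  # hypothetical prior covariance over weights
mu_c, Sigma_c = condition_on_viapoint(mu_w, Sigma_w, t_star=0.0, y_star=1.0)
# The conditioned mean trajectory now passes (almost) through the viapoint
# and the posterior variance at the viapoint shrinks:
print(rbf_features(0.0) @ mu_c)
```

The same update, applied at `t_star = 0` with varying `y_star`, produces the family of posterior distributions over start positions shown in the plot above.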
### Potential Field of 2D DMP

A Dynamical Movement Primitive defines a potential field that superimposes
several components: transformation system (goal-directed movement), forcing
term (learned shape), and coupling terms (e.g., obstacle avoidance).

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_dmp_potential_field.py)
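
This superposition can be written out in a few lines. Below is a minimal sketch of the transformation system with an external forcing term — illustrative gain values, not the library's implementation:

```python
import numpy as np

def dmp_acceleration(y, yd, goal, f, alpha_y=25.0, beta_y=6.25, tau=1.0):
    """Acceleration of the DMP transformation system plus a forcing term f.

    The first term pulls the state toward the goal (goal-directed movement)
    while damping the velocity; f adds the learned shape. Coupling terms
    (e.g., obstacle avoidance) would be summed in the same way.
    """
    return (alpha_y * (beta_y * (goal - y) - tau * yd) + f) / tau ** 2

# Without a forcing term, the critically damped system converges to the goal:
y, yd, goal, dt = 0.0, 0.0, 1.0, 0.001
for _ in range(10000):
    ydd = dmp_acceleration(y, yd, goal, f=0.0)
    yd += dt * ydd
    y += dt * yd
print(round(y, 3))  # close to the goal 1.0
```

The choice `beta_y = alpha_y / 4` makes the spring-damper critically damped, which is why the trajectory approaches the goal without overshooting.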

### DMP with Final Velocity

Not all DMPs allow a final velocity > 0. In this case we analyze the effect
of changing final velocities in an appropriate variation of the DMP
formulation that allows setting the final velocity.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_dmp_with_final_velocity.py)

### ProMPs

The LASA Handwriting dataset learned with ProMPs. The dataset consists of
2D handwriting motions. The first and third column of the plot represent
demonstrations and the second and fourth column show the imitated ProMPs
with 1-sigma interval.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/plot_promp_lasa.py)

### Cartesian DMPs

A trajectory is created manually, imitated with a Cartesian DMP, converted
to a joint trajectory by inverse kinematics, and executed with a UR5.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/vis_cartesian_dmp.py)
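
Imitating Cartesian poses requires a proper rotation error in place of the goal difference `g - y`; orientation DMPs (Ude et al., see the related publications) use the quaternion logarithmic map for this. A self-contained sketch of that map — illustrative, not the library's implementation:

```python
import numpy as np

def q_conj(q):
    """Conjugate of a quaternion (w, x, y, z)."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def q_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def q_log(q):
    """Logarithmic map: rotation vector of a unit quaternion."""
    w, v = q[0], q[1:]
    n = np.linalg.norm(v)
    if n < 1e-12:
        return np.zeros(3)
    return 2.0 * np.arctan2(n, w) * v / n

def orientation_error(goal_q, q):
    """Rotation vector taking q to goal_q (used instead of g - y)."""
    return q_log(q_mul(goal_q, q_conj(q)))

# 90 degree rotation about the z-axis:
q = np.array([1.0, 0.0, 0.0, 0.0])
goal = np.array([np.cos(np.pi / 4), 0.0, 0.0, np.sin(np.pi / 4)])
print(orientation_error(goal, q))  # ~ [0, 0, pi/2]
```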

### Contextual ProMPs

Demonstrations were recorded through kinesthetic teaching. The panel width is
considered to be the context over which we generalize with contextual ProMPs.
Each color in the above visualizations corresponds to a ProMP for a different
context.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_contextual_promp_distribution.py)

**Dependencies that are not publicly available:** see
[Non-public Extensions](#non-public-extensions).

### Dual Cartesian DMPs

We offer specific dual Cartesian DMPs to control dual-arm robotic systems like
humanoid robots.

Scripts: [Open3D](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_solar_panel.py), [PyBullet](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/sim_solar_panel.py)

**Dependencies that are not publicly available:** see
[Non-public Extensions](#non-public-extensions).

### Coupled Dual Cartesian DMPs

We can introduce a coupling term in a dual Cartesian DMP to constrain the
relative position, orientation, or pose of two end-effectors of a dual-arm
robot.

Scripts: [Open3D](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_cartesian_dual_dmp.py), [PyBullet](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/sim_cartesian_dual_dmp.py)

**Dependencies that are not publicly available:** see
[Non-public Extensions](#non-public-extensions).

### Propagation of DMP Distribution to State Space

If we have a distribution over DMP parameters, we can propagate them to state
space through an unscented transform.

[Script](https://github.com/dfki-ric/movement_primitives/blob/main/examples/external_dependencies/vis_dmp_to_state_variance.py)

**Dependencies that are not publicly available:**

```bash
# Kuka
git clone git@git.hb.dfki.de:models-robots/kuka_lbr.git
```

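The unscented transform itself fits in a few lines: sigma points of the parameter distribution are pushed through the nonlinear mapping, and the transformed mean and covariance are recovered from the weighted results. A generic sketch with a stand-in nonlinearity (not the DMP rollout used by the example script):

```python
import numpy as np

def unscented_transform(mu, Sigma, f, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate N(mu, Sigma) through a nonlinear function f."""
    n = len(mu)
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * Sigma)
    # 2n + 1 sigma points: the mean plus symmetric offsets along L's columns
    points = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)
    ys = np.array([f(p) for p in points])
    mean = wm @ ys
    diff = ys - mean
    cov = (wc[:, None] * diff).T @ diff
    return mean, cov

# Example: propagate a 2D Gaussian through a quadratic map.
mu = np.array([1.0, 0.5])
Sigma = 0.01 * np.eye(2)
mean, cov = unscented_transform(mu, Sigma, lambda x: np.array([x[0] ** 2, x[0] * x[1]]))
print(mean)  # close to [1.01, 0.5]: the mean includes the second-order correction
```

For quadratic maps like this one, the transform recovers the exact mean (note the `0.01` shift from `f(mu)`), which is why it is a good fit for propagating DMP weight distributions to state space.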
## Build API Documentation

You can build the API documentation with
[Sphinx](https://www.sphinx-doc.org/). You can install all dependencies with

```bash
python -m pip install movement_primitives[doc]
```

... and build the documentation from the folder `doc/` with

```bash
make html
```

It will be located at `doc/build/html/index.html`.

## Test

To run the tests, some Python libraries are required:

```bash
python -m pip install -e .[test]
```

The tests are located in the folder `test/` and can be executed with
`python -m nose test`.

This command searches for all files containing `test` and executes all
functions starting with `test_`.

## Contributing

You can report bugs in the
[issue tracker](https://github.com/dfki-ric/movement_primitives/issues).
If you have questions about the software, please use the
[discussions section](https://github.com/dfki-ric/movement_primitives/discussions).
To add new features or documentation or to fix bugs, you can open a pull
request on [GitHub](https://github.com/dfki-ric/movement_primitives).
Directly pushing to the main branch is not allowed.

The recommended workflow to add a new feature, add documentation, or fix a bug
is the following:

* Push your changes to a branch (e.g., feature/x, doc/y, or fix/z) of your
  fork of the repository.
* Open a pull request to the main branch of the main repository.

This is a checklist for new features:

- Are there unit tests?
- Does it have docstrings?
- Is it included in the API documentation?
- Did you run flake8 and pylint?
- Should it be mentioned in the readme?
- Should it be included in an example script?

## Non-public Extensions

Scripts from the subfolder `examples/external_dependencies/` require access to
git repositories (URDF files or optional dependencies) and datasets that are
not publicly available. They are available on request (via email).

Note that the library itself does not have any non-public dependencies! They
are only needed if you want to run all of the examples.

### MoCap Library

```bash
# untested: pip install git+https://git.hb.dfki.de/dfki-interaction/mocap.git
git clone git@git.hb.dfki.de:dfki-interaction/mocap.git
cd mocap
python -m pip install -e .
cd ..
```

### Get URDFs

```bash
# RH5
git clone git@git.hb.dfki.de:models-robots/rh5_models/pybullet-only-arms-urdf.git --recursive
# RH5v2
git clone git@git.hb.dfki.de:models-robots/rh5v2_models/pybullet-urdf.git --recursive
# Kuka
git clone git@git.hb.dfki.de:models-robots/kuka_lbr.git
# Solar panel
git clone git@git.hb.dfki.de:models-objects/solar_panels.git
# RH5 Gripper
git clone git@git.hb.dfki.de:motto/abstract-urdf-gripper.git --recursive
```

### Data

Most scripts assume that your data is located in the folder `data/`.
You should put a symlink there to point to your actual data folder.

## Related Publications

This library implements several types of dynamical movement primitives and
probabilistic movement primitives. These are described in detail in the
following papers.

[1] Ijspeert, A. J., Nakanishi, J., Hoffmann, H., Pastor, P., Schaal, S. (2013).
Dynamical Movement Primitives: Learning Attractor Models for Motor
Behaviors. Neural Computation 25 (2), 328-373. DOI: 10.1162/NECO_a_00393,
https://homes.cs.washington.edu/~todorov/courses/amath579/reading/DynamicPrimitives.pdf

[2] Pastor, P., Hoffmann, H., Asfour, T., Schaal, S. (2009).
Learning and Generalization of Motor Skills by Learning from Demonstration.
In 2009 IEEE International Conference on Robotics and Automation
(pp. 763-768). DOI: 10.1109/ROBOT.2009.5152385,
https://h2t.iar.kit.edu/pdf/Pastor2009.pdf

[3] Muelling, K., Kober, J., Kroemer, O., Peters, J. (2013).
Learning to Select and Generalize Striking Movements in Robot Table Tennis.
International Journal of Robotics Research 32 (3), 263-279.
https://www.ias.informatik.tu-darmstadt.de/uploads/Publications/Muelling_IJRR_2013.pdf

[4] Ude, A., Nemec, B., Petric, T., Morimoto, J. (2014).
Orientation in Cartesian Space Dynamic Movement Primitives.
In IEEE International Conference on Robotics and Automation (ICRA)
(pp. 2997-3004). DOI: 10.1109/ICRA.2014.6907291,
https://acat-project.eu/modules/BibtexModule/uploads/PDF/udenemecpetric2014.pdf

[5] Gams, A., Nemec, B., Zlajpah, L., Wächter, M., Asfour, T., Ude, A. (2013).
Modulation of Motor Primitives using Force Feedback: Interaction with
the Environment and Bimanual Tasks. In 2013 IEEE/RSJ International
Conference on Intelligent Robots and Systems (pp. 5629-5635). DOI:
10.1109/IROS.2013.6697172,
https://h2t.anthropomatik.kit.edu/pdf/Gams2013.pdf

[6] Vidakovic, J., Jerbic, B., Sekoranja, B., Svaco, M., Suligoj, F. (2019).
Task Dependent Trajectory Learning from Multiple Demonstrations Using
Movement Primitives. In International Conference on Robotics in Alpe-Adria
Danube Region (RAAD) (pp. 275-282). DOI: 10.1007/978-3-030-19648-6_32,
https://link.springer.com/chapter/10.1007/978-3-030-19648-6_32

[7] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2013).
Probabilistic Movement Primitives. In C. J. Burges, L. Bottou, M. Welling,
Z. Ghahramani, K. Q. Weinberger (Eds.), Advances in Neural Information
Processing Systems, 26.
https://papers.nips.cc/paper/2013/file/e53a0a2978c28872a4505bdb51db06dc-Paper.pdf

[8] Maeda, G. J., Neumann, G., Ewerton, M., Lioutikov, R., Kroemer, O.,
Peters, J. (2017). Probabilistic Movement Primitives for Coordination of
Multiple Human–Robot Collaborative Tasks. Autonomous Robots, 41, 593-612.
DOI: 10.1007/s10514-016-9556-2,
https://link.springer.com/article/10.1007/s10514-016-9556-2

[9] Paraschos, A., Daniel, C., Peters, J., Neumann, G. (2018).
Using Probabilistic Movement Primitives in Robotics. Autonomous Robots, 42,
529-551. DOI: 10.1007/s10514-017-9648-7,
https://www.ias.informatik.tu-darmstadt.de/uploads/Team/AlexandrosParaschos/promps_auro.pdf

[10] Lazaric, A., Ghavamzadeh, M. (2010).
Bayesian Multi-Task Reinforcement Learning. In Proceedings of the 27th
International Conference on Machine Learning (ICML'10) (pp. 599-606).
https://hal.inria.fr/inria-00475214/document

## Funding

This library has been developed initially at the Robotics Innovation Center of
the German Research Center for Artificial Intelligence (DFKI GmbH) in Bremen.