Adds IMU sensor #619

Open · pascal-roth wants to merge 27 commits into main
Conversation

@pascal-roth (Collaborator) commented on Jul 2, 2024

Description

Adds an IMU sensor with the config class IMUCfg and the data class IMUData. Compared to the Isaac Sim implementation of the IMU sensor, this sensor directly accesses the PhysX view buffers for speed.

Fixes #440
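For readers new to IsaacLab sensors, here is a minimal sketch of how such a sensor config might be attached in a scene; the field names below (prim_path, offset, update_period) and the nested OffsetCfg follow the common IsaacLab sensor pattern and are assumptions, not taken from this PR:

# Hypothetical usage sketch, assuming the IMU follows the usual IsaacLab sensor config pattern.
# Field names are illustrative; see the PR diff for the actual API.
from omni.isaac.lab.sensors import IMUCfg

imu = IMUCfg(
    prim_path="{ENV_REGEX_NS}/Robot/base",  # rigid body prim the sensor is attached to
    offset=IMUCfg.OffsetCfg(pos=(0.0, 0.0, 0.05)),  # sensor pose relative to the body frame
    update_period=0.005,  # update period in seconds (0.0 means every physics step)
)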

Type of change

  • New feature (non-breaking change which adds functionality)

Checklist

  • I have run the pre-commit checks with ./isaaclab.sh --format
  • I have made corresponding changes to the documentation
  • My changes generate no new warnings
  • I have added tests that prove my fix is effective or that my feature works
  • I have updated the changelog and the corresponding version in the extension's config/extension.toml file
  • I have added my name to the CONTRIBUTORS.md or my name already exists there

@pascal-roth self-assigned this on Jul 2, 2024
Comment on lines +139 to +148
# obtain the velocities of the sensors
lin_vel_w, ang_vel_w = self._view.get_velocities()[env_ids].split([3, 3], dim=-1)
# store the velocities
# note: we clone here because the obtained tensors are read-only
self._data.ang_vel_b[env_ids] = math_utils.quat_rotate_inverse(self._data.quat_w[env_ids], ang_vel_w)
self._data.lin_acc_b[env_ids] = math_utils.quat_rotate_inverse(
    self._data.quat_w[env_ids],
    (lin_vel_w - self._last_lin_vel_w[env_ids]) / max(self._dt, self.cfg.update_period),
)
self._last_lin_vel_w[env_ids] = lin_vel_w.clone()
Contributor:
The offset is only used for the pose. It isn't accounted for in calculating the linear and angular velocities/accelerations.

@pascal-roth (Collaborator, Author) replied on Jul 5, 2024:
It is included: with the offset I am updating the world pose and orientation, which are later used when transforming the velocities and accelerations from the world frame into the sensor frame.

Contributor:
But that isn't the same, right? From the robot dynamics notes at ETH:

[Screenshots: excerpts from the ETH robot dynamics lecture notes]

@pascal-roth (Collaborator, Author):
You are correct, I didn't think that through.
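For reference, the point being conceded is standard rigid-body kinematics: a sensor frame offset from the body origin picks up extra velocity and acceleration terms from the body's rotation. A minimal sketch with illustrative names (not the PR's code):

# Hypothetical sketch: velocity and acceleration of a point rigidly offset from the body
# origin, with all quantities expressed in the world frame:
#   v_sensor = v_body + omega x r
#   a_sensor = a_body + alpha x r + omega x (omega x r)
import torch

def transfer_to_offset(lin_vel_w, ang_vel_w, lin_acc_w, ang_acc_w, offset_w):
    lin_vel_offset = lin_vel_w + torch.cross(ang_vel_w, offset_w, dim=-1)
    lin_acc_offset = (
        lin_acc_w
        + torch.cross(ang_acc_w, offset_w, dim=-1)
        + torch.cross(ang_vel_w, torch.cross(ang_vel_w, offset_w, dim=-1), dim=-1)
    )
    # angular velocity and acceleration are shared by all points of the same rigid body
    return lin_vel_offset, lin_acc_offset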

@Mayankm96 (Contributor) left a comment:

The math is wrong for calculating the linear and angular quantities when an offset is present.

Also need to update the docs: https://github.com/isaac-sim/IsaacLab/blob/main/docs/source/api/lab/omni.isaac.lab.sensors.rst

@jsmith-bdai (Collaborator) left a comment:

Sorry for the late review on this one! Mainly some doc changes, but great work! This will be useful for some users at the AI Institute too 😄

@@ -1,6 +1,11 @@
Changelog
---------

* Added IMU sensor implementation that directly accessess the physx view :class:`omni.isaac.lab.sensors.IMU`. The
Collaborator:
Suggested change
* Added IMU sensor implementation that directly accessess the physx view :class:`omni.isaac.lab.sensors.IMU`. The
* Added IMU sensor implementation that directly accesses the physx view :class:`omni.isaac.lab.sensors.IMU`. The

class IMU(SensorBase):
"""The inertia measurement unit sensor.

The sensor can be attached to any RigidObject in the scene. The sensor provides the linear acceleration and angular
Collaborator:
After #644 merges in, this should be updated to RigidObject or Articulation.

Suggested change
The sensor can be attached to any RigidObject in the scene. The sensor provides the linear acceleration and angular
The sensor can be attached to any :class:`RigidObject` or :class:`Articulation` in the scene. The sensor provides the linear acceleration and angular

You might want to reformat this, as I made the line much longer.

be able to access the data from the sensor. It also initializes the internal buffers to store the data.

Raises:
    RuntimeError: If the number of imu prims in the view does not match the number of environments.
Collaborator:
Where is this check happening?

# data buffers
self._data.pos_w = torch.zeros(self._view.count, 3, device=self._device)
self._data.quat_w = torch.zeros(self._view.count, 4, device=self._device)
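# set the scalar (w) component to 1 so the buffer starts as identity quaternions (IsaacLab uses the (w, x, y, z) convention)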
self._data.quat_w[:, 0] = 1.0
Collaborator:
🔥 You don't know how many times I've forgotten this and gotten some strange errors 😄

Collaborator:
Can we add unit tests for the new methods? The simplest test is just converting to and from a convention and ensuring the result is close enough.
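A minimal round-trip test along those lines could look like the sketch below; the helper names (convert_quat, random_orientation) are assumptions about what omni.isaac.lab.utils.math provides, so adjust to the actual methods added in this PR:

# Hypothetical test sketch for a convention round-trip; helper names are assumptions.
import torch
import omni.isaac.lab.utils.math as math_utils

def test_quat_convention_roundtrip():
    quat_wxyz = math_utils.random_orientation(num=128, device="cpu")
    quat_xyzw = math_utils.convert_quat(quat_wxyz, to="xyzw")  # convert to the other convention
    quat_back = math_utils.convert_quat(quat_xyzw, to="wxyz")  # and back again
    torch.testing.assert_close(quat_back, quat_wxyz)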

parser.add_argument(
    "--terrain_type",
    type=str,
    default="generator",
Collaborator:
Suggested change
default="generator",
default="generator",
choices=["generator", "usd", "plane"],

),
)

# sensors - frame transformer (filled inside unit test)
Collaborator:
Suggested change
# sensors - frame transformer (filled inside unit test)
# sensors - imu (filled inside unit test)

# read data from sim
self.scene.update(self.sim.get_physics_dt())

# skip first steo where initial velocity is zero
Collaborator:
Suggested change
# skip first steo where initial velocity is zero
# skip first step where initial velocity is zero

Collaborator:
Do we expect this to work on CPU and GPU? If so, I would use the pattern in test_rigid_object.py where we iterate over CPU and GPU devices.
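For reference, the device-iteration pattern being referred to is roughly the following (a sketch only; the class name and test body are placeholders, not the PR's code):

# Hypothetical sketch of iterating the same test over CPU and GPU devices.
import unittest
import torch

class TestIMUSensor(unittest.TestCase):
    def test_buffers_on_all_devices(self):
        for device in ("cpu", "cuda:0"):
            with self.subTest(device=device):
                if device.startswith("cuda") and not torch.cuda.is_available():
                    self.skipTest("CUDA not available")
                # build the scene and IMU sensor on `device`, step the sim, and run the checks here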

Development
Successfully merging this pull request may close these issues: [Proposal] Anyone tried adding an IMU Sensor?