Using lidar sensor in orbit #291
-
Hi, I have a robot that uses a 2D lidar sensor. I added the RTX Lidar sensor to the robot inside Isaac Sim. When I run the simulation from Orbit, I can see the dots created by the lidar sensor, which indicates that it is working. However, I'm not sure how to access the data generated by the lidar sensor from Orbit. From the documentation, it is unclear to me whether I can access the lidar sensor I created in Isaac Sim through its prim path, or whether I have to spawn a new one from the Orbit API. Can someone clarify the workflow for accessing data from lidar sensors in Orbit?
-
Hi @LukasVindbjerg, we had planned to add support for the RTX lidar, but since it wasn't parallelized, we decided to remove it in favor of our custom Warp-based ray caster. However, the RTX implementation has certain benefits (such as surface reflectivity and realism beyond physical meshes). At this point, the feature is not in our immediate development plans, but it would be great to support it. As a starting point, you can make the sensor class similar to how
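For what it's worth, a rough sketch of what such a sensor class could look like is below. All names here (`RtxLidarSensor`, `initialize`, `update`, `data`) are hypothetical and illustrative only, not the actual Orbit API; a real implementation would need to hook into Isaac Sim's RTX lidar render products/annotators to fill the data buffer:

```python
class RtxLidarSensor:
    """Hypothetical sketch of a sensor class wrapping an RTX lidar.

    The pattern: attach to an existing prim path in the stage, refresh an
    internal buffer on each update, and expose the latest data through a
    property. The actual integration would copy point-cloud data out of
    the RTX lidar's annotator output instead of the placeholder below.
    """

    def __init__(self, prim_path: str):
        self.prim_path = prim_path  # USD prim path of the lidar in the stage
        self._data = None           # latest point-cloud buffer (filled on update)
        self._initialized = False

    def initialize(self) -> None:
        # Real code would create/attach the RTX lidar render product here.
        self._initialized = True

    def update(self, dt: float) -> None:
        # Real code would read the annotator output into self._data here.
        if not self._initialized:
            raise RuntimeError("Sensor must be initialized before update().")
        self._data = []  # placeholder for the fetched point cloud

    @property
    def data(self):
        return self._data
```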
You need to call `env.render()` to allow rendering of sensor data. I noticed that right now we only call it implicitly in the `step` call when the GUI is enabled. When you use the RSL-RL or RecordVideo wrapper, they internally call `render` after `step`, so it appears to be working. But I'm not happy with this arrangement. It would be nice to have a general way to trigger `render` whenever a sensor needs it.