micro-ROS & MoveIt Demo!
Oct 5, 2020 francesca-finocchiaro
We have packaged and delivered a new demo to showcase the integration between micro-ROS and MoveIt 2, a manipulation framework for robotic applications maintained by PickNik Robotics and designed to interoperate with ROS 2. The final result is displayed in RViz, the ROS visualization tool.
We designed the demo so that the board publishes data on its position and orientation to the ROS 2 ecosystem while it is moved in real space, and MoveIt 2’s manipulation and motion-planning algorithms cause a virtual arm to reach out toward it.
With this demo we bring micro-ROS to a whole different level: thanks to MoveIt 2’s kinematic planning capabilities, an application running on a microcontroller can be integrated into complex algorithms and ROS 2 workflows. This allows robots in the micro-ROS ecosystem to perform actual ‘stunts’ of all kinds.
To run micro-ROS, we employ an STM32L4 IoT development kit in combination with Zephyr. The board offers several general-purpose I/O pins and peripherals to connect its 32-bit microcontroller to the external world, and it also includes numerous sensors. For this demo, we use the 6-DoF inertial measurement unit (LSM6DSL), composed of an accelerometer and a gyroscope, and the 3-DoF magnetometer (LIS3MDL). Fusing the measurements fetched from these sensors yields the pose, or quaternion attitude, of the board, i.e. its orientation with respect to a fixed reference frame.

The pose data is then forwarded to the ROS 2 world, where it is consumed by both RViz and MoveIt 2. The former uses it directly to represent the position and orientation of the board in its graphical interface, while MoveIt 2 uses it, according to its kinematic motion-planning algorithms, to calculate the movement that the virtual arm has to perform to “touch” the board. The resulting movement is then rendered in RViz by means of the virtual Panda robotic arm, a standard model employed by MoveIt in its tutorials and graphical interfaces.
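To give an idea of what the firmware side of such a demo can look like, here is a minimal rclc publisher sketch. It is not the code from the linked repository: the topic name, the use of geometry_msgs/msg/PoseStamped, and the helper get_fused_quaternion() (standing in for the LSM6DSL/LIS3MDL sensor-fusion step) are all assumptions made for illustration only.

```c
// Minimal micro-ROS (rclc) publisher sketch: sends the board's fused
// attitude to the ROS 2 world as a geometry_msgs/msg/PoseStamped.
// Return-code checks and cleanup are omitted for brevity.
#include <rcl/rcl.h>
#include <rclc/rclc.h>
#include <geometry_msgs/msg/pose_stamped.h>

// Hypothetical helper implemented elsewhere in the firmware: writes the
// quaternion obtained by fusing accelerometer, gyroscope and magnetometer data.
extern void get_fused_quaternion(double *x, double *y, double *z, double *w);

void publish_board_pose(void)
{
  rcl_allocator_t allocator = rcl_get_default_allocator();
  rclc_support_t support;
  rclc_support_init(&support, 0, NULL, &allocator);

  rcl_node_t node;
  rclc_node_init_default(&node, "board_pose_node", "", &support);

  rcl_publisher_t publisher;
  rclc_publisher_init_default(
    &publisher, &node,
    ROSIDL_GET_MSG_TYPE_SUPPORT(geometry_msgs, msg, PoseStamped),
    "board_pose");  // topic name is an assumption, not the demo's actual topic

  geometry_msgs__msg__PoseStamped msg;
  geometry_msgs__msg__PoseStamped__init(&msg);

  while (1) {
    // Fill the orientation with the latest fused attitude and publish it.
    get_fused_quaternion(&msg.pose.orientation.x, &msg.pose.orientation.y,
                         &msg.pose.orientation.z, &msg.pose.orientation.w);
    rcl_publish(&publisher, &msg, NULL);
    // Under Zephyr, a delay (e.g. k_sleep) would pace the publication rate.
  }
}
```

On the ROS 2 side, a subscriber to this topic feeds the pose to RViz for display and to MoveIt 2 as the target for its motion planning, as described above.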
Find the full video of the demo below:
Find the dedicated repo at this link, with instructions on how to reproduce the demo.