
Factories of the Future: Collaborative Mobile Manipulation

Research Engineer project, RPL Lab, KTH Stockholm

Project Guide - Prof. Patric Jensfelt

Project Members - Anirvan Dutta, Thibaud Hiltenbrand

The aim of the project was to develop a framework for mobile manipulation that can collaborate with humans. I was responsible for mapping, localization, and motion planning for the mobile base, a Clearpath Ridgeback. The challenge was to implement multi-modal sensor fusion using three Intel RGB-D cameras, a lidar, and wheel odometry. The multiple RGB-D cameras provided a wide field of view but required significant computational resources. After analyzing various open-source methods, RTABMap (Real-Time Appearance-Based Mapping) was selected: it is a powerful tool for efficiently mapping an environment as well as localizing within it. Additionally, under the ROS framework it integrates easily with the existing move_base navigation stack, turning the entire pipeline into Simultaneous Planning, Localization and Mapping (SPLAM).
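The multi-camera fusion described above can be sketched as an rtabmap_ros launch fragment. This is a hypothetical configuration, not the project's actual launch file: the topic names, frame id, and camera remappings are assumptions for illustration, though the `subscribe_rgbd`, `rgbd_cameras`, and `subscribe_scan` parameters are standard rtabmap_ros options.

```xml
<!-- Hypothetical sketch of the rtabmap_ros wiring; topic names and
     remappings are assumptions, not the project's actual launch file. -->
<launch>
  <node pkg="rtabmap_ros" type="rtabmap" name="rtabmap" output="screen">
    <!-- Fuse three RGB-D camera streams plus the 2D lidar scan. -->
    <param name="subscribe_rgbd" value="true"/>
    <param name="rgbd_cameras"   value="3"/>
    <param name="subscribe_scan" value="true"/>
    <param name="frame_id"       value="base_link"/>
    <remap from="rgbd_image0" to="/front_camera/rgbd_image"/>
    <remap from="rgbd_image1" to="/left_camera/rgbd_image"/>
    <remap from="rgbd_image2" to="/right_camera/rgbd_image"/>
    <remap from="scan"        to="/front/scan"/>
    <remap from="odom"        to="/ridgeback_velocity_controller/odom"/>
  </node>
</launch>
```

With this wiring, rtabmap publishes the map and the map-to-odom correction that move_base consumes for planning, which is what makes the combined pipeline behave as SPLAM.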


For performing manipulation tasks, a state machine and the Baxter packages were utilized. Additionally, the localization provided by RTABMap was observed to be slightly erroneous, which affected manipulation performance. Therefore, an additional correction was applied by detecting the planar surface of the workstation and aligning the mobile base to it using the table_top package. This ensured that the mobile base consistently reached and aligned with the assigned location for the pick-up/drop-off tasks.
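The plane-based correction can be illustrated with a minimal sketch: fit a plane to the point cloud of the workstation face with RANSAC, then compute the yaw that turns the base to face that plane. This is not the table_top package's implementation; the function names and the assumption that the surface is roughly vertical and seen along the base's +x axis are mine.

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, dist_thresh=0.01, rng=None):
    """Fit a plane to a 3-D point cloud with a simple RANSAC loop.

    Returns (normal, d) for the plane n.x + d = 0 with the most inliers.
    """
    rng = np.random.default_rng(rng)
    best_inliers = 0
    best_model = None
    for _ in range(n_iters):
        # Hypothesize a plane from three random points.
        sample = points[rng.choice(len(points), 3, replace=False)]
        v1, v2 = sample[1] - sample[0], sample[2] - sample[0]
        normal = np.cross(v1, v2)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal = normal / norm
        d = -normal @ sample[0]
        # Count points within dist_thresh of the hypothesized plane.
        inliers = int(np.sum(np.abs(points @ normal + d) < dist_thresh))
        if inliers > best_inliers:
            best_inliers, best_model = inliers, (normal, d)
    return best_model

def yaw_correction(normal):
    """Yaw (rad) that rotates the base so its +x axis faces the plane.

    Assumes the plane (e.g. the front face of the workstation) is
    roughly vertical in the base frame.
    """
    # Project the plane normal onto the ground (x-y) plane; the desired
    # heading is opposite the normal, so the base faces the surface.
    nx, ny = normal[0], normal[1]
    return np.arctan2(-ny, -nx)
```

In the real system the detected plane would come from the fused RGB-D clouds, and the resulting yaw (plus a lateral offset along the plane normal) would be sent as a final docking adjustment before the pick-up or drop-off is attempted.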

 

Here is a short video of the complete demonstration; the complete source code is available here.

 
