Hongling Wang, Chengjin Zhang, Yong Song, and Bao Pang
Robot arm, mobile robot perception SLAM, search and rescue environment, robot visual and LRF multi-sensors, percolator particle filter algorithms
We propose a simultaneous localization and mapping (SLAM) method that adds robot arm exploration to the SLAM process and uses a graphical user interface and the Cyton Commands program to control the arm for perceptive exploration during SLAM. In this significant SLAM process, a mobile robot perceptually explores the surroundings of a search and rescue environment, employing its arm to grasp, touch, and feel. The robot marks significant objects, such as dangerous areas, victims, and exit spots, on a newly formed map; this map is uploaded to the robot’s industrial personal computer, the current metric map is updated, and the robot uses the updated map to localize itself and continue exploring the environment. In our simulations and experiments, the arm mounted on a Pioneer LX mobile robot touches and perceives objects within the exploration area while the robot performs SLAM tasks using fused data from its embedded visual system and laser range finder, thereby validating the effectiveness and feasibility of significant SLAM.
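The abstract describes a map-annotation loop: the arm perceives an object, the object is marked on the map, and the metric map is then updated for subsequent localization. The sketch below illustrates this idea in Python under stated assumptions; the class and method names (SignificantMap, mark_significant, update_metric_map) are hypothetical and are not taken from the authors' implementation.

```python
# Minimal sketch of the "significant SLAM" map-annotation step described in
# the abstract. All names here are illustrative assumptions, not the paper's code.
import numpy as np


class SignificantMap:
    """Occupancy grid plus a list of semantically significant markers."""

    def __init__(self, width, height, resolution=0.05):
        # 0.5 = unknown, 0.0 = free, 1.0 = occupied
        self.grid = np.full((height, width), 0.5)
        self.resolution = resolution          # metres per cell
        self.markers = []                     # (label, x, y) tuples

    def mark_significant(self, label, x, y):
        """Record an object perceived by the arm (victim, exit, danger area)."""
        self.markers.append((label, x, y))

    def update_metric_map(self, scan_cells, occupied=True):
        """Fold a new laser/visual observation into the metric map."""
        value = 1.0 if occupied else 0.0
        for row, col in scan_cells:
            self.grid[row, col] = value


if __name__ == "__main__":
    smap = SignificantMap(width=200, height=200)
    # Arm touches an object and confirms a victim at (3.2 m, 1.5 m).
    smap.mark_significant("victim", 3.2, 1.5)
    # A laser range finder scan reports a wall segment as occupied cells.
    smap.update_metric_map([(10, c) for c in range(40, 60)], occupied=True)
    print(smap.markers)
```

In the workflow described above, the annotated map would then be passed to the robot's localization module so that later exploration can reference both the metric grid and the marked objects.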