Huiming Xing, Liwei Shi (corresponding author), Kun Tang, Shuxiang Guo (corresponding author), Xihuan Hou, Yu Liu, Huikang Liu, Yao Hu
1. Key Laboratory of Convergence Medical Engineering System and Healthcare Technology, the Ministry of Industry and Information Technology, School of Life Science, Beijing Institute of Technology, Beijing, China
2. Key Laboratory of Biomimetic Robots and Systems, Ministry of Education, Beijing Institute of Technology, Beijing, China
3. Faculty of Engineering, Kagawa University, Takamatsu, Kagawa, Japan
In narrow, unstructured underwater environments, existing localization approaches, such as GPS measurement, dead reckoning, acoustic positioning, and artificial-landmark-based methods, are difficult to apply to multiple small-scale underwater robots. This paper therefore proposes a novel cooperative and relative close-range localization approach, based on the fusion of an RGB-D camera and an Inertial Measurement Unit (IMU), for special environments such as underwater caves. Exploiting the robots' zero-radius rotation, cooperative localization of Multiple Turtle-inspired Amphibious Spherical Robots (MTASRs) is realized. First, we present an efficient fusion feature, combining Histograms of Oriented Gradients (HOG) with Color Names (CNs), extracted from color images of the TASRs. By training a Support Vector Machine (SVM) classifier on this fusion feature, an automatic recognition method for TASRs is developed. Second, an RGB-D camera-based measurement model is derived from the depth map, and the MTASRs model is established with the RGB-D camera and IMU to realize cooperative and relative close-range localization of the MTASRs. Finally, the depth measurement in water is corrected, and the effectiveness of the RGB-D camera for underwater applications is validated. Experiments with three robots were conducted, and the results verify the feasibility of the proposed localization method for MTASRs.
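The recognition step described above (HOG + CN fusion feature fed to an SVM) can be sketched as follows. This is a minimal illustration, not the paper's implementation: it uses a simplified global orientation histogram in place of full block-normalized HOG, a hypothetical 4-entry color palette in place of the 11 linguistic color names of the CN descriptor, scikit-learn's `LinearSVC` as the SVM, and synthetic image patches in place of real robot images.

```python
import numpy as np
from sklearn.svm import LinearSVC

def hog_feature(gray, n_bins=9):
    """Global unsigned-orientation histogram weighted by gradient magnitude.
    A simplified stand-in for full block-normalized HOG."""
    gy, gx = np.gradient(gray)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)          # unsigned angle in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (hist.sum() + 1e-8)

# Hypothetical 4-color palette; the real Color Names descriptor maps
# pixels to 11 linguistic color names learned from image data.
PALETTE = np.array([[255, 0, 0], [0, 255, 0], [0, 0, 255], [128, 128, 128]], float)

def cn_feature(rgb):
    """Normalized histogram of each pixel's nearest palette color."""
    px = rgb.reshape(-1, 3).astype(float)
    idx = np.linalg.norm(px[:, None, :] - PALETTE[None, :, :], axis=2).argmin(axis=1)
    hist = np.bincount(idx, minlength=len(PALETTE)).astype(float)
    return hist / hist.sum()

def fusion_feature(rgb):
    """Concatenate the HOG-like and CN histograms into one descriptor."""
    gray = rgb.astype(float).mean(axis=2)
    return np.concatenate([hog_feature(gray), cn_feature(rgb)])

# Synthetic training data: "robot" patches carry a green tint,
# "background" patches are plain gray texture.
rng = np.random.default_rng(0)
def patch(robot):
    img = rng.integers(60, 120, (32, 32, 3))
    if robot:
        img[..., 1] += 130                            # green tint on robot patches
    return np.clip(img, 0, 255).astype(np.uint8)

X = np.array([fusion_feature(patch(i % 2 == 0)) for i in range(40)])
y = np.array([i % 2 == 0 for i in range(40)], int)
clf = LinearSVC().fit(X, y)                           # SVM recognizer
pred = clf.predict(np.array([fusion_feature(patch(True)),
                             fusion_feature(patch(False))]))
```

On this toy data the two classes are separable through the color-name histogram alone, so the trained classifier labels a fresh green-tinted patch as a robot; in the paper the same pipeline is applied to sliding windows over the RGB images of the TASRs.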
Keywords: vision localization; bio-inspired robots; RGB-D camera; histogram of oriented gradients and color names fusion feature; Cooperative and Relative Localization (CRL)