OALib Journal
ISSN: 2333-9721

Mobile Robot Localization and Environment Perception Technology Based on Multi-Sensor Information Fusion

DOI: 10.12677/airr.2024.132042, PP. 408-418

Keywords: Environment Sensing, Mobile Robotics, Multi-Sensor Fusion, Laser Odometry, Visual Odometry


Abstract:

This paper addresses the low localization accuracy and poor environment perception of mobile robots in complex environments. Using an IMU and GPS to measure the robot's own pose, and a lidar and an RGB camera to perceive the environment, it investigates key techniques for hierarchically fusing pose information with environment perception information. In the pose-fusion layer, a particle swarm optimization algorithm tunes a BP neural network, and the optimized network is used to improve an unscented Kalman filter that realizes loosely coupled INS/GPS integrated navigation, reducing the bias and noise of the INS's IMU. When the GPS signal is lost, the trained network outputs predictions that correct the errors of the inertial navigation system, so that the layer still supplies accurate velocity and position estimates as absolute position constraints. In the environment-perception fusion layer, the compensated IMU pre-integrated measurements of acceleration and angular velocity are fused at a second level with the visual odometry and the laser odometry, respectively, yielding real-time, accurate localization and a finer environment map. Finally, the two-level multi-sensor fusion algorithm is validated on trajectories collected in the real world. The experiments show that the algorithm improves both localization accuracy and environment perception: the maximum error between the robot's estimated trajectory and the ground-truth trajectory is 1.46 m, the minimum error is 0.04 m, and the mean error is 0.60 m.
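The pose-fusion layer tunes a BP neural network with particle swarm optimization. The paper's network architecture and fitness function are not reproduced on this page, so the sketch below is a minimal generic PSO (all names and coefficients are ours), minimizing a toy quadratic loss as a stand-in for the network's training error:

```python
import random

def pso(loss, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer; returns (best position, best loss)."""
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # per-particle best positions
    pbest_val = [loss(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = loss(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i][:], v
    return gbest, gbest_val

random.seed(42)  # reproducibility only
# Stand-in for a BP network's training loss: minimum at (1, -2).
best, val = pso(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2)
```

In the paper's setting, `loss` would be the network's training error over a dataset of INS/GPS samples, and each particle would encode a full set of network weights.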
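The abstract describes loosely coupled INS/GPS integration via an (improved) unscented Kalman filter; since the system is linear in this reduced form, a minimal 1-D sketch with an ordinary linear Kalman filter illustrates the loose-coupling idea — INS acceleration drives the prediction, GPS position fixes provide the correction, and the filter coasts through GPS outages. The function name and noise levels are ours, not from the paper:

```python
import numpy as np

def fuse_ins_gps(ins_accel, gps_pos, dt=0.1, q=0.5, r=1.0):
    """Loosely coupled 1-D INS/GPS fusion with a linear Kalman filter.

    State x = [position, velocity]. Each INS acceleration sample drives the
    prediction step; the matching GPS entry is a position fix, or None
    during an outage (the filter then coasts on the INS alone).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GPS observes position only
    Q = q * np.outer(B, B)                  # process noise from accel errors
    x, P = np.zeros(2), np.eye(2)
    track = []
    for a, z in zip(ins_accel, gps_pos):
        x = F @ x + B * a                   # INS mechanization (predict)
        P = F @ P @ F.T + Q
        if z is not None:                   # GPS fix available: correct
            y = z - H @ x                   # innovation
            S = H @ P @ H.T + r
            K = P @ H.T / S                 # Kalman gain, shape (2, 1)
            x = x + (K * y).ravel()
            P = (np.eye(2) - K @ H) @ P
        track.append(x[0])
    return track

# Stationary INS with GPS steadily reporting 10 m: estimate converges to 10.
est = fuse_ins_gps([0.0] * 200, [10.0] * 200)
```

The paper replaces the plain GPS correction with the PSO-trained network's prediction when the GPS signal is lost, so the correction step never starves; the UKF generalizes the same predict/correct cycle to the nonlinear attitude states.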
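The perception layer fuses compensated IMU pre-integrated measurements with the visual and laser odometers. A planar (2-D) sketch of the pre-integration itself — accumulating body-frame accelerometer and gyro samples into a relative pose between keyframes — looks as follows; the function name and the flat-ground simplification are ours:

```python
import math

def preintegrate_imu(samples, dt=0.01):
    """Planar IMU preintegration: fold body-frame (ax, ay, wz) samples into
    a relative motion (dx, dy, dtheta) expressed in the frame of the first
    sample — the quantity fused with visual/laser odometry constraints."""
    dx = dy = vx = vy = theta = 0.0
    for ax, ay, wz in samples:
        c, s = math.cos(theta), math.sin(theta)
        # rotate body-frame acceleration into the starting frame
        axw, ayw = c * ax - s * ay, s * ax + c * ay
        dx += vx * dt + 0.5 * axw * dt**2   # integrate position
        dy += vy * dt + 0.5 * ayw * dt**2
        vx += axw * dt                      # integrate velocity
        vy += ayw * dt
        theta += wz * dt                    # integrate yaw
    return dx, dy, theta
```

In the paper's pipeline, the accelerometer and gyro samples would first be compensated with the biases estimated in the pose-fusion layer; a full 3-D version integrates orientation as a quaternion or rotation matrix rather than a single yaw angle.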
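The reported 1.46 m / 0.04 m / 0.60 m figures are per-point errors between the estimated trajectory and the ground truth. Assuming time-aligned trajectories, a minimal way to compute such statistics (the function name is ours) is:

```python
import math

def trajectory_errors(estimated, reference):
    """Per-point Euclidean errors between a time-aligned estimated
    trajectory and its ground truth; returns (max, min, mean) in metres."""
    errs = [math.dist(p, q) for p, q in zip(estimated, reference)]
    return max(errs), min(errs), sum(errs) / len(errs)

# Toy example: three 2-D waypoints.
mx, mn, mean = trajectory_errors([(0, 0), (1, 1), (2, 2)],
                                 [(0, 1), (1, 1), (2, 0)])  # → (2.0, 0.0, 1.0)
```

Real evaluations usually first align the two trajectories in time (and sometimes in pose, as in the standard ATE metric) before computing these distances.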

