%0 Journal Article
%T Mobile Robot Localization and Environment Perception Technology Based on Multi-Sensor Information Fusion
%A 李力
%A 高明浩
%A 封淳曦
%A 李飞
%A 黄超
%A 何婷婷
%A 柏俊杰
%J Artificial Intelligence and Robotics Research
%P 408-418
%@ 2326-3423
%D 2024
%I Hans Publishing
%R 10.12677/airr.2024.132042
%X
This article addresses the low localization accuracy and poor environment perception of mobile robots in complex environments. Using an IMU and GPS to measure the robot's own pose, and a lidar and an RGB camera to perceive the environment, it studies key techniques for the hierarchical fusion of robot pose information and environment perception information. In the pose-information fusion layer, a particle swarm optimization (PSO) algorithm is used to optimize a BP neural network, and the optimized network in turn improves an unscented Kalman filter that realizes loosely coupled INS/GPS integrated navigation, reducing the bias and noise of the IMU, the inertial component of the INS. When the GPS signal is lost, the trained neural network outputs predictions to correct the errors of the inertial navigation system, so that the layer provides more accurate velocity and coordinate information as absolute position constraints. In the environment-perception fusion layer, the compensated IMU pre-integration measurements, accelerations, and angular velocities are fused in a second stage with visual odometry and lidar odometry, respectively, enabling real-time, precise localization of the robot and finer construction of the environment map. Finally, the two-level multi-sensor fusion algorithm is validated on real collected trajectories. Experimental results show that the algorithm improves both the localization accuracy and the environment perception performance of the robot: the maximum error between the robot's motion trajectory and the ground-truth trajectory is 1.46 m, the minimum error is 0.04 m, and the average error is 0.60 m.
%K Environment Sensing
%K Mobile Robot
%K Multi-Sensor Fusion
%K Lidar Odometry
%K Visual Odometry
%U http://www.hanspub.org/journal/PaperInformation.aspx?PaperID=88032