Jiang, P.; Hu, C.; Wang, T.; Lv, K.; Guo, T.; Jiang, J.; Hu, W. Research on a Visual/Ultra-Wideband Tightly Coupled Fusion Localization Algorithm. Sensors 2024, 24, 1710.
Abstract
In the autonomous navigation of mobile robots, precise positioning is crucial. In forest environments with weak satellite signals, or on sites affected by complex surroundings, satellite positioning struggles to meet the accuracy requirements of autonomous robot navigation. This article proposes a binocular vision SLAM/UWB tightly coupled localization method. The core of the algorithm is an Extended Kalman Filter (EKF) that fuses the displacement increments output by binocular ORB-SLAM with the positioning coordinates calculated via UWB as measurement information. Using a purpose-built experimental platform, images and UWB ranging data were collected in outdoor environments, and the combined positioning method was validated experimentally. The results show that the algorithm outperforms standalone UWB or visual SLAM localization in positioning accuracy. Compared with traditional single-sensor positioning algorithms, it exhibits better practicality and stability, and effectively suppresses the impact of UWB non-line-of-sight (NLOS) errors. Under normal line-of-sight conditions, the average error is 0.058 m and the root mean square error (RMSE) is 0.080 m; under non-line-of-sight conditions, the average error is 0.081 m with an RMSE of 0.084 m. No significant jumps or drift occur. Through the tightly coupled processing of binocular vision SLAM and UWB positioning, more reliable and accurate mobile robot localization can be achieved, providing a useful reference for the autonomous navigation of mobile robots in complex environments.
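The abstract's fusion scheme, SLAM displacement increments driving the prediction step and UWB position fixes driving the update step of an EKF, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the state, noise values (`q`, `r`), and the assumption of a 2-D position-only state with an identity measurement model are all simplifying choices made here.

```python
import numpy as np

def ekf_fuse(slam_increments, uwb_positions, q=0.01, r=0.05):
    """Fuse visual-SLAM displacement increments with UWB position fixes.

    slam_increments: (N, 2) array of per-step displacement vectors
    uwb_positions:   (N, 2) array of UWB position coordinates
    q, r:            illustrative process/measurement noise variances
    """
    x = np.zeros(2)    # state estimate [x, y]
    P = np.eye(2)      # state covariance
    Q = q * np.eye(2)  # process noise (uncertainty of SLAM increments)
    R = r * np.eye(2)  # measurement noise (uncertainty of UWB fixes)
    H = np.eye(2)      # UWB measures position directly
    track = []
    for d, z in zip(slam_increments, uwb_positions):
        # Predict: propagate the state with the SLAM displacement increment
        x = x + d
        P = P + Q
        # Update: correct the prediction with the UWB position measurement
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (z - H @ x)
        P = (np.eye(2) - K @ H) @ P
        track.append(x.copy())
    return np.array(track)
```

In this structure, an NLOS-corrupted UWB fix only shifts the estimate by the Kalman-gain-weighted innovation rather than replacing the position outright, which is the mechanism behind the NLOS error suppression the abstract reports.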
Keywords
Extended Kalman Filtering; multi-sensor fusion; visual SLAM; ultra-wideband positioning
Subject
Computer Science and Mathematics, Other
Copyright:
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.