Abstract
Bridge inspection is currently a labor-intensive task. Utilizing unmanned aerial vehicles (UAVs) to assist in inspection tasks is a promising direction. However, enabling UAVs to inspect autonomously requires solving the UAV state estimation problem. Since some of the UAV's sensors may become unavailable at times, estimating states via sensor fusion is the key. In this paper, we propose a tightly-coupled nonlinear optimization-based system that integrates four kinds of sensors: camera, IMU, ultra-wideband (UWB) range measurements, and global navigation satellite system (GNSS). Owing to the tightly-coupled multi-sensor fusion method and the system design, the system takes advantage of all four sensors and can seamlessly handle the loss or reacquisition of GNSS and UWB signals in indoor and outdoor environments. It effectively reduces long-term trajectory drift and provides smooth, continuous state estimation. The experimental results show that the proposed method outperforms state-of-the-art approaches.
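To illustrate the idea of tightly-coupled fusion described above, the following is a minimal toy sketch (not the authors' implementation): camera/IMU odometry, UWB ranges to a single anchor, and intermittent GNSS fixes are stacked as residuals of one joint nonlinear least-squares problem, so every measurement directly constrains the trajectory. The anchor position, pose count, and measurement values are all assumed for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy tightly-coupled fusion: jointly optimize 2D positions of 4 poses
# using odometry (VIO-like relative motion), UWB ranges to one anchor,
# and sparse GNSS fixes (simulating a partial GNSS outage).
TRUE = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
ANCHOR = np.array([1.5, 2.0])  # assumed UWB anchor position

odom = TRUE[1:] - TRUE[:-1]                     # relative-motion measurements
ranges = np.linalg.norm(TRUE - ANCHOR, axis=1)  # UWB range measurements
gnss_idx = [0, 3]                               # GNSS available only at the ends
gnss = TRUE[gnss_idx]

def residuals(x):
    """Stack all sensor residuals into one vector for joint optimization."""
    p = x.reshape(-1, 2)
    r_odom = ((p[1:] - p[:-1]) - odom).ravel()          # odometry factors
    r_uwb = np.linalg.norm(p - ANCHOR, axis=1) - ranges  # range factors
    r_gnss = (p[gnss_idx] - gnss).ravel()                # absolute GNSS factors
    return np.concatenate([r_odom, r_uwb, r_gnss])

x0 = np.zeros(TRUE.size)           # deliberately poor initial guess
sol = least_squares(residuals, x0)
est = sol.x.reshape(-1, 2)
```

In a real system each residual would be weighted by its measurement covariance and solved incrementally over a sliding window (e.g., with Ceres Solver), but the structure — one cost function jointly constrained by all sensors — is what "tightly-coupled" refers to.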
Data Availability
Data sharing is not applicable to this article as no datasets were generated or analysed during the current study.
Acknowledgements
This research was completed thanks to the financial support from Taiwan MOST Grants 108-2221-E-008-074-MY3 and 111-2221-E-008-097, and Taiwan NSTC Grant 112-2221-008-075.
Funding
This research was supported by Taiwan Swarm Innovation Inc., Taiwan MOST Grants 108-2221-E-008-074-MY3 and 111-2221-E-008-097, and Taiwan NSTC Grant 112-2221-008-075.
Author information
Authors and Affiliations
Contributions
Conceptualization, Bing-Xian Lu and Yu-Chung Tsai; methodology, Bing-Xian Lu and Yu-Chung Tsai; software, Bing-Xian Lu and Yu-Chung Tsai; validation, Bing-Xian Lu and Yu-Chung Tsai; formal analysis, Bing-Xian Lu, Yu-Chung Tsai and Kuo-Shih Tseng; investigation, Bing-Xian Lu, Yu-Chung Tsai and Kuo-Shih Tseng; resources, Bing-Xian Lu and Yu-Chung Tsai; data collection, Bing-Xian Lu and Yu-Chung Tsai; writing–original draft preparation, Yu-Chung Tsai; writing–review and editing, Kuo-Shih Tseng; visualization, Yu-Chung Tsai; supervision, Kuo-Shih Tseng; project administration, Kuo-Shih Tseng; funding acquisition, Kuo-Shih Tseng.
Corresponding author
Ethics declarations
Ethics approval
The research does not involve human participants, their data or biological material and it does not involve animals.
Conflicts of interest
The authors have no relevant financial or non-financial interests to disclose.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary Information
Below is the link to the electronic supplementary material.
Supplementary file 1 (mp4 54935 KB)
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Lu, BX., Tsai, YC. & Tseng, KS. GRVINS: Tightly Coupled GNSS-Range-Visual-Inertial System. J Intell Robot Syst 110, 36 (2024). https://doi.org/10.1007/s10846-023-02033-8