Xitian Gao, Baoquan Li, Xiaojing He, Wuxi Shi, and Xuebo Zhang
[1] C. Cadena, L. Carlone, H. Carrillo, Y. Latif, D. Scaramuzza, J. Neira, I.D. Reid, and J.J. Leonard, Past, present, and future of simultaneous localization and mapping: Toward the robust-perception age, IEEE Transactions on Robotics, 32(6), 2016, 1309–1332.
[2] G. Huang, Visual-inertial navigation: A concise review, Proc. IEEE International Conf. on Robotics and Automation (ICRA), Montreal, Canada, 2019, 9572–9582.
[3] S. Cai, G. Bao, and J. Pang, A structured light-based visual sensing system for detecting multi-layer and multi-track welding, International Journal of Robotics and Automation, 36(4), 2021, 264–273.
[4] S. Badalkhani, R. Havangi, and M. Farshad, An improved simultaneous localisation and mapping for dynamic environments, International Journal of Robotics and Automation, 36(6), 2021, 374–382.
[5] B. Li, Y. Fang, and X. Zhang, Visual servo regulation of wheeled mobile robots with an uncalibrated onboard camera, IEEE/ASME Transactions on Mechatronics, 21(5), 2016, 2330–2342.
[6] K. Li, B. Li, D. Li, C. Wang, and H. Deng, Numerical simulation of mechanical performances and outflow field for a rotor UAV, International Journal of Robotics and Automation, 36(6), 2021, 462–470.
[7] H. Duan, L. Xin, Y. Xu, G. Zhao, and S. Chen, Eagle-vision-inspired visual measurement algorithm for UAV's autonomous landing, International Journal of Robotics and Automation, 35(2), 2020, 94–100.
[8] T. Do, L.C. Carrillo-Arce, and S.I. Roumeliotis, High-speed autonomous quadrotor navigation through visual and inertial paths, International Journal of Robotics Research, 38(4), 2019, 486–504.
[9] C. An, B. Li, W. Shi, and X. Zhang, Autonomous quadrotor UAV systems for dynamic platform landing with onboard sensors, International Journal of Robotics and Automation, 38(4), 2023, 296–305.
[10] T. Qin and S. Shen, Online temporal calibration for monocular visual-inertial systems, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018, 3662–3669.
[11] M. Li and A.I. Mourikis, Online temporal calibration for camera-IMU systems: Theory and algorithms, International Journal of Robotics Research, 33(7), 2014, 947–964.
[12] J. Gui, D. Gu, S. Wang, and H. Hu, A review of visual inertial odometry from filtering and optimisation perspectives, Advanced Robotics, 29(20), 2015, 1289–1301.
[13] S. Shen, N. Michael, and V. Kumar, Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs, Proc. IEEE International Conf. on Robotics and Automation, Seattle, WA, USA, 2015, 5303–5310.
[14] M. Bloesch, S. Omari, M. Hutter, and R. Siegwart, Robust visual inertial odometry using a direct EKF-based approach, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Hamburg, Germany, 2015, 298–304.
[15] Z. Feng, J. Li, L. Zhang, and C. Chen, Online spatial and temporal calibration for monocular direct visual-inertial odometry, Sensors, 19(10), 2019, 2273.
[16] A.I. Mourikis and S.I. Roumeliotis, A multi-state constraint Kalman filter for vision-aided inertial navigation, Proc. IEEE International Conf. on Robotics and Automation, Roma, Italy, 2007, 3565–3572.
[17] M. Li and A.I. Mourikis, High-precision, consistent EKF-based visual-inertial odometry, International Journal of Robotics Research, 32(6), 2013, 690–711.
[18] T. Qin, P. Li, and S. Shen, Relocalization, global optimization and map merging for monocular visual-inertial SLAM, Proc. IEEE International Conf. on Robotics and Automation, Brisbane, Australia, 2018, 1198–1204.
[19] C. Forster, L. Carlone, F. Dellaert, and D. Scaramuzza, On-manifold preintegration for real-time visual-inertial odometry, IEEE Transactions on Robotics, 33(1), 2017, 1–21.
[20] S. Leutenegger, P. Furgale, V. Rabaud, M. Chli, K. Konolige, and R. Siegwart, Keyframe-based visual-inertial SLAM using nonlinear optimization, International Journal of Robotics Research, 34(3), 2015, 314–334.
[21] Y. He, J. Zhao, Y. Guo, W. He, and K. Yuan, PL-VIO: Tightly-coupled monocular visual-inertial odometry using point and line features, Sensors, 18(4), 2018, 1159.
[22] T. Qin, P. Li, and S. Shen, VINS-Mono: A robust and versatile monocular visual-inertial state estimator, IEEE Transactions on Robotics, 34(4), 2018, 1004–1020.
[23] Y. Yang, P. Geneva, K. Eckenhoff, and G. Huang, Degenerate motion analysis for aided INS with online spatial and temporal sensor calibration, IEEE Robotics and Automation Letters, 4(10), 2019, 2070–2077.
[24] J. Rehder and R. Siegwart, Camera/IMU calibration revisited, Sensors, 17(11), 2017, 3257–3268.
[25] Z. Yang and S. Shen, Monocular visual-inertial state estimation with online initialization and camera-IMU extrinsic calibration, IEEE Transactions on Automation Science and Engineering, 14(1), 2017, 39–51.
[26] Y. Liu, R. Xiong, Y. Wang, H. Huang, X. Xie, X. Liu, and G. Zhang, Stereo visual-inertial odometry with multiple Kalman filters ensemble, IEEE Transactions on Industrial Electronics, 63(10), 2016, 6205–6216.
[27] J. Kaiser, A. Martinelli, F. Fontana, and D. Scaramuzza, Simultaneous state initialization and gyroscope bias calibration in visual inertial aided navigation, IEEE Robotics and Automation Letters, 2(1), 2017, 18–25.
[28] X. Gao, B. Li, W. Shi, and F. Yan, Visual-inertial odometry system with simultaneous extrinsic parameters optimization, Proc. IEEE International Conf. on Advanced Intelligent Mechatronics, Boston, MA, USA, 2020, 1977–1982.
[29] W. Huang and H. Liu, Online initialization and automatic camera-IMU extrinsic calibration for monocular visual-inertial SLAM, Proc. IEEE International Conf. on Robotics and Automation, Brisbane, Australia, 2018, 5182–5189.
[30] T. Qin and S. Shen, Robust initialization of monocular visual-inertial estimation on aerial robots, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Vancouver, Canada, 2017, 4225–4232.
[31] E. Mair, M. Fleps, M. Suppa, and D. Burschka, Spatio-temporal initialization for IMU to camera registration, Proc. IEEE International Conf. on Robotics and Biomimetics, Karon Beach, Thailand, 2011, 557–564.
[32] S. Weiss, M.W. Achtelik, S. Lynen, M. Chli, and R. Siegwart, Real-time onboard visual-inertial state estimation and self-calibration of MAVs in unknown environments, Proc. IEEE International Conf. on Robotics and Automation, Saint Paul, MN, USA, 2012, 957–964.
[33] R. Mur-Artal and J.D. Tardos, Visual-inertial monocular SLAM with map reuse, IEEE Robotics and Automation Letters, 2(2), 2017, 796–803.
[34] M. Bloesch, M. Burri, S. Omari, M. Hutter, and R. Siegwart, Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback, International Journal of Robotics Research, 36(10), 2017, 1053–1072.
[35] J. Kelly and G.S. Sukhatme, A general framework for temporal calibration of multiple proprioceptive and exteroceptive sensors, Springer Tracts in Advanced Robotics, 79, 2012, 195–209.
[36] P. Furgale, J. Rehder, and R. Siegwart, Unified temporal and spatial calibration for multi-sensor systems, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Tokyo, Japan, 2013, 1280–1286.
[37] M. Li and A.I. Mourikis, 3-D motion estimation and online temporal calibration for camera-IMU systems, Proc. IEEE International Conf. on Robotics and Automation, Karlsruhe, Germany, 2013, 5709–5716.
[38] M. Burri, J. Nikolic, P. Gohl, T. Schneider, J. Rehder, S. Omari, M.W. Achtelik, and R. Siegwart, The EuRoC micro aerial vehicle datasets, International Journal of Robotics Research, 35(10), 2016, 1157–1163.
[39] D. Schubert, T. Goll, N. Demmel, V. Usenko, J. Stueckler, and D. Cremers, The TUM VI benchmark for evaluating visual-inertial odometry, Proc. IEEE/RSJ International Conf. on Intelligent Robots and Systems, Madrid, Spain, 2018, 1680–1687.