DONG Rong, LI Maohai, LIN Rui, LIU Shiqi, DING Wen. Research of Outdoor Robot Localization Method on Fusion of Multi-camera and IMU[J]. Computer Engineering and Applications, 2022, 58(3): 289-296.
[1] DAVISON A J,REID I D,MOLTON N D,et al.MonoSLAM:real-time single camera SLAM[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2007,29(6):1052-1067.
[2] KLEIN G,MURRAY D.Parallel tracking and mapping for small AR workspaces[C]//2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality,2007:225-234.
[3] NEWCOMBE R A,LOVEGROVE S J,DAVISON A J.DTAM:dense tracking and mapping in real-time[C]//Proceedings of IEEE International Conference on Computer Vision.Los Alamitos:IEEE Computer Society Press,2011:2320-2327.
[4] ENGEL J,SCHÖPS T,CREMERS D.LSD-SLAM:large-scale direct monocular SLAM[C]//European Conference on Computer Vision.[S.l.]:Springer International Publishing,2014:834-849.
[5] FORSTER C,PIZZOLI M,SCARAMUZZA D.SVO:fast semi-direct monocular visual odometry[C]//Proceedings of IEEE International Conference on Robotics and Automation.Los Alamitos:IEEE Computer Society Press,2014:15-22.
[6] ENGEL J,KOLTUN V,CREMERS D.Direct sparse odometry[J].IEEE Transactions on Pattern Analysis and Machine Intelligence,2018,40(3):611-625.
[7] MUR-ARTAL R,MONTIEL J M M,TARDOS J D.ORB-SLAM:a versatile and accurate monocular SLAM system[J].IEEE Transactions on Robotics,2015,31(5):1147-1163.
[8] MUR-ARTAL R,TARDOS J D.ORB-SLAM2:an open-source SLAM system for monocular,stereo and RGB-D cameras[J].IEEE Transactions on Robotics,2017,33(5):1255-1262.
[9] PLESS R.Using many cameras as one[C]//IEEE Computer Society Conference on Computer Vision and Pattern Recognition,2003.
[10] RAGAB M E.Multiple camera pose estimation[D].Hong Kong:The Chinese University of Hong Kong,2008.
[11] KAESS M,DELLAERT F.Visual SLAM with a multi-camera rig[R].Georgia Institute of Technology,2006.
[12] HARRIS C G,STEPHENS M.A combined corner and edge detector[C]//Alvey Vision Conference,1988:147-151.
[13] HEE LEE G,FRAUNDORFER F,POLLEFEYS M.Motion estimation for self-driving cars with a generalized camera[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition,2013:2746-2753.
[14] HARMAT A,SHARF I,TRENTINI M.Parallel tracking and mapping with multiple cameras on an unmanned aerial vehicle[C]//International Conference on Intelligent Robotics and Applications,2012:421-432.
[15] HARMAT A,TRENTINI M,SHARF I.Multi-camera tracking and mapping for unmanned aerial vehicles in unstructured environments[J].Journal of Intelligent & Robotic Systems,2015,78(2):291-317.
[16] TRIBOU M J,HARMAT A,WANG D W L,et al.Multi-camera parallel tracking and mapping with non-overlapping fields of view[J].International Journal of Robotics Research,2015,34(12):1480-1500.
[17] URBAN S,HINZ S.MultiCol-SLAM:a modular real-time multi-camera SLAM system[J].arXiv:1610.07336,2016.
[18] YANG S,SCHERER S A,YI X,et al.Multi-camera visual SLAM for autonomous navigation of micro aerial vehicles[J].Robotics and Autonomous Systems,2017,93:116-134.
[19] YANG S,SCHERER S A,ZELL A,et al.Visual SLAM for autonomous MAVs with dual cameras[C]//IEEE International Conference on Robotics and Automation,2014:5227-5232.
[20] LYNEN S,ACHTELIK M W,WEISS S,et al.A robust and modular multi-sensor fusion approach applied to MAV navigation[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems,2013.
[21] LEUTENEGGER S,LYNEN S,BOSSE M,et al.Keyframe-based visual-inertial odometry using nonlinear optimization[J].The International Journal of Robotics Research,2015,34(3):314-334.
[22] MOURIKIS A I,ROUMELIOTIS S I.A multi-state constraint Kalman filter for vision-aided inertial navigation[C]//IEEE International Conference on Robotics and Automation,2007:3565-3572.
[23] BLOESCH M,OMARI S,HUTTER M,et al.Robust visual inertial odometry using a direct EKF-based approach[C]//2015 IEEE/RSJ International Conference on Intelligent Robots and Systems(IROS),2015:298-304.
[24] QIN T,LI P L,SHEN S J.VINS-Mono:a robust and versatile monocular visual-inertial state estimator[J].IEEE Transactions on Robotics,2018,34(4):1004-1020.