METHOD OF SIMULTANEOUS LOCALIZATION AND MAPPING FOR CONSTRUCTION OF 2.5D MAPS OF THE ENVIRONMENT USING ROS

Authors

Igor Nevlyudov, Sergiy Novoselov, Konstantin Sukhachov

DOI:

https://doi.org/10.30837/ITSSI.2023.24.145

Keywords:

SLAM; ROS; 2D LRF; RGB-D; 2.5D height map; simultaneous localization and mapping methods; intelligent robot; position estimation; modeling and simulation

Abstract

SLAM (Simultaneous Localization and Mapping) is an active topic of research and development in robotics and computer vision. It is widely applied in autonomous navigation of intelligent robots, augmented and virtual reality, UAVs, and other systems. In recent years, SLAM has made significant progress thanks to the steady development of its algorithms, the use of advanced sensors, and the growth of available computational power. The subject of this study is modern methods of real-time simultaneous localization and mapping. The goal of the research is to simulate the developed algorithm for constructing maps of the surrounding environment and determining the position and orientation of an intelligent robot in space in real time using ROS packages. The purpose of this article is to present the results of combining SLAM methods and to develop new approaches to the simultaneous localization and mapping problem. To achieve these objectives, laser scanning (2D LRF) and depth image reconstruction (RGB-D) were combined for simultaneous localization and mapping of the intelligent robot and construction of a 2.5D map of the environment. The obtained results are promising and demonstrate the potential of the integrated SLAM methods, which work together to provide accurate simultaneous localization and mapping for intelligent robots in real time. The proposed method takes obstacle heights into account when constructing the environment map while requiring less computational power. In conclusion, this approach extends existing technologies without replacing proven solutions and enables comprehensive detection and recognition of the surrounding environment through an efficient localization and mapping approach, providing more accurate results with fewer resources.
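For illustration only, the minimal sketch below shows one way such a 2.5D height map could be assembled in ROS by combining a 2D laser scan with RGB-D depth points. It is not the authors' implementation: the topic names (/scan, /camera/depth/points), the grid parameters, and the assumption that both sensors report in a common map frame are hypothetical.

```python
#!/usr/bin/env python
# Illustrative sketch (ROS 1, rospy): fusing a 2D laser scan with RGB-D points
# into a simple 2.5D height grid. Topic names, grid size/resolution, and the
# shared-frame assumption are examples, not the paper's actual configuration.
import math
import numpy as np
import rospy
from sensor_msgs.msg import LaserScan, PointCloud2
import sensor_msgs.point_cloud2 as pc2

RESOLUTION = 0.05  # metres per cell (assumed)
SIZE = 400         # 20 m x 20 m grid centred on the start pose (assumed)

occupancy = np.zeros((SIZE, SIZE), dtype=np.int8)  # 2D occupancy layer (from 2D LRF)
height = np.full((SIZE, SIZE), -np.inf)            # per-cell maximum height (from RGB-D)


def to_cell(x, y):
    """Convert metric coordinates (assumed common map frame) to grid indices."""
    i = int(x / RESOLUTION) + SIZE // 2
    j = int(y / RESOLUTION) + SIZE // 2
    return (i, j) if 0 <= i < SIZE and 0 <= j < SIZE else None


def scan_cb(scan):
    # Mark cells hit by laser beams as occupied (obstacle outline at sensor height).
    for k, r in enumerate(scan.ranges):
        if scan.range_min < r < scan.range_max:
            a = scan.angle_min + k * scan.angle_increment
            cell = to_cell(r * math.cos(a), r * math.sin(a))
            if cell:
                occupancy[cell] = 100


def cloud_cb(cloud):
    # Project RGB-D points onto the grid and keep the maximum z per cell,
    # turning the flat occupancy map into a 2.5D height map.
    for x, y, z in pc2.read_points(cloud, field_names=("x", "y", "z"), skip_nans=True):
        cell = to_cell(x, y)
        if cell:
            height[cell] = max(height[cell], z)


if __name__ == "__main__":
    rospy.init_node("height_map_sketch")
    rospy.Subscriber("/scan", LaserScan, scan_cb)                     # 2D LRF
    rospy.Subscriber("/camera/depth/points", PointCloud2, cloud_cb)   # RGB-D
    rospy.spin()
```

In a real system the cell coordinates would be computed from the SLAM-estimated robot pose (e.g., via tf transforms) rather than assumed to share a frame, and the height layer would be fused with the occupancy layer produced by the 2D SLAM back end.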

Author Biographies

Igor Nevlyudov, Kharkiv National University of Radio Electronics

Doctor of Sciences (Engineering), Professor, Head of the Department of Computer-Integrated Technologies, Automation and Mechatronics

Sergiy Novoselov, Kharkiv National University of Radio Electronics

PhD (Engineering Sciences), Associate Professor, Professor at the Department of Computer-Integrated Technologies, Automation and Mechatronics

Konstantin Sukhachov, Kharkiv National University of Radio Electronics

Master's Student at the Department of Computer-Integrated Technologies, Automation and Mechatronics

Published

2023-11-13

How to Cite

Nevlyudov, I., Novoselov, S., & Sukhachov, K. (2023). METHOD OF SIMULTANEOUS LOCALIZATION AND MAPPING FOR CONSTRUCTION OF 2.5D MAPS OF THE ENVIRONMENT USING ROS. INNOVATIVE TECHNOLOGIES AND SCIENTIFIC SOLUTIONS FOR INDUSTRIES, (2 (24)), 145–160. https://doi.org/10.30837/ITSSI.2023.24.145