RGB-D SLAM Technique for an Indoor UAV Robot using Levenberg-Marquardt Optimization Approach

Document Type : Original Research (Full Papers)

Authors

1 Faculty of Electrical, Biomedical and Mechatronics Engineering, Qazvin Branch, Islamic Azad University, Qazvin, Iran

2 Faculty of Electrical, Biomedical and Mechatronics Engineering, Qazvin Branch, Islamic Azad University, Qazvin, Iran

3 Department of Electrical, Biomedical and Mechatronics Engineering, Qazvin Branch, Islamic Azad University, Qazvin, Iran

DOI: 10.22094/jcr.2022.1960424.1273

Abstract

Simultaneous localization and mapping (SLAM) is a practical approach for unmanned aerial vehicles (UAVs) to localize themselves in unknown environments. In a structured arena with sufficient landmarks and adequate lighting, the performance of existing algorithms is satisfactory. However, in a typical indoor environment without a GPS signal, and with poor texture and insufficient lighting, SLAM becomes unstable for navigation owing to the lack of features. In the technique proposed in this article, accuracy and resilience in many unknown situations (both dynamic and static) are enhanced by extracting edge and corner features instead of isolated point features. A pre-processing block is designed to improve the low-quality image frames captured by the RGB-D sensor mounted on the robot. Using a predefined distance function, we filter out dynamic features and thereby handle dynamic scenes in the same manner as static ones. Applied in real time, our proposed strategy effectively reduces the influence of outliers and moving objects on the SLAM process, significantly improving the accuracy of its computed output. We validated our findings on data from the Technical University of Munich (TUM) benchmark to evaluate the proposed method; additionally, our own developed UAV was used for testing. The experimental results indicate that the proposed approach is more precise and less susceptible to scene changes and system noise than existing methods.
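The Levenberg-Marquardt optimization named in the title can be illustrated with a minimal, self-contained sketch. This is not the authors' implementation; it shows the generic algorithm only: a damped Gauss-Newton iteration whose damping factor interpolates between gradient descent (large damping) and Gauss-Newton steps (small damping). The exponential-fit example data at the bottom is hypothetical.

```python
import numpy as np

def levenberg_marquardt(residual_fn, jac_fn, x0, max_iter=100, tol=1e-10):
    """Minimize 0.5 * ||r(x)||^2 using Levenberg-Marquardt damping.

    residual_fn: maps parameter vector x to the residual vector r(x)
    jac_fn:      maps x to the Jacobian of r with respect to x
    """
    x = np.asarray(x0, dtype=float)
    lam = 1e-3  # damping: large -> gradient descent, small -> Gauss-Newton
    cost = 0.5 * np.sum(residual_fn(x) ** 2)
    for _ in range(max_iter):
        r = residual_fn(x)
        J = jac_fn(x)
        H = J.T @ J   # Gauss-Newton approximation of the Hessian
        g = J.T @ r   # gradient of the cost
        # Damped normal equations (Marquardt's diagonal scaling):
        # (H + lam * diag(H)) dx = -g
        dx = np.linalg.solve(H + lam * np.diag(np.diag(H)), -g)
        new_cost = 0.5 * np.sum(residual_fn(x + dx) ** 2)
        if new_cost < cost:
            # Step reduced the cost: accept it and trust the model more.
            x, cost = x + dx, new_cost
            lam = max(lam / 10.0, 1e-12)
            if np.linalg.norm(dx) < tol:
                break
        else:
            # Step increased the cost: reject it and damp more heavily.
            lam *= 10.0
    return x

# Hypothetical example: recover (a, b) of y = a * exp(b * t) from samples.
t = np.linspace(0.0, 1.0, 20)
y = 2.0 * np.exp(-1.5 * t)

def residual(p):
    return p[0] * np.exp(p[1] * t) - y

def jacobian(p):
    e = np.exp(p[1] * t)
    return np.column_stack([e, p[0] * t * e])

p_hat = levenberg_marquardt(residual, jacobian, x0=[1.0, 0.0])
```

In a SLAM back end the residuals would instead be reprojection errors of matched features with respect to the camera pose, but the damping and step-acceptance logic is the same.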

  • Receive Date: 06 June 2022
  • Revise Date: 03 September 2022
  • Accept Date: 11 September 2022
  • First Publish Date: 25 October 2022