Abstract
This paper presents a simple approach for drone navigation that follows a predetermined path using visual input only, without reliance on a Global Positioning System (GPS). A Convolutional Neural Network (CNN) outputs the steering command of the drone in an end-to-end approach. We tested our approach in two simulated environments built in the Unreal Engine, using the AirSim plugin for drone simulation. Results show that the proposed approach, despite its simplicity, achieves an average cross-track distance of less than 2.9 meters in the simulated environment. We also investigate the significance of data augmentation in path following. Finally, we conclude by suggesting possible enhancements for extending our approach to more difficult paths in real life, in the hope that one day visual navigation will become the norm in GPS-denied zones.
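The abstract's headline metric, average cross-track distance, is not defined here; a common formulation is the mean perpendicular distance from each flown position to the nearest segment of the piecewise-linear reference path. A minimal sketch under that assumption (the function names and the 2-D simplification are ours, not the paper's):

```python
import math

def cross_track_distance(p, a, b):
    """Perpendicular distance (metres) from point p to segment a->b,
    clamped to the segment endpoints. All points are 2-D (x, y) tuples."""
    ax, ay = a
    bx, by = b
    px, py = p
    dx, dy = bx - ax, by - ay
    seg_len_sq = dx * dx + dy * dy
    if seg_len_sq == 0.0:  # degenerate segment: a == b
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamping the parameter t to [0, 1].
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / seg_len_sq))
    cx, cy = ax + t * dx, ay + t * dy
    return math.hypot(px - cx, py - cy)

def mean_cross_track(track, path):
    """Average cross-track distance of a flown track (list of positions)
    against a reference path given as a list of waypoints."""
    segments = list(zip(path[:-1], path[1:]))
    errors = [min(cross_track_distance(p, a, b) for a, b in segments)
              for p in track]
    return sum(errors) / len(errors)
```

For example, a track flown one metre to the side of a straight reference path yields a mean cross-track distance of 1.0.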
URL
https://arxiv.org/abs/1905.01658