Title:
Imitation Learning for UAS Navigation in Cluttered Environments
Author(s)
Harris, Caleb M.
Choi, Youngjun
Mavris, Dimitri N.
Abstract
Autonomous navigation is a critical component for the use of unmanned aerial systems (UAS) in complex tasks such as package delivery and disaster response. In recent years, these systems have seen increased use in demanding tasks such as search and rescue and disaster relief; however, challenges remain for efficient and safe operation in a fully autonomous mission. This work provides a data-driven, vision-based method for navigating and searching through a cluttered environment that is high-speed, low-cost, and vehicle-agnostic. This is done by first framing obstacle avoidance as a two-dimensional navigation task that can be solved by knowing the relative location of the goal and the 2D image of the obstacle in the camera frame. Imitation learning is used to train a deep neural network from an expert planning policy, while a model predictive controller tracks the target. All processing can be performed onboard the vehicle, under the assumptions that the general target direction is forward of the camera frame and that the global state estimation error is low. The framework and trained model are tested in simulation, with a quadcopter conducting search scenarios in different environments. The resulting framework avoids obstacles more quickly and can be applied to small, low-cost systems with a single monocular camera.
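To illustrate the imitation-learning step described in the abstract, the following is a minimal behavioral-cloning sketch, not the authors' implementation: a policy network maps a monocular camera image plus the relative goal location to a 2D navigation command and is trained to match labels from an expert planner. All names (NavigationPolicy, expert_actions, the network sizes) are hypothetical, and random tensors stand in for the expert demonstration data.

```python
import torch
import torch.nn as nn

class NavigationPolicy(nn.Module):
    """Hypothetical policy: monocular image + relative goal -> 2D command."""
    def __init__(self):
        super().__init__()
        # Convolutional encoder for the monocular camera image.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Head fuses image features with the 2D relative goal direction.
        self.head = nn.Sequential(
            nn.Linear(32 + 2, 64), nn.ReLU(),
            nn.Linear(64, 2),  # 2D navigation command
        )

    def forward(self, image, goal):
        return self.head(torch.cat([self.encoder(image), goal], dim=1))

policy = NavigationPolicy()
optimizer = torch.optim.Adam(policy.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batch of expert demonstrations: images, relative goals,
# and the expert planner's commands (stand-ins for the real dataset).
images = torch.rand(8, 3, 120, 160)
goals = torch.rand(8, 2)
expert_actions = torch.rand(8, 2)

for epoch in range(5):
    pred = policy(images, goals)
    loss = loss_fn(pred, expert_actions)  # behavioral cloning: imitate the expert
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

At deployment the trained policy would supply navigation targets while a model predictive controller tracks them, as the abstract describes; the controller is outside the scope of this sketch.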
Date Issued
2021-01-04
Resource Type
Text
Resource Subtype
Paper