Person:
Johnson, Eric N.


Publication Search Results

Now showing 1 - 2 of 2
  • Item
    Monocular Visual Mapping for Obstacle Avoidance on UAVs
    (Georgia Institute of Technology, 2014-01) Magree, Daniel ; Mooney, John G. ; Johnson, Eric N.
An unmanned aerial vehicle requires adequate knowledge of its surroundings in order to operate in close proximity to obstacles. UAVs also have strict payload and power constraints which limit the number and variety of sensors available to gather this information. It is desirable, therefore, to enable a UAV to gather information about potential obstacles or interesting landmarks using common and lightweight sensor systems. This paper presents a method of fast terrain mapping with a monocular camera. Features are extracted from camera images and used to update a sequential extended Kalman filter. The feature locations are parameterized in inverse depth to enable fast depth convergence. Converged features are added to a persistent terrain map which can be used for obstacle avoidance and additional vehicle guidance. Simulation results, results from recorded flight test data, and flight test results are presented to validate the algorithm.
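The inverse-depth parameterization mentioned in the abstract can be illustrated with a short sketch. This is not the paper's implementation; the function name and the angle convention (azimuth/elevation in a camera-anchored frame) are assumptions chosen for clarity. The key idea is that a feature is stored as the camera position where it was first observed, a bearing, and an inverse depth rho, so that distant features (rho near zero) remain well behaved in the filter:

```python
import numpy as np

def inverse_depth_to_point(anchor, azimuth, elevation, rho):
    """Recover a 3-D feature position from an inverse-depth parameterization.

    anchor    : 3-vector, camera position at first observation (assumed frame)
    azimuth   : bearing angle to the feature, radians (illustrative convention)
    elevation : elevation angle to the feature, radians
    rho       : inverse depth in 1/m; small rho means a distant feature
    """
    # Unit ray from the anchor toward the feature
    m = np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        -np.sin(elevation),
    ])
    # Point = anchor + (1 / rho) * ray direction
    return np.asarray(anchor, dtype=float) + m / rho
```

Because the mapping is linear in 1/depth rather than depth, an EKF measurement update converges quickly even when the feature is first seen far from the camera.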
  • Item
    GPS-denied Indoor and Outdoor Monocular Vision Aided Navigation and Control of Unmanned Aircraft
    (Georgia Institute of Technology, 2013-05) Chowdhary, Girish ; Johnson, Eric N. ; Magree, Daniel ; Wu, Allen ; Shein, Andy
GPS-denied closed-loop autonomous control of unstable Unmanned Aerial Vehicles (UAVs) such as rotorcraft using information from a monocular camera has been an open problem. Most proposed Vision aided Inertial Navigation Systems (V-INSs) have been too computationally intensive or do not have sufficient integrity for closed-loop flight. We provide an affirmative answer to the question of whether V-INSs can be used to sustain prolonged real-world GPS-denied flight by presenting a V-INS that is validated through autonomous flight-tests over prolonged closed-loop dynamic operation in both indoor and outdoor GPS-denied environments with two rotorcraft unmanned aircraft systems (UASs). The architecture efficiently combines visual feature information from a monocular camera with measurements from inertial sensors. Inertial measurements are used to predict frame-to-frame transition of online selected feature locations, and the difference between predicted and observed feature locations is used to bind in real-time the inertial measurement unit drift, estimate its bias, and account for initial misalignment errors. A novel algorithm to manage a library of features online is presented that can add or remove features based on a measure of relative confidence in each feature location. The resulting V-INS is sufficiently efficient and reliable to enable real-time implementation on resource-constrained aerial vehicles. The presented algorithms are validated on multiple platforms in real-world conditions: through a 16-min flight test, including an autonomous landing, of a 66 kg rotorcraft UAV operating in an uncontrolled outdoor environment without using GPS and through a Micro-UAV operating in a cluttered, unmapped, and gusty indoor environment.
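The online feature-library management described in the abstract (adding and removing features by relative confidence) can be sketched schematically. This is a minimal illustration under assumed data structures, not the paper's algorithm: features are plain dicts carrying a scalar "confidence" score (e.g. derived from innovation statistics), and the library is kept within a fixed budget:

```python
def manage_feature_library(library, candidates, max_features, min_confidence):
    """Prune low-confidence features, then refill from new candidates.

    library        : dict {feature_id: {"confidence": float, ...}} currently tracked
    candidates     : dict of newly detected features with initial confidence scores
    max_features   : budget on tracked features (compute-constrained platform)
    min_confidence : threshold below which a tracked feature is dropped
    """
    # Drop features whose confidence has fallen below the threshold
    kept = {fid: f for fid, f in library.items()
            if f["confidence"] >= min_confidence}
    # Refill from the most confident candidates, up to the budget
    for fid, feat in sorted(candidates.items(),
                            key=lambda kv: -kv[1]["confidence"]):
        if len(kept) >= max_features:
            break
        if fid not in kept:
            kept[fid] = feat
    return kept
```

Capping the library size is what keeps the filter's per-frame cost bounded on a resource-constrained vehicle; the confidence measure itself would come from the estimator, which is outside this sketch.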