Person:
Johnson, Eric N.


Publication Search Results

Now showing 1 - 2 of 2

Vision-Based Optimal Landing On a Moving Platform

2016-05, Nakamura, Takuma, Haviland, Stephen, Bershadsky, Dmitry, Johnson, Eric N.

This paper describes a vision-based control architecture designed to enable autonomous landing on a moving platform. The landing trajectory is generated using receding-horizon differential dynamic programming (DDP), an optimal control method. The trajectory generation is aided by the output of a vision-based target tracking system. The vision system uses multiple extended Kalman filters, which allow us to estimate the position and heading of the moving target from the observed locations. The combination of the vision-based target tracking system and the receding-horizon DDP gives an unmanned aerial vehicle the capability to adaptively generate a landing trajectory in the presence of tracking errors and disturbances. Additionally, by adding an exterior penalty function to the DDP cost, we can easily constrain the trajectory away from collisions and physically infeasible solutions. We provide the key mathematics needed for the implementation and share the results of image-in-the-loop simulation and flight tests to validate the suggested methodology.
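As a rough illustration of how an exterior penalty can be folded into a DDP cost, the sketch below adds a quadratic penalty for a hypothetical altitude-floor constraint; the state layout, weights, and constraint are illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def exterior_penalty(g, mu=100.0):
    """Quadratic exterior penalty for an inequality constraint g(x) <= 0.

    The penalty is zero inside the feasible region and grows quadratically
    outside it, so the cost stays smooth for the DDP solver while
    infeasible trajectories are strongly discouraged.
    """
    return mu * max(0.0, g) ** 2

def running_cost(x, u, target_xy, Q=1.0, R=0.1, z_min=0.5):
    """Illustrative running cost for one trajectory point.

    x         : state [x, y, z] (z = altitude)
    u         : control inputs
    target_xy : estimated target position from the vision system
    The weights Q, R and the altitude floor z_min are made-up values.
    """
    tracking = Q * float(np.sum((np.asarray(x[:2]) - np.asarray(target_xy)) ** 2))
    effort = R * float(np.sum(np.asarray(u) ** 2))
    # Constraint z_min - z <= 0 (stay above the floor), penalized outside.
    collision = exterior_penalty(z_min - x[2])
    return tracking + effort + collision
```

A receding-horizon DDP solver would evaluate such a running cost at every point of the candidate trajectory, so the penalty steers the optimizer away from infeasible solutions without introducing hard constraints.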


Vision-Based Closed-Loop Tracking Using Micro Air Vehicles

2016, Nakamura, Takuma, Haviland, Stephen, Bershadsky, Dmitry, Magree, Daniel, Johnson, Eric N.

This paper describes the target detection and tracking architecture used by the Georgia Tech Aerial Robotics team for the American Helicopter Society (AHS) Micro Aerial Vehicle (MAV) challenge. The vision system described enables vision-aided navigation with additional capabilities, such as target detection and tracking, all performed on the vehicle's onboard computer. The authors propose a robust target tracking method that does not depend solely on the image obtained from a camera, but also utilizes the other sensor outputs and runs a target location estimator. The machine-learning-based target identification method uses Haar-like classifiers to extract target candidate points. The raw measurements are fed into multiple extended Kalman filters (EKFs). A statistical test (Z-test) is used to bound the measurements and solve the correspondence problem. Using multiple EKFs allows us not only to optimally estimate the target location, but also to use that information as one of the criteria for evaluating tracking performance. The MAV utilizes performance-based criteria that determine whether or not to initiate a maneuver such as hovering over or landing on the target. The performance criteria are closed in the loop, which allows the system to determine at any time whether or not to continue the maneuver. For the vision-aided inertial navigation system (VINS), a Harris corner detector finds the feature points, and we track them using statistical knowledge. The feature point locations are integrated in a Bierman-Thornton extended Kalman filter (BTEKF) with inertial measurement unit (IMU) and sonar sensor outputs to generate the vehicle states: position, velocity, attitude, and accelerometer and gyroscope biases. A 6-degrees-of-freedom quadrotor flight simulator was developed to test the suggested method. This paper provides the simulation results of the vision-based maneuvers: hovering over the target and landing on the target.
In addition to the simulation results, flight tests were conducted to validate the system performance. The 500-gram Georgia Tech Quadrotor (GTQ)-Mini was used for the flight tests. All processing is done onboard the vehicle, and it is able to operate without human interaction. Both the simulation and flight test results show the effectiveness of the suggested method. This system and vehicle were used for the AHS 2015 MAV Student Challenge, which required a GPS-denied closed-loop target search. The vehicle successfully found the ground target and landed at the desired location. This paper shares the data obtained from the competition.
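The Z-test gating of raw measurements described above can be sketched as a generic per-axis innovation gate; the variable names and the 3-sigma threshold are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def z_test_gate(z, z_pred, S, threshold=3.0):
    """Gate a raw measurement against an EKF's predicted measurement.

    z         : measured target location
    z_pred    : measurement predicted from the filter state
    S         : innovation covariance (H P H^T + R)
    threshold : number of standard deviations to accept (assumed 3-sigma)

    Each component of the innovation is normalized by its standard
    deviation (a per-axis Z-test); the measurement is rejected if any
    component falls outside the gate, bounding what each EKF accepts.
    """
    innovation = np.asarray(z, dtype=float) - np.asarray(z_pred, dtype=float)
    sigma = np.sqrt(np.diag(np.asarray(S, dtype=float)))
    return bool(np.all(np.abs(innovation) <= threshold * sigma))
```

A measurement that passes the gate would be applied in the usual EKF update; the rejection rate can also feed a tracking-performance criterion, since frequent rejections suggest the filter has lost the target.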