Vision-based Target Tracking with Adaptive Target State Estimator

Author(s)
Sattigeri, Ramachandra J.
Calise, Anthony J.
Ha, Jin-Cheol
Associated Organization(s)
Organizational Unit
Daniel Guggenheim School of Aerospace Engineering
Abstract
This paper presents an approach to vision-based target tracking with a neural network (NN) augmented Kalman filter as the adaptive target state estimator. The vision sensor onboard the follower (tracker) aircraft is a single camera. Real-time image processing implemented in the onboard flight computer is used to derive measurements of relative bearing (azimuth and elevation angles) and of the maximum angle subtended by the target aircraft on the image plane. These measurements are used to update the NN augmented Kalman filter. The filter generates estimates of the target aircraft's position, velocity, and acceleration in inertial 3D space, which are used in the guidance and flight control law to guide the follower aircraft relative to the target aircraft. Applications of the presented approach include vision-based autonomous formation flight, pursuit, and autonomous aerial refueling. The NN augmenting the Kalman filter estimates the target acceleration and hence provides robust state estimation in the presence of unmodeled target maneuvers. Vision-in-the-loop results from a 6DOF real-time simulation of vision-based autonomous formation flight are presented to illustrate the efficacy of the adaptive target state estimator design.
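The abstract's measurement geometry (azimuth, elevation, and the maximum subtended angle of a target of known size) can be sketched as an extended Kalman filter update. This is a minimal illustration, not the paper's estimator: the state here is relative position only, the assumed target span and all function names are hypothetical, and the NN acceleration augmentation and the velocity/acceleration states are omitted for brevity.

```python
import numpy as np

# Assumed (hypothetical) target characteristic size in meters.
TARGET_SPAN = 10.0

def measure(rel_pos, span=TARGET_SPAN):
    """Camera measurement model: azimuth, elevation, and the angle
    subtended by a target of known size `span` at the current range."""
    x, y, z = rel_pos
    rng = np.linalg.norm(rel_pos)
    az = np.arctan2(y, x)                       # azimuth (rad)
    el = np.arctan2(z, np.hypot(x, y))          # elevation (rad)
    alpha = 2.0 * np.arctan(span / (2.0 * rng)) # subtended angle (rad)
    return np.array([az, el, alpha])

def jacobian(h, x, eps=1e-6):
    """Central-difference Jacobian of measurement function h at x."""
    H = np.zeros((h(x).size, x.size))
    for i in range(x.size):
        dx = np.zeros(x.size)
        dx[i] = eps
        H[:, i] = (h(x + dx) - h(x - dx)) / (2.0 * eps)
    return H

def ekf_update(x, P, z, R_meas):
    """One EKF measurement update on relative position x with
    covariance P, given measurement z and measurement noise R_meas."""
    H = jacobian(measure, x)
    S = H @ P @ H.T + R_meas                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x_new = x + K @ (z - measure(x))        # state correction
    P_new = (np.eye(x.size) - K @ H) @ P    # covariance update
    return x_new, P_new
```

With three measurements and three position states, the range becomes observable through the subtended-angle channel (given the assumed target size), which is why a single camera suffices in this setup; a full estimator would also propagate velocity and an acceleration estimate between updates.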
Date
2007-08
Resource Type
Text
Resource Subtype
Proceedings