(Georgia Institute of Technology, 2007-08)
Sattigeri, Ramachandra J.; Johnson, Eric N.; Calise, Anthony J.; Ha, Jin-Cheol
This paper presents an approach to vision-based target tracking with a neural network
(NN) augmented Kalman filter as the adaptive target state estimator. The vision sensor
onboard the follower (tracker) aircraft is a single camera. Real-time image processing
implemented in the onboard flight computer is used to derive measurements of relative
bearing (azimuth and elevation angles) and the maximum angle subtended by the target
aircraft on the image plane. These measurements update the NN-augmented Kalman filter,
which generates estimates of the target aircraft's position, velocity, and acceleration in
inertial 3D space; these estimates are used in the guidance and flight control law to guide
the follower aircraft relative to the target aircraft. Applications of the presented approach
include vision-based autonomous formation flight, pursuit and autonomous aerial refueling.
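Since the angle measurements named above (azimuth, elevation, and the maximum subtended angle) determine relative position only up to the target's physical size, a common simplification is to assume that size is known. The sketch below, with an assumed camera-frame convention and a span-normal-to-line-of-sight subtense model (both illustrative assumptions, not taken from the paper), shows the forward and inverse measurement maps:

```python
import numpy as np

def bearing_and_subtense(rel_pos, span):
    """Forward model: relative position (camera frame) -> measurements.

    rel_pos: target position relative to the follower, metres
             (assumed convention: x forward, y right, z down).
    span:    target's maximum physical extent (e.g. wingspan), metres.
    Returns (azimuth, elevation, subtended angle) in radians.
    """
    x, y, z = rel_pos
    r = np.linalg.norm(rel_pos)
    az = np.arctan2(y, x)
    el = np.arctan2(-z, np.hypot(x, y))
    # Maximum subtended angle, span assumed normal to the line of sight
    alpha = 2.0 * np.arctan2(0.5 * span, r)
    return az, el, alpha

def relative_position(az, el, alpha, span):
    """Inverse model: measurements plus known span -> relative position."""
    r = 0.5 * span / np.tan(0.5 * alpha)  # range from the subtended angle
    return np.array([r * np.cos(el) * np.cos(az),
                     r * np.cos(el) * np.sin(az),
                     -r * np.sin(el)])

# Round trip: target 100 m ahead, 20 m right, 10 m above, 10 m wingspan.
p = np.array([100.0, 20.0, -10.0])
az, el, alpha = bearing_and_subtense(p, span=10.0)
p_rec = relative_position(az, el, alpha, span=10.0)
```

In the paper the filter is updated with the angle measurements directly; the inverse map here only illustrates that the measurement set is position-informative once the target's size is known, with the recovered relative position still to be rotated into inertial coordinates.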
The NN augmenting the Kalman filter estimates the target's acceleration and thereby provides
robust state estimation in the presence of unmodeled target maneuvers. Results from a
real-time, vision-in-the-loop, six-degree-of-freedom (6DOF) simulation of vision-based
autonomous formation flight illustrate the efficacy of the adaptive target state estimator
design.
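The adaptive-estimator idea can be illustrated with a minimal one-dimensional sketch: a Kalman filter with a constant-velocity model whose prediction step is augmented by a small neural network that learns the target's unmodeled acceleration from the filter innovation. The scalar position measurement, the fixed random hidden layer, the normalized delta-rule weight update, and all gains are illustrative stand-ins, not the paper's actual measurement model or adaptation law:

```python
import numpy as np

dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity dynamics
B = np.array([0.5 * dt**2, dt])        # how acceleration enters the state
Hrow = np.array([1.0, 0.0])            # position-only (scalar) measurement
Q = np.outer(B, B)                     # accel-driven process noise (1 m/s^2 std)
R = 1e-4                               # measurement noise variance

rng = np.random.default_rng(0)

# Single hidden layer with fixed random weights; only w2 adapts online.
W1 = rng.standard_normal((8, 2))
b1 = rng.standard_normal(8)
w2 = np.zeros(8)
gamma = 2.0                            # adaptation gain (assumed)

def nn_accel(x):
    h = np.tanh(W1 @ x + b1)
    return float(w2 @ h), h

x_hat = np.zeros(2)
P = np.eye(2)
x_true = np.zeros(2)
a_true = 1.0                           # unmodeled constant maneuver

for _ in range(3000):
    # Truth propagation with the acceleration the filter does not model
    x_true = F @ x_true + B * a_true
    z = Hrow @ x_true + rng.normal(0.0, 0.01)

    # Predict, with the NN-estimated acceleration augmenting the model
    a_hat, h = nn_accel(x_hat)
    x_pred = F @ x_hat + B * a_hat
    P = F @ P @ F.T + Q

    # Kalman update on the scalar position measurement
    y = z - Hrow @ x_pred              # innovation
    S = Hrow @ P @ Hrow + R
    K = P @ Hrow / S
    x_hat = x_pred + K * y
    P = (np.eye(2) - np.outer(K, Hrow)) @ P

    # Normalized delta rule: integrate the innovation into the output weights
    # so any persistent prediction bias is absorbed as learned acceleration.
    w2 += gamma * y * h / (h @ h + 1e-9)

a_final, _ = nn_accel(x_hat)
```

With the adaptation switched off (gamma = 0), the same constant-velocity filter develops a persistent tracking lag under the constant maneuver; driving the time-averaged innovation toward zero is what lets the adaptive element recover the unmodeled acceleration.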