Title:
A UAV-ENABLED CALIBRATION METHOD FOR REMOTE CAMERAS ROBUST TO LOCALIZATION UNCERTAINTY

dc.contributor.advisor Mavris, Dimitri N.
dc.contributor.author Commun, Domitille
dc.contributor.committeeMember Pradalier, Cedric
dc.contributor.committeeMember Kennedy, Graeme
dc.contributor.committeeMember Fischer, Olivia
dc.contributor.committeeMember Balchanos, Michael
dc.contributor.department Aerospace Engineering
dc.date.accessioned 2022-01-14T16:07:02Z
dc.date.available 2022-01-14T16:07:02Z
dc.date.created 2021-12
dc.date.issued 2021-08-23
dc.date.submitted December 2021
dc.date.updated 2022-01-14T16:07:02Z
dc.description.abstract Several video applications rely on camera calibration, a key enabler for measuring metric parameters from images. For instance, monitoring environmental changes through remote cameras, such as changes in glacier size, or measuring vehicle speed from security cameras, requires the cameras to be calibrated. Calibrating a camera is necessary to implement accurate computer vision techniques for the automated analysis of video footage, which saves cost and time in a variety of fields such as manufacturing, civil engineering, architecture, and safety. The number of cameras installed and operated continues to increase, and a large portion of them are "hard-to-reach" cameras: installed cameras that cannot be removed from their location without affecting the camera parameters or the camera's operational use, such as remote sensing cameras or security cameras. Many of these cameras are not calibrated, and the ability to calibrate them is a key need as applications for automated measurements from the video they provide continue to grow.

Existing calibration methods can be divided into two groups: object-based calibration, which relies on a calibration target of known dimensions, and self-calibration, which relies on camera motion or scene geometry constraints. Neither has been adapted for remote cameras that are hard to reach and have large fields of view. Object-based calibration requires a tedious, manual process that is not suited to a large field of view, while self-calibration works only under restrictive conditions and therefore does not scale to the wide variety of hard-to-reach cameras, with their many different parameters and viewing scenes.

Based on this need, the research objective of this thesis is to develop a camera calibration method for hard-to-reach cameras. The method must satisfy a series of requirements that follow from the remote status of the cameras being calibrated:
• Be adapted to large fields of view, since these cameras cannot be accessed easily (which prevents the use of object-based calibration techniques)
• Be scalable to various environments (which is not feasible using self-calibration techniques that require strict assumptions about the scene)
• Be automated, to enable the calibration of the large number of already installed cameras
• Be able to correct the large non-linear distortion that is frequently present with these cameras

In response to this calibration need, the thesis proposes a solution that relies on a drone or a robot as a moving target to collect the 3D and 2D matching points required for the calibration. Because localizing the target in 3D space and on the image is subject to errors, the approach must be tested to evaluate its ability to calibrate cameras despite measurement uncertainties. This work demonstrates the success of the calibration approach using realistic simulations and real-world testing. The approach is robust to localization uncertainties, environment independent, and highly automated, in contrast to existing calibration techniques. First, this work defines a drone trajectory that covers the entire field of view and enables a robust correspondence between 3D and 2D key points.
The corresponding experiment evaluates the calibration quality when the 2D localization is subject to uncertainties. Simulations for several cameras demonstrate that a moving target following this trajectory allows a complete training set to be collected and results in an accurate calibration, with an RMS reprojection error of 3.2 pixels on average. This error is below the 3.6-pixel threshold derived in this thesis as corresponding to an accurate calibration. The drone design is then modified to add a marker that improves target detection accuracy. Experiment 2 demonstrates the robustness of this solution in challenging conditions, such as environments in which the target is difficult to detect. The modified drone design improves the calibration accuracy, with an RMS reprojection error of 2.4 pixels on average, and remains suitable for detection despite backgrounds or flight conditions that complicate target detection.

This research also develops a strategy to evaluate the impact of camera parameters, drone path parameters, and 3D and 2D localization uncertainties on the calibration accuracy. Applying this strategy to 5000 simulated camera models leads to recommendations for the path parameters of the drone-based calibration approach and highlights the impact of camera parameters on the calibration accuracy. It shows that specific sampling step lengths lead to a better calibration and characterizes the relationship between the drone-camera distance and the accuracy. The experiment also evaluates the RMS reprojection error for the 5000 cameras, which averages 4 pixels. Linking this result to the speed measurement application, a 4-pixel error corresponds to a speed measurement error smaller than 0.5 km/h when measuring the speed of a vehicle 15 meters away using a pinhole camera with a focal length of 900 pixels.

The knowledge gained from these experiments is applied in a real-world test, which completes the demonstration of the drone-based camera calibration approach. The real test uses a commercial drone and GPS, in an urban environment and with a challenging background, and shows the steps to follow to reproduce the drone-based remote camera calibration technique. The calibration error equals 7.7 pixels and can be reduced if an RTK GPS is used as the 3D localization sensor. Finally, this work demonstrates, using an optimization process for several simulated cameras, that the sampling size can be reduced by more than half for a faster calibration while maintaining good calibration accuracy.
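As an illustration of the calibration step described in the abstract, the sketch below shows one plausible way to estimate camera parameters from the kind of data the drone collects: matched 3D target positions (e.g., from GPS) and 2D detections in the image, all expressed in a single world frame. This is a minimal OpenCV-based sketch, not the implementation from the dissertation; the initial focal-length guess, the five-parameter distortion model, and the use of a single non-planar "view" in cv2.calibrateCamera are assumptions made for the example.

import numpy as np
import cv2

def calibrate_from_drone_points(world_pts, image_pts, image_size):
    """Estimate intrinsics and distortion from matched 3D/2D drone observations.

    world_pts : (N, 3) array of target positions in a world frame (e.g., GPS/RTK)
    image_pts : (N, 2) array of the corresponding detections in the image (pixels)
    image_size: (width, height) of the camera image in pixels
    """
    w, h = image_size
    # OpenCV needs an initial intrinsic guess when the 3D points are non-coplanar,
    # so start from a rough focal length and the image center (illustrative values).
    f0 = 0.8 * w
    K0 = np.array([[f0, 0.0, w / 2.0],
                   [0.0, f0, h / 2.0],
                   [0.0, 0.0, 1.0]])
    dist0 = np.zeros(5)  # start with no distortion

    # Treat all accumulated observations as one "view" of a non-planar point set.
    obj = [world_pts.astype(np.float32).reshape(-1, 1, 3)]
    img = [image_pts.astype(np.float32).reshape(-1, 1, 2)]

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj, img, image_size, K0, dist0, flags=cv2.CALIB_USE_INTRINSIC_GUESS)

    # rms is the RMS reprojection error in pixels, i.e. the root of the mean squared
    # distance between detected and reprojected points -- the metric the abstract
    # reports (3.2 px, 2.4 px, 4 px, 7.7 px).
    return rms, K, dist

# Hypothetical usage with collected correspondences and a 1920x1080 camera:
# rms, K, dist = calibrate_from_drone_points(world_pts, image_pts, (1920, 1080))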
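The abstract's link between reprojection error and speed measurement error can also be checked with the pinhole model. The 1-second measurement interval and the assumption that both endpoint detections are off by the full 4 pixels in opposite directions are illustrative choices, not values taken from the dissertation. A pixel error \( \Delta u \) at depth \( Z = 15\ \text{m} \) with focal length \( f = 900\ \text{px} \) maps to a position error

\[ \Delta X = \frac{Z}{f}\,\Delta u = \frac{15\ \text{m}}{900\ \text{px}} \times 4\ \text{px} \approx 0.067\ \text{m}, \]

and if the speed is estimated from the displacement between two detections taken \( \Delta t = 1\ \text{s} \) apart,

\[ \Delta v \lesssim \frac{2\,\Delta X}{\Delta t} \approx 0.13\ \text{m/s} \approx 0.48\ \text{km/h} < 0.5\ \text{km/h}, \]

which is consistent with the "smaller than 0.5 km/h" figure quoted above.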
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri http://hdl.handle.net/1853/66057
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject UAV
dc.subject Remote Cameras
dc.subject Calibration
dc.subject Optimization
dc.subject Trajectory
dc.subject Path
dc.subject GPS uncertainty
dc.subject Image detection uncertainty
dc.title A UAV-ENABLED CALIBRATION METHOD FOR REMOTE CAMERAS ROBUST TO LOCALIZATION UNCERTAINTY
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Mavris, Dimitri N.
local.contributor.corporatename Daniel Guggenheim School of Aerospace Engineering
local.contributor.corporatename Aerospace Systems Design Laboratory (ASDL)
local.contributor.corporatename College of Engineering
local.relation.ispartofseries Doctor of Philosophy with a Major in Aerospace Engineering
relation.isAdvisorOfPublication d355c865-c3df-4bfe-8328-24541ea04f62
relation.isOrgUnitOfPublication a348b767-ea7e-4789-af1f-1f1d5925fb65
relation.isOrgUnitOfPublication a8736075-ffb0-4c28-aa40-2160181ead8c
relation.isOrgUnitOfPublication 7c022d60-21d5-497c-b552-95e489a06569
relation.isSeriesOfPublication f6a932db-1cde-43b5-bcab-bf573da55ed6
thesis.degree.level Doctoral
Files
Original bundle:
Name: COMMUN-DISSERTATION-2021.pdf
Size: 30.54 MB
Format: Adobe Portable Document Format
License bundle:
Name: LICENSE.txt
Size: 3.87 KB
Format: Plain Text