Commun, Domitille
Mavris, Dimitri N.
Several video applications rely on camera calibration, a key enabler for the measurement of metric parameters from images. For instance, monitoring environmental changes through remote cameras, such as changes in glacier size, or measuring vehicle speed from security cameras, requires the cameras to be calibrated. Calibrating a camera is necessary to implement accurate computer vision techniques for the automated analysis of video footage. This automated analysis saves cost and time in a variety of fields, such as manufacturing, civil engineering, architecture, and safety. The number of cameras installed and operated continues to increase, and a large portion of them are "hard-to-reach" cameras: installed cameras that cannot be removed from their location without altering the camera parameters or interrupting the camera's operational use. Examples include remote sensing cameras and security cameras. Many of these cameras are not calibrated, and calibrating them is a key need as applications for automated measurements from their video continue to grow.

Existing calibration methods fall into two groups: object-based calibration, which relies on a calibration target of known dimensions, and self-calibration, which relies on camera motion or scene geometry constraints. However, neither has been adapted for remote cameras that are hard to reach and have large fields of view. Object-based calibration requires a tedious, manual process that is not suited to a large field of view, while self-calibration works only under restrictive conditions and therefore does not scale to the wide range of hard-to-reach cameras, with their many different parameters and varied viewing scenes.
Based on this need, the research objective of this thesis is to develop a camera calibration method for hard-to-reach cameras. The method must satisfy a series of requirements imposed by the remote status of the cameras being calibrated:

• Be adapted to large fields of view, since these cameras cannot be accessed easily (which prevents the use of object-based calibration techniques)
• Be scalable to various environments (which is not feasible with self-calibration techniques that require strict assumptions about the scene)
• Be automated, to enable the calibration of the large number of already installed cameras
• Be able to correct for the large non-linear distortion that is frequently present in these cameras

In response to this calibration need, this thesis proposes a solution that relies on a drone or a robot as a moving target to collect the matching 3D and 2D points required for calibration. Localizing the target in 3D space and on the image is subject to errors, so the approach must be tested to evaluate its ability to calibrate cameras despite measurement uncertainties. This work demonstrates the success of the calibration approach using realistic simulations and real-world testing. The approach is robust against localization uncertainties; it is also environment-independent and highly automated, unlike existing calibration techniques. First, this work defines a drone trajectory that covers the entire field of view and enables a robust correspondence between 3D and 2D key points. The corresponding experiment evaluates the calibration quality when the 2D localization is subject to uncertainties. Simulations for several cameras demonstrate that a moving target following this trajectory yields a complete training set and an accurate calibration, with an RMS reprojection error of 3.2 pixels on average.
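The RMS reprojection error used as the accuracy metric above can be sketched as follows; this is a minimal illustration assuming an ideal pinhole projection without distortion, and the 3D drone positions, 2D detections, and intrinsic parameters below are hypothetical values, not data from the thesis.

```python
import math

def project(point_3d, f, cx, cy):
    # Pinhole projection of a camera-frame 3D point (metres) to pixels.
    X, Y, Z = point_3d
    return (f * X / Z + cx, f * Y / Z + cy)

def rms_reprojection_error(points_3d, points_2d, f, cx, cy):
    # Root-mean-square pixel distance between observed 2D detections
    # and the reprojection of their matched 3D points.
    sq_sum = 0.0
    for p3, (u, v) in zip(points_3d, points_2d):
        up, vp = project(p3, f, cx, cy)
        sq_sum += (u - up) ** 2 + (v - vp) ** 2
    return math.sqrt(sq_sum / len(points_3d))

# Hypothetical 3D target positions (camera frame) and image detections.
pts3 = [(1.0, 0.5, 10.0), (-2.0, 1.0, 12.0), (0.0, -1.5, 8.0)]
pts2 = [(990.0, 585.0), (751.5, 615.0), (899.0, 371.5)]
err = rms_reprojection_error(pts3, pts2, f=900.0, cx=900.0, cy=540.0)
print(round(err, 2))  # prints 1.05
```

In the actual calibration, the intrinsic parameters are the unknowns being solved for; minimizing this error over the collected 3D-2D correspondences is what produces the reported per-camera accuracy figures.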
This error is below the 3.6-pixel threshold derived in this thesis, which corresponds to an accurate calibration. The drone design is then modified to add a marker that improves the target detection accuracy. Experiment 2 demonstrates the robustness of this solution in challenging conditions, such as complex environments for target detection. The modified drone design improves the calibration accuracy, with an RMS reprojection error of 2.4 pixels on average, and remains detectable despite backgrounds or flight conditions that complicate target detection. This research also develops a strategy to evaluate the impact of camera parameters, drone path parameters, and 3D and 2D localization uncertainties on the calibration accuracy. Applying this strategy to 5000 simulated camera models yields recommendations for the path parameters of the drone-based calibration approach and highlights the impact of camera parameters on the calibration accuracy. It shows that specific sampling step lengths lead to a better calibration and characterizes the relationship between the drone-camera distance and the accuracy. This experiment also evaluates the RMS reprojection error for the 5000 cameras; the average of this error is 4 pixels. Linked to the speed measurement application, a 4-pixel error corresponds to a speed measurement error smaller than 0.5 km/h when measuring the speed of a vehicle 15 meters away with a pinhole camera of focal length 900 pixels. The knowledge gained from these experiments is applied in a real-world test, which completes the demonstration of the drone-based camera calibration approach. The real test uses a commercial drone and GPS, in an urban environment and against a challenging background. This hardware experiment documents the steps to follow to reproduce the drone-based remote camera calibration technique.
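The quoted speed-error figure can be sanity-checked with simple pinhole arithmetic. Note that the one-second interval between position fixes below is an assumption introduced here for illustration; the thesis abstract does not state the frame interval used.

```python
f_px = 900.0   # focal length in pixels (scenario from the thesis)
Z_m = 15.0     # vehicle distance from the camera, metres
err_px = 4.0   # average RMS reprojection error, pixels
dt_s = 1.0     # assumed time between position fixes (not in the source)

# At distance Z, one pixel spans roughly Z / f metres on the target plane.
metres_per_pixel = Z_m / f_px            # ~0.0167 m per pixel
pos_err_m = err_px * metres_per_pixel    # position error per frame

# Worst case: the two position fixes err in opposite directions.
speed_err_kmh = (2 * pos_err_m / dt_s) * 3.6
print(round(speed_err_kmh, 2))  # prints 0.48
```

Under these assumptions the worst-case speed error is about 0.48 km/h, consistent with the "smaller than 0.5 km/h" bound stated above.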
The calibration error equals 7.7 pixels and can be reduced if an RTK GPS is used as the 3D localization sensor. Finally, using an optimization process on several simulated cameras, this work demonstrates that the sampling size can be reduced by more than half for a faster calibration while maintaining good calibration accuracy.