Title
Automated 3D vision-based tracking of construction entities

Author(s)
Park, Man-Woo
Advisor(s)
Brilakis, Ioannis
Abstract
On construction sites, tracking project-related entities such as equipment, materials, and personnel provides useful information for productivity measurement, progress monitoring, on-site safety enhancement, and activity sequence analysis. Radio frequency technologies such as the Global Positioning System (GPS), Radio Frequency Identification (RFID), and Ultra Wide Band (UWB) are commonly used for this purpose. However, on large-scale congested sites, deploying, maintaining, and removing such systems can be costly and time-consuming because radio frequency technologies require tagging each entity to be tracked. In addition, privacy concerns can arise from tagging construction workers, which often limits the usability of these technologies on construction sites. A vision-based approach that tracks moving objects in camera views can resolve these problems. The purpose of this research is to investigate a vision-based tracking system that holds promise for overcoming the limitations of existing radio frequency technologies on large-scale, congested sites. The proposed method uses videos from static cameras, and a stereo camera system is employed to track construction entities in 3D. Once the cameras are fixed on the site, their intrinsic and extrinsic parameters are determined through camera calibration. The method automatically detects and tracks objects of interest, such as workers and equipment, in each camera view, generating 2D pixel coordinates of the tracked objects. These 2D pixel coordinates are then converted to 3D real-world coordinates based on the calibration results. The method was implemented in the .NET Framework 4.0 environment and tested on real videos of construction sites. The test results indicated that the method could locate construction entities with accuracy comparable to GPS.
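The 2D-to-3D conversion described in the abstract can be illustrated with a standard linear (DLT) triangulation of corresponding pixel coordinates from two calibrated views. The sketch below is an illustrative assumption in Python/NumPy, not the dissertation's .NET implementation; the projection matrices and the example point are hypothetical values chosen only to show the computation.

```python
import numpy as np

def triangulate_dlt(P1, P2, pt1, pt2):
    """Linear (DLT) triangulation of one point seen in two calibrated views.

    P1, P2 : 3x4 projection matrices (intrinsics times extrinsics) from calibration.
    pt1, pt2 : (u, v) pixel coordinates of the same entity in each camera view.
    Returns the estimated 3D point in world coordinates.
    """
    u1, v1 = pt1
    u2, v2 = pt2
    # Each view contributes two linear constraints on the homogeneous 3D point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A X = 0 in the least-squares sense via SVD; X is the last right singular vector.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point into a camera view to obtain (u, v) pixel coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Hypothetical stereo setup: identity intrinsics, second camera shifted 1 unit
# along the x-axis (values are illustrative only, not from the dissertation).
P_left = np.hstack([np.eye(3), np.zeros((3, 1))])
P_right = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point_3d = np.array([0.5, 0.2, 4.0])
recovered = triangulate_dlt(P_left, P_right,
                            project(P_left, point_3d),
                            project(P_right, point_3d))
print(recovered)  # approximately [0.5, 0.2, 4.0]
```

In practice the pixel coordinates would come from the 2D detection and tracking step in each camera view, and the projection matrices from the site calibration; the triangulated output is what is compared against GPS-grade accuracy in the tests.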
Date Issued
2012-08-21
Resource Type
Text
Resource Subtype
Dissertation