Title: Illumination compensation in video surveillance analysis

dc.contributor.advisor Wills, D. Scott
dc.contributor.advisor Wills, Linda M.
dc.contributor.author Bales, Michael Ryan en_US
dc.contributor.committeeMember Bader, David
dc.contributor.committeeMember Howard, Ayanna
dc.contributor.committeeMember Kim, Jongman
dc.contributor.committeeMember Romberg, Justin
dc.contributor.department Electrical and Computer Engineering en_US
dc.date.accessioned 2011-07-06T16:46:37Z
dc.date.available 2011-07-06T16:46:37Z
dc.date.issued 2011-03-30 en_US
dc.description.abstract Problems in automated video surveillance analysis caused by illumination changes are explored, and solutions are presented. Controlled experiments are first conducted to measure the responses of color targets to changes in lighting intensity and spectrum. Surfaces of dissimilar color are found to respond significantly differently. Illumination compensation model error is reduced by 70% to 80% by individually optimizing model parameters for each distinct color region, and applying a model tuned for one region to a chromatically different region increases error by a factor of 15. A background model--called BigBackground--is presented to extract large, stable, chromatically self-similar background features by identifying the dominant colors in a scene. The stability and chromatic diversity of these features make them useful reference points for quantifying illumination changes. The model is observed to cover as much as 90% of a scene, and pixels belonging to the model are 20% more stable on average than non-member pixels. Several illumination compensation techniques are developed to exploit BigBackground and are compared with several compensation techniques from the literature. The techniques are evaluated in terms of foreground/background classification and are applied to an object tracking pipeline with kinematic and appearance-based correspondence mechanisms. Compared with other techniques, BigBackground-based techniques improve foreground classification by 25% to 43%, improve tracking accuracy by an average of 20%, and better preserve object appearance for appearance-based trackers. All algorithms are implemented in C or C++ so that runtime performance can be evaluated. In terms of execution speed, the BigBackground-based illumination compensation technique is measured to run on par with the simplest compensation technique used for comparison, and consistently achieves twice the frame rate of the two next-fastest techniques. en_US
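The abstract describes BigBackground as a background model built from a scene's dominant colors. The C++ sketch below is one hypothetical reading of that idea, not the dissertation's actual implementation: it quantizes pixels into coarse RGB bins, keeps the most populous bins until a target fraction of the frame is covered, and marks pixels in those bins as candidate BigBackground members. The bin resolution, the coverage parameter, and the function names are illustrative assumptions.

// Hypothetical sketch of a dominant-color background mask in the spirit of
// BigBackground. Bin size, coverage threshold, and names are assumptions.
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>
#include <vector>

struct Pixel { uint8_t r, g, b; };

// Quantize a pixel into one of 16 x 16 x 16 = 4096 coarse color bins.
static int colorBin(const Pixel& p) {
    return (p.r >> 4) * 256 + (p.g >> 4) * 16 + (p.b >> 4);
}

// Mark pixels whose coarse color bin is among the scene's dominant colors.
// Returns a per-pixel membership mask (true = candidate BigBackground pixel).
std::vector<bool> bigBackgroundMask(const std::vector<Pixel>& frame,
                                    double coverage = 0.90) {
    std::array<int, 4096> hist{};                 // frequency of each color bin
    for (const Pixel& p : frame) ++hist[colorBin(p)];

    // Sort bins by frequency and keep the most populous ones until the
    // requested fraction of the frame is covered.
    std::vector<int> bins(4096);
    for (int i = 0; i < 4096; ++i) bins[i] = i;
    std::sort(bins.begin(), bins.end(),
              [&](int a, int b) { return hist[a] > hist[b]; });

    std::array<bool, 4096> dominant{};
    std::size_t covered = 0;
    const std::size_t target =
        static_cast<std::size_t>(coverage * frame.size());
    for (int b : bins) {
        if (covered >= target || hist[b] == 0) break;
        dominant[b] = true;
        covered += static_cast<std::size_t>(hist[b]);
    }

    std::vector<bool> mask(frame.size());
    for (std::size_t i = 0; i < frame.size(); ++i)
        mask[i] = dominant[colorBin(frame[i])];
    return mask;
}

Per the abstract, pixels in such a mask would then serve as stable, chromatically diverse reference points for quantifying illumination changes across frames.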
dc.description.degree Ph.D. en_US
dc.identifier.uri http://hdl.handle.net/1853/39535
dc.publisher Georgia Institute of Technology en_US
dc.subject Tracking en_US
dc.subject Color en_US
dc.subject Computer vision en_US
dc.subject Background model en_US
dc.subject BigBackground en_US
dc.subject Illumination change en_US
dc.subject.lcsh Video surveillance
dc.subject.lcsh Lighting
dc.subject.lcsh Video recording--Lighting
dc.title Illumination compensation in video surveillance analysis en_US
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.advisor Wills, Linda M.
local.contributor.corporatename School of Electrical and Computer Engineering
local.contributor.corporatename College of Engineering
relation.isAdvisorOfPublication c965b932-6dbb-46d3-8e30-6d7809f2f9b6
relation.isOrgUnitOfPublication 5b7adef2-447c-4270-b9fc-846bd76f80f2
relation.isOrgUnitOfPublication 7c022d60-21d5-497c-b552-95e489a06569
Files
Original bundle
Name: bales_michael_r_201105_phd.pdf
Size: 4.36 MB
Format: Adobe Portable Document Format