Title:
Saliency Detection and Model-based Tracking: a Two Part Vision System for Small Robot Navigation in Forested Environments

dc.contributor.author Roberts, Richard
dc.contributor.author Ta, Duy-Nguyen
dc.contributor.author Straub, Julian
dc.contributor.author Ok, Kyel
dc.contributor.author Dellaert, Frank
dc.contributor.corporatename Georgia Institute of Technology. College of Computing
dc.contributor.corporatename Georgia Institute of Technology. Center for Robotics and Intelligent Machines
dc.date.accessioned 2012-09-24T18:55:42Z
dc.date.available 2012-09-24T18:55:42Z
dc.date.issued 2012-05-01
dc.description ©Copyright 2012 Society of Photo-Optical Instrumentation Engineers. One print or electronic copy may be made for personal use only. Systematic electronic or print reproduction and distribution, duplication of any material in this paper for a fee or for commercial purposes, or modification of the content of the paper are prohibited. http://dx.doi.org/10.1117/12.919598 en_US
dc.description Presented at Unmanned Systems Technology XIV - SPIE Defense, Security, and Sensing, April 25-27, 2012, Baltimore, MD.
dc.description DOI: 10.1117/12.919598
dc.description.abstract Toward the goal of fast, vision-based autonomous flight, localization, and map building to support local planning and control in unstructured outdoor environments, we present a method for incrementally building a map of salient tree trunks while simultaneously estimating the trajectory of a quadrotor flying through a forest. We make significant progress in a class of visual perception methods that produce low-dimensional, geometric information ideal for planning and navigation on aerial robots, while directing computational resources using motion saliency, which selects objects that are important to navigation and planning. By low-dimensional geometric information, we mean coarse geometric primitives, which for the purposes of motion planning and navigation are suitable proxies for real-world objects. Additionally, we develop a method for summarizing past image measurements that avoids expensive computations on a history of images while maintaining the key non-linearities that make full map and trajectory smoothing possible. We demonstrate results with data from a small, commercially available quadrotor flying in a challenging, forested environment. en_US
dc.identifier.citation Richard Roberts, Duy-Nguyen Ta, Julian Straub, Kyel Ok, and Frank Dellaert, "Saliency detection and model-based tracking: a two part vision system for small robot navigation in forested environments," Proc. SPIE 8387, Unmanned Systems Technology XIV, 83870S (May 1, 2012). en_US
dc.identifier.doi 10.1117/12.919598
dc.identifier.uri http://hdl.handle.net/1853/44943
dc.language.iso en_US en_US
dc.publisher Georgia Institute of Technology en_US
dc.publisher.original SPIE
dc.subject Forested environment en_US
dc.subject Localization map building en_US
dc.subject Low-dimensional geometric information en_US
dc.subject Map building en_US
dc.subject Motion planning en_US
dc.subject Motion saliency en_US
dc.subject Navigation en_US
dc.subject Outdoor environments en_US
dc.subject Quadrotor en_US
dc.subject Vision-based autonomous flight en_US
dc.title Saliency Detection and Model-based Tracking: a Two Part Vision System for Small Robot Navigation in Forested Environments en_US
dc.type Text
dc.type.genre Proceedings
dspace.entity.type Publication
local.contributor.author Dellaert, Frank
local.contributor.corporatename Institute for Robotics and Intelligent Machines (IRIM)
local.contributor.corporatename College of Computing
relation.isAuthorOfPublication dac80074-d9d8-4358-b6eb-397d95bdc868
relation.isOrgUnitOfPublication 66259949-abfd-45c2-9dcc-5a6f2c013bcf
relation.isOrgUnitOfPublication c8892b3c-8db6-4b7b-a33a-1b67f7db2021
Files
Original bundle
Name: Roberts12spie.pdf
Size: 2.52 MB
Format: Adobe Portable Document Format