Title:
Auditory Evidence Grids
dc.contributor.author | Martinson, Eric | |
dc.contributor.author | Schultz, Alan | |
dc.contributor.corporatename | Georgia Institute of Technology | |
dc.contributor.corporatename | Naval Research Laboratory (U.S.) | |
dc.date.accessioned | 2008-03-17T18:27:45Z | |
dc.date.available | 2008-03-17T18:27:45Z | |
dc.date.issued | 2006 | |
dc.description.abstract | Sound source localization on a mobile robot can be a difficult task due to a variety of problems inherent to a real environment, including robot ego-noise, echoes, and the transient nature of ambient noise. As a result, source localization data are often noisy and unreliable. In this work, we overcome some of these problems by combining localization evidence across a variety of robot poses using an evidence grid. The result is a representation that localizes the pertinent sound-producing objects well over time, can be used to filter poor localization results, and may also be useful for global re-localization of the robot from sound localization results. | en_US |
dc.identifier.uri | http://hdl.handle.net/1853/20531 | |
dc.language.iso | en_US | en_US |
dc.publisher | Georgia Institute of Technology | en_US |
dc.subject | Auditory mapping | en_US |
dc.subject | Evidence grid | en_US |
dc.subject | Mobile robots | en_US |
dc.subject | Sound source localization | en_US |
dc.title | Auditory Evidence Grids | en_US |
dc.type | Text | |
dc.type.genre | Paper | |
dspace.entity.type | Publication | |
local.contributor.corporatename | College of Computing | |
local.contributor.corporatename | Mobile Robot Laboratory | |
local.contributor.corporatename | Institute for Robotics and Intelligent Machines (IRIM) | |
relation.isOrgUnitOfPublication | c8892b3c-8db6-4b7b-a33a-1b67f7db2021 | |
relation.isOrgUnitOfPublication | 488966cd-f689-41af-b678-bbd1ae9c01d4 | |
relation.isOrgUnitOfPublication | 66259949-abfd-45c2-9dcc-5a6f2c013bcf | |