Person:
Kemp, Charles C.


Publication Search Results

Now showing 1 - 5 of 5

RFID-Guided Robots for Pervasive Automation

2010-01-15 , Deyle, Travis , Nguyen, Hai , Reynolds, Matt S. , Kemp, Charles C.

Passive UHF RFID tags are well matched to robots' needs. Unlike low-frequency (LF) and high-frequency (HF) RFID tags, passive UHF RFID tags are readable from across a room, enabling a mobile robot to efficiently discover and locate them. Using tags' unique IDs, a semantic database, and RF perception via actuated antennas, this paper shows how a robot can reliably interact with people and manipulate labeled objects.
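The abstract's idea of pairing a tag's unique ID with a semantic database can be sketched as a simple keyed lookup. This is an illustrative sketch only: the tag IDs, field names, and objects below are invented, not taken from the paper.

```python
# Hypothetical semantic database keyed by a UHF RFID tag's unique ID.
# All IDs and record fields are invented for illustration.
SEMANTIC_DB = {
    "E200-3412-0001": {"name": "medicine bottle", "graspable": True},
    "E200-3412-0002": {"name": "light switch", "graspable": False},
}

def lookup_tag(tag_id):
    """Return the semantic record for a detected tag ID, or None if unknown."""
    return SEMANTIC_DB.get(tag_id)
```

Because UHF tag IDs are globally unique, a single read is enough to disambiguate visually identical objects, which is the property the paper exploits.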


A foveated passive UHF RFID system for mobile manipulation

2008-09 , Deyle, Travis , Anderson, Cressel D. , Kemp, Charles C. , Reynolds, Matt S.

We present a novel antenna and system architecture for mobile manipulation based on passive RFID technology operating in the 850 MHz - 950 MHz ultra-high-frequency (UHF) spectrum. This system exploits the electromagnetic properties of UHF radio signals to present a mobile robot with both wide-angle 'peripheral vision', sensing multiple tagged objects in the area in front of the robot, and focused, high-acuity 'central vision', sensing only tagged objects close to the end effector of the manipulator. These disparate tasks are performed using the same UHF RFID tag, coupled in two different electromagnetic modes. Wide-angle sensing is performed with an antenna designed for far-field electromagnetic wave propagation, while focused sensing is performed with a specially designed antenna mounted on the end effector that optimizes near-field magnetic coupling. We refer to this RFID system as 'foveated', by analogy with the anatomy of the human eye. We report a series of experiments on an untethered autonomous mobile manipulator in a 2.5D environment that demonstrate the features of this architecture using two novel behaviors, one in which data from the far-field antenna is used to determine if a specific tagged object is present in the robot's working area and to navigate to that object, and a second using data from the near-field antenna to grasp a specified object from a collection of visually identical objects. The same UHF RFID tag is used to facilitate both the navigation and grasping tasks.
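The foveated split described above, far-field antennas for wide-area discovery and a near-field end-effector antenna for close-in sensing, amounts to a mode-selection policy. A minimal sketch, with an invented distance threshold standing in for the actual near-field coupling range:

```python
def choose_antenna(distance_to_target_m, near_field_range_m=0.15):
    """Pick the sensing mode for a foveated RFID system (illustrative only).

    Far-field 'peripheral vision' covers the robot's working area;
    the near-field 'central vision' antenna on the end effector only
    couples to tags within a short range (threshold here is assumed).
    """
    if distance_to_target_m <= near_field_range_m:
        return "near_field"  # high-acuity sensing at the gripper
    return "far_field"       # wide-angle discovery and navigation
```

The key design point from the paper is that both modes read the very same tag, so no extra labeling is needed to hand off from navigation to grasping.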


RF vision: RFID receive signal strength indicator (RSSI) images for sensor fusion and mobile manipulation

2009-10 , Deyle, Travis , Nguyen, Hai , Reynolds, Matt S. , Kemp, Charles C.

In this work we present a set of integrated methods that enable an RFID-enabled mobile manipulator to approach and grasp an object to which a self-adhesive passive (battery-free) UHF RFID tag has been affixed. Our primary contribution is a new mode of perception that produces images of the spatial distribution of received signal strength indication (RSSI) for each of the tagged objects in an environment. The intensity of each pixel in the 'RSSI image' is the measured RF signal strength for a particular tag in the corresponding direction. We construct these RSSI images by panning and tilting an RFID reader antenna while measuring the RSSI value at each bearing. Additionally, we present a framework for estimating a tagged object's 3D location using fused ID-specific features derived from an RSSI image, a camera image, and a laser range finder scan. We evaluate these methods using a robot with actuated, long-range RFID antennas and finger-mounted short-range antennas. The robot first scans its environment to discover which tagged objects are within range, creates a user interface, orients toward the user-selected object using RF signal strength, estimates the 3D location of the object using an RSSI image with sensor fusion, approaches and grasps the object, and uses its finger-mounted antennas to confirm that the desired object has been grasped. In our tests, the sensor fusion system with an RSSI image correctly located the requested object in 17 out of 18 trials (94.4%), an 11.1% improvement over the system's performance when not using an RSSI image. The robot correctly oriented to the requested object in 8 out of 9 trials (88.9%), and in 3 out of 3 trials the entire system successfully grasped the object selected by the user.
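The RSSI-image construction in the abstract, panning and tilting a reader antenna and recording signal strength at each bearing, can be sketched as a scan loop over a 2D grid. The `read_rssi` callable below stands in for the actual reader query and is an assumption, as are the angle grids:

```python
import numpy as np

def rssi_image(read_rssi, pan_angles_deg, tilt_angles_deg):
    """Build an 'RSSI image': one pixel per (pan, tilt) bearing, whose
    intensity is the measured signal strength for one tag at that bearing.
    `read_rssi(pan, tilt)` is a stand-in for querying the RFID reader."""
    img = np.zeros((len(tilt_angles_deg), len(pan_angles_deg)))
    for i, tilt in enumerate(tilt_angles_deg):
        for j, pan in enumerate(pan_angles_deg):
            img[i, j] = read_rssi(pan, tilt)
    return img

def strongest_bearing(img, pan_angles_deg, tilt_angles_deg):
    """The brightest pixel gives the bearing toward the tag."""
    i, j = np.unravel_index(np.argmax(img), img.shape)
    return pan_angles_deg[j], tilt_angles_deg[i]
```

In the paper this per-tag bearing estimate is then fused with camera and laser range-finder features to get a 3D object location; the sketch stops at the bearing step.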


Probabilistic UHF RFID tag pose estimation with multiple antennas and a multipath RF propagation model

2008-09 , Deyle, Travis , Kemp, Charles C. , Reynolds, Matt S.

We present a novel particle filter implementation for estimating the pose of tags in the environment with respect to an RFID-equipped robot. This particle filter combines signals from a specially designed RFID antenna system with odometry and an RFID signal propagation model. Our model includes antenna characteristics, direct-path RF propagation, and multipath RF propagation. We first describe a novel 6-antenna RFID sensor system that provides the robot with a 360-degree view of the tags in its environment. We then present the results of a real-world evaluation in which RFID-inferred tag position is compared with ground-truth data from a laser range-finder. In our experiments the system is shown to estimate the pose of UHF RFID tags in a real-world environment without requiring a priori training or map-building. The system exhibits 6.1 deg mean bearing error and 0.69 m mean range error over robot-to-tag distances of over 4 m in an environment with significant multipath. The RFID system provides the ability to uniquely identify specific tagged locations and objects, and to discriminate among multiple tagged objects in the field at the same time, which are important capabilities that a laser range-finder does not provide. We expect that this new type of multiple-antenna RFID system, including particle filters that incorporate RF signal propagation models, will prove to be a valuable sensor for mobile robots operating in semi-structured environments where RFID tags are present.
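One measurement update of such a particle filter can be sketched as: weight each hypothesized tag position by how well a propagation model predicts the observed RSSI, then resample. This is a generic sketch, not the paper's implementation; its model additionally includes antenna characteristics and multipath terms, and `expected_rssi` below is an assumed stand-in for that propagation model:

```python
import math
import random

def particle_filter_step(particles, weights, rssi_meas, expected_rssi, sigma=3.0):
    """One measurement update over hypothesized 2D tag positions.

    particles: list of (x, y) position hypotheses
    rssi_meas: the RSSI value actually observed
    expected_rssi(x, y): stand-in propagation model predicting RSSI
    sigma: assumed measurement-noise scale (dB)
    """
    # Reweight: particles whose predicted RSSI matches the measurement
    # receive higher likelihood under a Gaussian noise model.
    new_w = []
    for (x, y), w in zip(particles, weights):
        err = rssi_meas - expected_rssi(x, y)
        new_w.append(w * math.exp(-err * err / (2.0 * sigma * sigma)))
    total = sum(new_w) or 1.0
    new_w = [w / total for w in new_w]
    # Resample to concentrate particles in high-likelihood regions.
    resampled = random.choices(particles, weights=new_w, k=len(particles))
    return resampled, [1.0 / len(particles)] * len(particles)
```

In practice this update would alternate with a motion step driven by the robot's odometry, which is the other input the paper's filter combines.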


PPS-Tags: Physical, Perceptual and Semantic Tags for Autonomous Mobile Manipulation

2009-10 , Nguyen, Hai , Deyle, Travis , Reynolds, Matt S. , Kemp, Charles C.

For many promising application areas, autonomous mobile manipulators do not yet exhibit sufficiently robust performance. We propose the use of tags applied to task-relevant locations in human environments in order to help autonomous mobile manipulators physically interact with the location, perceive the location, and understand the location’s semantics. We call these tags physical, perceptual and semantic tags (PPS-tags). We present three examples of PPS-tags, each of which combines compliant and colorful material with a UHF RFID tag. The RFID tag provides a unique identifier that indexes into a semantic database that holds information such as the following: what actions can be performed at the location, how can these actions be performed, and what state changes should be observed upon task success? We also present performance results for our robot operating on a PPS-tagged light switch, rocker light switch, lamp, drawer, and trash can. We tested the robot performing the available actions from 4 distinct locations with each of these 5 tagged devices. For the light switch, rocker light switch, lamp, and trash can, the robot succeeded in all trials (24/24). The robot failed to open the drawer when starting from an oblique angle, and thus succeeded in 6 out of 8 trials. We also tested the ability of the robot to detect failure in unusual circumstances, such as the lamp being unplugged and the drawer being stuck.
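The PPS-tag database described above answers three questions per tag: what actions are available, how to perform them, and what state change confirms success. A minimal sketch of such a record, with all tag IDs, behaviors, and field names invented for illustration:

```python
# Hypothetical PPS-tag semantic records; structure and contents are invented.
PPS_DB = {
    "tag-lamp-01": {
        "device": "lamp",
        "actions": {
            "turn_on": {
                "behavior": "press_tag_surface",
                "expect": "light_level_increases",
            },
            "turn_off": {
                "behavior": "press_tag_surface",
                "expect": "light_level_decreases",
            },
        },
    },
}

def expected_state_change(tag_id, action):
    """Which observation should confirm task success for this action?"""
    return PPS_DB[tag_id]["actions"][action]["expect"]
```

Storing the expected state change with the action is what lets the robot detect failures such as the unplugged lamp mentioned at the end of the abstract: it performs the behavior, then checks whether the expected observation actually occurred.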