Organizational Unit:
School of Psychology


Publication Search Results

  • Item
    Understanding older adults' perceptions of usefulness of an assistive home robot
    (Georgia Institute of Technology, 2013-11-21) Beer, Jenay M.
    Developing robots that are useful to older adults is more than simply creating robots that complete household tasks. To ensure that older adults perceive a robot to be useful, careful consideration of the users’ capabilities, robot autonomy, and task is needed (Venkatesh & Davis, 2000). The purpose of this study was to investigate the construct of perceived usefulness within the context of robot assistance. Mobile older adults (N = 12) and older adults with mobility loss (N = 12) participated in an autonomy-selection think-aloud task and a persona-based interview. Findings suggest that older adults with mobility loss preferred an autonomy level where they command/control the robot themselves. Mobile older adults’ preferences were split between commanding/controlling the robot themselves and the robot commanding/controlling itself. Reasons for their preferences were related to decision making and were task specific. Additionally, findings from the persona-based interview support Technology Acceptance Model (TAM) constructs, as well as adaptability, reliability, and trust, as positively correlated with perceptions of usefulness. However, despite the positive correlation, barriers and facilitators of acceptance identified in the interview suggest that perceived usefulness judgments are complex, and some questionnaire constructs were interpreted differently between participants. Thus, care should be taken when applying TAM constructs to other domains, such as robot assistance to promote older adult independence.
  • Item
    Understanding the construct of human trust in domestic service robots
    (Georgia Institute of Technology, 2013-11-18) Olson, Katherine E.
    Simple robots are already being deployed and adopted by some consumers for use at home. The robots currently in development for home use are far more sophisticated. However, it was not known to what extent humans would trust them. The purpose of this study was to identify factors that influence trust in domestic service robots across a range of users with different capabilities and experience levels. Twelve younger adults (aged 18-28) and 24 older adults (12 low technology users and 12 high technology users) aged 65-75 participated in a structured interview, card-sorting task, and several questionnaires. Most participants had heard about or seen robots, but indicated they had little experience with them. However, most had positive opinions about robots and indicated they would trust a robot to assist with tasks in their homes, though it was dependent on the task. Before making a decision to trust a robot, participants wanted to know a lot of information about the robot, such as robot reliability, capabilities, and limitations. When asked to select their trust preference for human versus robot assistance for specific tasks, participants had preferences for both human and robot assistance, although it was dependent on the task. Many participants defined trust in robots similarly to definitions of trust in automation (Ezer, 2008; Jian et al., 2000). Additionally, they had high rates of selection for adjectives used to describe trust in automation, and also selected some adjectives used to describe trust in humans, when asked to select characteristics they most associated with trustworthy and untrustworthy robots. Overall, there were some differences between age and technology experience groups, but there were far more similarities. By carefully considering user needs, robot designers can develop robots that have the potential to be adopted by a wide range of people.
  • Item
    Emotion and motion: age-related differences in recognizing virtual agent facial expressions
    (Georgia Institute of Technology, 2011-10-05) Smarr, Cory-Ann
    Technological advances will allow virtual agents to increasingly help individuals with daily activities. As such, virtual agents will interact with users of various ages and experience levels. Facial expressions are often used to facilitate social interaction between agents and humans. However, older and younger adults do not label human or virtual agent facial expressions in the same way, with older adults commonly mislabeling certain expressions. The dynamic formation of facial expression, or motion, may provide additional facial information potentially making emotions less ambiguous. This study examined how motion affects younger and older adults in recognizing various intensities of emotion displayed by a virtual agent. Contrary to the dynamic advantage found in emotion recognition for human faces, older adults had higher emotion recognition for static virtual agent faces than dynamic ones. Motion condition did not influence younger adults' emotion recognition. Younger adults had higher emotion recognition than older adults for the emotions of anger, disgust, fear, happiness, and sadness. Low intensities of expression had lower emotion recognition than medium to high expression intensities.
  • Item
    Recognizing facial expression of virtual agents, synthetic faces, and human faces: the effects of age and character type on emotion recognition
    (Georgia Institute of Technology, 2010-04-08) Beer, Jenay Michelle
    An agent's facial expression may communicate emotive state to users both young and old. The ability to recognize emotions has been shown to differ with age, with older adults more commonly misidentifying the facial emotions of anger, fear, and sadness. This research study examined whether emotion recognition of facial expressions differed between different types of on-screen agents, and between age groups. Three on-screen characters were compared: a human, a synthetic human, and a virtual agent. In this study 42 younger (aged 18-28) and 42 older (aged 65-85) adults completed an emotion recognition task with static pictures of the characters demonstrating four basic emotions (anger, fear, happiness, and sadness) and neutral. The human face resulted in the highest proportion match, followed by the synthetic human, then the virtual agent with the lowest proportion match. Both the human and synthetic human faces resulted in age-related differences for the emotions anger, fear, sadness, and neutral, with younger adults showing higher proportion match. The virtual agent showed age-related differences for the emotions anger, fear, happiness, and neutral, with younger adults showing higher proportion match. The data analysis and interpretation of the present study differed from previous work by utilizing two unique approaches to understanding emotion recognition. First, misattributions participants made when identifying emotion were investigated. Second, a similarity index of the feature placement between any two virtual agent emotions was calculated, suggesting that emotions were commonly misattributed as other emotions similar in appearance. Overall, these results suggest that age-related differences transcend human faces to other types of on-screen characters, and differences between older and younger adults in emotion recognition may be further explained by perceptual discrimination between two emotions of similar feature appearance.
  • Item
    Exploring everyday privacy behaviors and misclosures
    (Georgia Institute of Technology, 2009-12-08) Caine, Kelly Erinn
    As access to information changes with increased use of technology, privacy becomes an increasingly prominent issue among technology users. Privacy concerns should be taken seriously because they influence system adoption, the way a system is used, and may even lead to system disuse. Threats to privacy are not only due to traditional security and privacy issues; human factors issues such as unintentional disclosure of information also influence the preservation of privacy in technology systems. A dual-pronged approach was used to examine privacy. First, a broad investigation of younger and older adults' privacy behaviors was conducted. The goal of this study was to gain a better understanding of privacy across technologies, to discover the similarities, and to identify the differences in what privacy means across contexts, as well as to provide a means to evaluate current theories of privacy. This investigation resulted in a categorization of privacy behaviors associated with technology. Three high-level categories of privacy behavior were identified: avoidance, modification, and alleviatory behavior. This categorization furthers our understanding of the psychological underpinnings of privacy concerns and suggests that 1) common privacy feelings and behaviors exist across people and technologies and 2) alternative designs which consider these commonalities may increase privacy. Second, I examined one specific human factors issue associated with privacy: disclosure error. This investigation focused on gaining an understanding of how to support privacy by preventing misclosure. A misclosure is an error in disclosure. When information is disclosed in error, or misclosed, privacy is violated in that information not intended for a specific person(s) is nevertheless revealed to that person. The goal of this study was to provide a psychological basis for design suggestions for improving privacy in technology which was grounded in empirical findings.
The study furthers our understanding of privacy errors in the following ways: First, it demonstrates for the first time that both younger and older adults experience misclosures. Second, it suggests that misclosures occur even when technology is very familiar to the user. Third, it revealed that some misclosure experiences result in negative consequences, suggesting misclosure is a potential threat to privacy. Finally, by exploring the context surrounding each reported misclosure, I was able to propose potential design suggestions that may decrease the likelihood of misclosure.
  • Item
    Is a robot an appliance, teammate, or friend? age-related differences in expectations of and attitudes toward personal home-based robots
    (Georgia Institute of Technology, 2008-11-11) Ezer, Neta
    Future advances in technology may allow home-based robots to perform complex collaborative activities with individuals of different ages. Two studies were conducted to understand the expectations of and attitudes toward home-based robots by younger and older adults. One study involved questionnaires sent to 2500 younger adults (aged 18-28) and 2500 older adults (aged 65-86) in the Atlanta Metropolitan area. One hundred and eighty questionnaires were completed and returned by individuals in the targeted age groups. For the questionnaire, participants were asked to imagine a robot in their home and then to answer questions about how well characteristics matched their imagined robot. Participants' technology and robot experience, demographic information, and health information were also collected. In conjunction with the questionnaire study, twelve younger adults (aged 19-26) and twenty-four older adults in two sub-age groups (younger-older, aged 65-75, and older-older aged 77-85) were interviewed about their expectations of and attitudes toward a robot in their home. They were asked to imagine a robot in their home and answer numerous questions about the tasks their envisioned robot would perform, the appearance of the robot, and other general questions about their interaction with the robot. The results of the studies suggest that individuals have many different ideas about what a robot in the home would be like. Mostly, they want a robot to perform mundane or repetitive tasks, such as cleaning, and picture a robot as a time-saving device. However, individuals are willing to have a robot perform other types of tasks, if they see benefits of having the robot perform those tasks. The ability of the robot to perform tasks efficiently, with minimal effort on the part of the human, appears to be more important in determining acceptance of the robot than its social ability or appearance. 
Overall, individuals both younger and older seem to be very open to the idea of a robot in their home as long as it is useful and not too difficult to use.
  • Item
    The manipulation of user expectancies: effects on reliance, compliance, and trust using an automated system
    (Georgia Institute of Technology, 2008-03-31) Mayer, Andrew K.
    As automated technologies continue to advance, they will be perceived more as collaborative team members and less as simply helpful machines. Expectations of the likely performance of others play an important role in how their actual performance is judged (Stephan, 1985). Although user expectations have been expounded as important for human-automation interaction, this factor has not been systematically investigated. The purpose of the current study was to examine the effect that older and younger adults' expectations of likely automation performance have on human-automation interaction. In addition, this study investigated the effect of different automation errors (false alarms and misses) on dependence, reliance, compliance, and trust in an automated system. Findings suggest that expectancy effects are relatively short-lived, significantly affecting reliance and compliance only through the first experimental block. The effects of type of automation error indicate that participants in a false alarm condition increase reliance and decrease compliance, while participants in a miss condition do not change their behavior. The results are important because expectancies must be considered when designing training for human-automation interaction. In addition, understanding the effects of type of automation errors is crucial for the design of automated systems. For example, if the automation is designed for diverse and dynamic environments where automation performance may fluctuate, then a deeper understanding of automation functioning may be needed by users.
  • Item
    Effects of mental model quality on collaborative system performance
    (Georgia Institute of Technology, 2008-03-31) Wilkison, Bart D.
    As the tasks humans perform become more complicated and the technology manufactured to support those tasks becomes more adaptive, the relationship between humans and automation transforms into a collaborative system. In this system each member depends on the input of the other to reach a predetermined goal beneficial to both parties. Studying the human/automation dynamic as a social team provides a new set of variables affecting performance previously unstudied by automation researchers. One such variable is the shared mental model (Mathieu, Heffner, Goodwin, Salas, & Cannon-Bowers, 2000). This study examined the relationship between mental model quality and collaborative system performance within the domain of a navigation task. Participants navigated through a simulated city with the help of a navigational system performing at two levels of accuracy: 70% and 100%. Participants with robust mental models of the task environment identified automation errors when they occurred and optimally navigated to destinations. Conversely, users with vague mental models were less likely to identify automation errors, and chose inefficient routes to destinations. Thus, mental model quality proved to be an efficient predictor of navigation performance. Additionally, participants with no mental model performed as well as participants with vague mental models; the difference between these groups lay in the number and type of errors committed. This research is important as it supports previous assertions that humans and automated systems can work as teammates and perform teamwork (Nass, Fogg, & Moon, 2000). Thus, other variables found to impact human/human team performance might also affect human/automation team performance, just as this study explored the effects of the mental model, a variable drawn primarily from human/human team research.
Additionally, this research suggests that a training program creating a weak, inaccurate, or incomplete mental model in the user is equivalent to no training program in terms of performance. Finally, through a qualitative model, this study proposes that mental model quality affects the constructs of user self-confidence and trust in automation. These two constructs are thought to ultimately determine automation usage (Lee & Moray, 1994). To validate the model, a follow-on study is proposed to measure automation usage as mental model quality changes.
  • Item
    Toward an understanding of optimal performance within a human-automation collaborative system: Effects of error and verification costs
    (Georgia Institute of Technology, 2006-11-20) Ezer, Neta
    Automated products, especially automated decision aids, have the potential to improve the lives of older adults by supporting their daily needs. Although automation seems promising in this arena, there is evidence that humans, in general, tend to have difficulty optimizing their behavior with a decision aid, and older adults even more so. In a human-automation collaborative system, the ability to balance costs involved in relying on the automation and those involved in verifying the automation is essential for optimal performance and error minimization. Thus, this study was conducted to better understand the processes associated with balancing these costs and also to examine age differences in these processes. Cost of reliance on automation was evaluated using an object counting task. Participants were required to indicate the number of circles on a display, with support coming from a computer estimate decision aid. They were instructed to rely on the aid if they believed its answer or to verify the aid by manually counting the circles on the screen if they did not believe the aid to be correct. Manipulations in this task were the cost of a wrong answer, either -5, -10, -25, or -50 points, and the cost of verification, either high or low. It was expected that participants would develop a general pattern of appropriate reliance across the cost conditions, but would not change their reliance behavior enough to reach optimality. Older adults were expected to rely on the decision aid to a lesser extent than younger adults in all conditions, yet rate the automation as being more reliable. It was found that older and younger adults did not show large differences in reliance, although older adults tended to be more resistant than younger adults to changing their reliance in response to costs. Both age groups significantly underutilized the computer estimate, yet overestimated its reliability.
The results are important because it may be necessary to design automated devices and training programs differently for older adults than for younger adults, to direct them towards an optimal strategy of reliance.
  • Item
    Privacy Perceptions of Visual Sensing Devices: Effects of Users' Ability and Type of Sensing Device
    (Georgia Institute of Technology, 2006-07-17) Caine, Kelly E.
    Homes that can collaborate with their residents rather than simply provide shelter are becoming a reality. These homes, such as Georgia Tech's Aware Home and MIT's house_n, can potentially provide support to their residents. Because aging adults may be faced with increasing mental and/or physical limitation(s), they may stand to benefit, in particular, from supports provided by these homes if they utilize the technologies they offer. However, the advanced technology in these aware homes often makes use of sensing devices that capture some kind of image-based information. Image-based information capture has previously been shown to elicit privacy concerns among users, and even lead to disuse of the system. The purpose of this study was to explore the privacy concerns that older adults had about a home equipped with visual sensing devices. Using a scenario-based structured interview approach, I investigated how the type of images the home captures, as well as the physical and mental health of the residents of the home, affected privacy concerns and perceived benefits. In addition, responses to non-scenario-based open-ended structured interview questions were used to gain an understanding of the characteristics of the influential variables. Results suggest that although most older adults express some concerns about using a visual sensing device in their home, the potential benefits of having such a device in specific circumstances outweigh their concerns. These findings have implications for privacy and technology acceptance theory as well as for designers of home-based visual monitoring systems.