Organizational Unit:
School of Psychology


Publication Search Results

Now showing 1 - 8 of 8
  • Item
    Exploring everyday privacy behaviors and misclosures
    (Georgia Institute of Technology, 2009-12-08) Caine, Kelly Erinn
    As access to information changes with increased use of technology, privacy becomes an increasingly prominent issue among technology users. Privacy concerns should be taken seriously because they influence system adoption, the way a system is used, and may even lead to system disuse. Threats to privacy are not only due to traditional security and privacy issues; human factors issues such as unintentional disclosure of information also influence the preservation of privacy in technology systems. A dual-pronged approach was used to examine privacy. First, a broad investigation of younger and older adults' privacy behaviors was conducted. The goal of this study was to gain a better understanding of privacy across technologies, to discover the similarities and differences in what privacy means across contexts, and to provide a means of evaluating current theories of privacy. This investigation resulted in a categorization of privacy behaviors associated with technology. Three high-level privacy behavior categories were identified: avoidance, modification, and alleviatory behavior. This categorization furthers our understanding of the psychological underpinnings of privacy concerns and suggests that 1) common privacy feelings and behaviors exist across people and technologies and 2) alternative designs that consider these commonalities may increase privacy. Second, I examined one specific human factors issue associated with privacy: disclosure error. This investigation focused on gaining an understanding of how to support privacy by preventing misclosure. A misclosure is an error in disclosure. When information is disclosed in error, or misclosed, privacy is violated in that information not intended for a specific person is nevertheless revealed to that person. The goal of this study was to provide a psychological basis, grounded in empirical findings, for design suggestions for improving privacy in technology.
The study furthers our understanding of privacy errors in the following ways: First, it demonstrates for the first time that both younger and older adults experience misclosures. Second, it suggests that misclosures occur even when technology is very familiar to the user. Third, it reveals that some misclosure experiences result in negative consequences, suggesting misclosure is a potential threat to privacy. Finally, by exploring the context surrounding each reported misclosure, I was able to propose potential design suggestions that may decrease the likelihood of misclosure.
  • Item
    Is a robot an appliance, teammate, or friend? Age-related differences in expectations of and attitudes toward personal home-based robots
    (Georgia Institute of Technology, 2008-11-11) Ezer, Neta
    Future advances in technology may allow home-based robots to perform complex collaborative activities with individuals of different ages. Two studies were conducted to understand the expectations of and attitudes toward home-based robots by younger and older adults. One study involved questionnaires sent to 2500 younger adults (aged 18-28) and 2500 older adults (aged 65-86) in the Atlanta metropolitan area. One hundred and eighty questionnaires were completed and returned by individuals in the targeted age groups. For the questionnaire, participants were asked to imagine a robot in their home and then to answer questions about how well characteristics matched their imagined robot. Participants' technology and robot experience, demographic information, and health information were also collected. In conjunction with the questionnaire study, twelve younger adults (aged 19-26) and twenty-four older adults in two sub-age groups (younger-older, aged 65-75, and older-older, aged 77-85) were interviewed about their expectations of and attitudes toward a robot in their home. They were asked to imagine a robot in their home and answer numerous questions about the tasks their envisioned robot would perform, the appearance of the robot, and other general questions about their interaction with the robot. The results of the studies suggest that individuals have many different ideas about what a robot in the home would be like. Mostly, they want a robot to perform mundane or repetitive tasks, such as cleaning, and picture a robot as a time-saving device. However, individuals are willing to have a robot perform other types of tasks if they see benefits of having the robot perform those tasks. The ability of the robot to perform tasks efficiently, with minimal effort on the part of the human, appears to be more important in determining acceptance of the robot than its social ability or appearance.
Overall, individuals both younger and older seem to be very open to the idea of a robot in their home as long as it is useful and not too difficult to use.
  • Item
    The manipulation of user expectancies: effects on reliance, compliance, and trust using an automated system
    (Georgia Institute of Technology, 2008-03-31) Mayer, Andrew K.
    As automated technologies continue to advance, they will be perceived more as collaborative team members and less as simply helpful machines. Expectations of the likely performance of others play an important role in how their actual performance is judged (Stephan, 1985). Although user expectations have been posited as important for human-automation interaction, this factor has not been systematically investigated. The purpose of the current study was to examine the effect that older and younger adults' expectations of likely automation performance have on human-automation interaction. In addition, this study investigated the effect of different automation errors (false alarms and misses) on dependence, reliance, compliance, and trust in an automated system. Findings suggest that expectancy effects are relatively short-lived, significantly affecting reliance and compliance only through the first experimental block. The effects of type of automation error indicate that participants in a false alarm condition increase reliance and decrease compliance, while participants in a miss condition do not change their behavior. The results are important because expectancies must be considered when designing training for human-automation interaction. In addition, understanding the effects of type of automation errors is crucial for the design of automated systems. For example, if the automation is designed for diverse and dynamic environments where automation performance may fluctuate, then a deeper understanding of automation functioning may be needed by users.
  • Item
    Effects of mental model quality on collaborative system performance
    (Georgia Institute of Technology, 2008-03-31) Wilkison, Bart D.
    As the tasks humans perform become more complicated and the technology manufactured to support those tasks becomes more adaptive, the relationship between humans and automation transforms into a collaborative system. In this system each member depends on the input of the other to reach a predetermined goal beneficial to both parties. Studying the human/automation dynamic as a social team provides a new set of variables affecting performance previously unstudied by automation researchers. One such variable is the shared mental model (Mathieu, Heffner, Goodwin, Salas, & Cannon-Bowers, 2000). This study examined the relationship between mental model quality and collaborative system performance within the domain of a navigation task. Participants navigated through a simulated city with the help of a navigational system performing at two levels of accuracy: 70% and 100%. Participants with robust mental models of the task environment identified automation errors when they occurred and navigated optimally to destinations. Conversely, users with vague mental models were less likely to identify automation errors and chose inefficient routes to destinations. Thus, mental model quality proved to be an efficient predictor of navigation performance. Additionally, participants with no mental model performed as well as participants with vague mental models; the difference between these groups was in the number and type of errors committed. This research is important as it supports previous assertions that humans and automated systems can work as teammates and perform teamwork (Nass, Fogg, & Moon, 2000). Thus, other variables found to affect human/human team performance might also affect human/automation team performance, just as this study explored the effects of a variable drawn primarily from human/human team research: the mental model.
Additionally, this research suggests that a training program that creates a weak, inaccurate, or incomplete mental model in the user is equivalent, in terms of performance, to no training program at all. Finally, through a qualitative model, this study proposes that mental model quality affects the constructs of user self-confidence and trust in automation. These two constructs are thought to ultimately determine automation usage (Lee & Moray, 1994). To validate the model, a follow-on study is proposed to measure automation usage as mental model quality changes.
  • Item
    Toward an understanding of optimal performance within a human-automation collaborative system: Effects of error and verification costs
    (Georgia Institute of Technology, 2006-11-20) Ezer, Neta
    Automated products, especially automated decision aids, have the potential to improve the lives of older adults by supporting their daily needs. Although automation seems promising in this arena, there is evidence that humans, in general, tend to have difficulty optimizing their behavior with a decision aid, and older adults even more so. In a human-automation collaborative system, the ability to balance the costs involved in relying on the automation and those involved in verifying the automation is essential for optimal performance and error minimization. Thus, this study was conducted to better understand the processes associated with balancing these costs and also to examine age differences in these processes. Cost of reliance on automation was evaluated using an object counting task. Participants were required to indicate the number of circles on a display, with support coming from a computer estimate decision aid. They were instructed to rely on the aid if they believed its answer, or to verify the aid by manually counting the circles on the screen if they did not believe the aid to be correct. Manipulations in this task were the cost of a wrong answer, either -5, -10, -25, or -50 points, and the cost of verification, either high or low. It was expected that participants would develop a general pattern of appropriate reliance across the cost conditions, but would not change their reliance behavior enough to reach optimality. Older adults were expected to rely on the decision aid to a lesser extent than younger adults in all conditions, yet rate the automation as being more reliable. It was found that older and younger adults did not show large differences in reliance, although older adults tended to be more resistant than younger adults to changing their reliance in response to costs. Both age groups significantly underutilized the computer estimate, yet overestimated its reliability.
The results are important because it may be necessary to design automated devices and training programs differently for older adults than for younger adults, to direct them towards an optimal strategy of reliance.
  • Item
    Privacy Perceptions of Visual Sensing Devices: Effects of Users' Ability and Type of Sensing Device
    (Georgia Institute of Technology, 2006-07-17) Caine, Kelly E.
    Homes that can collaborate with their residents rather than simply provide shelter are becoming a reality. These homes, such as Georgia Tech's Aware Home and MIT's house_n, can potentially provide support to their residents. Because aging adults may be faced with increasing mental and/or physical limitation(s), they may stand to benefit, in particular, from the supports provided by these homes if they utilize the technologies the homes offer. However, the advanced technology in these aware homes often makes use of sensing devices that capture some kind of image-based information. Image-based information capture has previously been shown to elicit privacy concerns among users, and even lead to disuse of the system. The purpose of this study was to explore the privacy concerns that older adults had about a home equipped with visual sensing devices. Using a scenario-based structured interview approach, I investigated how the type of images the home captures, as well as the physical and mental health of the residents of the home, affected privacy concerns and perceived benefits. In addition, responses to non-scenario-based, open-ended structured interview questions were used to gain an understanding of the characteristics of the influential variables. Results suggest that although most older adults express some concerns about using a visual sensing device in their home, the potential benefits of having such a device in specific circumstances outweigh their concerns. These findings have implications for privacy and technology acceptance theory as well as for designers of home-based visual monitoring systems.
  • Item
    Explaining dual-task implicit learning deficits: the effect of within stimulus presentation
    (Georgia Institute of Technology, 2006-04-04) Nichols, Timothy A.
    Under between-stimulus dual-task conditions, in which each task has its own stimuli, implicit sequence learning typically suffers; within-stimulus conditions, in which the stimuli for both tasks are the same, appear to be an exception. This finding is inconclusive, however, given that it has not been replicated and the study in which it was obtained was methodologically flawed. The finding also seemed to contradict the psychological refractory period finding that simultaneous presentation of the two task stimuli will result in performance decrements. Two experiments were conducted to test the effect of within-stimulus presentation in a dual-task implicit learning task. In Experiment 1, within-stimulus presentation resulted in improved sequence learning relative to between-stimulus presentation. The second experiment did not show an effect of response selection load under within-stimulus presentation conditions. The findings suggest that implicit learning can occur under attentionally demanding conditions, but that the incidental task structure to be learned should be composed of stimuli that are already attended during primary task processing.
  • Item
    Factors that affect trust and reliance on an automated aid
    (Georgia Institute of Technology, 2006-04-03) Sanchez, Julian
    Previous research efforts aimed at understanding the relationship between automation reliability and reliance on the automation have mainly focused on a single dimension of reliability: the automation's error rate. Efforts to understand the effects of additional dimensions, such as types of errors, have merely provided suggestions about the effects that automation false alarms and misses can have on human behavior. Furthermore, other dimensions of reliability, such as the distribution of errors in time, have been almost completely ignored. A multi-task simulation of an agricultural vehicle was used in this investigation. The simulator was composed of two main tasks: a collision avoidance task and a tracking task. The collision avoidance task was supported by an imperfect automated collision avoidance system, and the tracking task was performed manually. The results of this investigation indicated that there are distinct patterns of reliance that develop as a function of error type, which are dependent on the state of the automation (alarms or non-alarms). The different distributions of errors across time had an effect on estimates of reliability and subjective trust ratings. The recency of errors was negatively related to perceived reliability and trust. The results of the current investigation also suggest that older adults are able to adjust their behavior according to the characteristics of the automation, although it takes them longer to do so. Furthermore, it appears that older adults are willing to use automated systems, as long as they are reliable enough to reduce workload.
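Several of the abstracts above treat reliance on an automated aid as a cost-balancing decision; the Ezer (2006) study in particular weighed the cost of a wrong answer against the cost of verification. The optimal reliance strategy that participants were expected to approach (but not reach) can be illustrated with a minimal expected-cost sketch. The function and parameter names below are hypothetical illustrations, not taken from any of the studies, and the sketch assumes that verification always yields the correct answer:

```python
def should_rely(p_aid_correct, cost_wrong, cost_verify):
    """Rely on the aid when its expected penalty is below the cost of verifying.

    Assumes verifying costs exactly `cost_verify` and guarantees a correct
    answer, while relying risks `cost_wrong` with probability
    (1 - p_aid_correct).
    """
    expected_penalty = (1 - p_aid_correct) * cost_wrong
    return expected_penalty < cost_verify

# A highly reliable aid is worth relying on when verification is costly:
# should_rely(0.9, cost_wrong=50, cost_verify=10) -> True
# An unreliable aid should be verified even at the same verification cost:
# should_rely(0.5, cost_wrong=50, cost_verify=10) -> False
```

Against this benchmark, the finding that both age groups underutilized the aid while overestimating its reliability describes behavior that diverges from the expected-cost optimum in both directions at once.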