Organizational Unit: School of Psychology

Publication Search Results

  • Item
    Emotion and motion: age-related differences in recognizing virtual agent facial expressions
    (Georgia Institute of Technology, 2011-10-05) Smarr, Cory-Ann
    Technological advances will allow virtual agents to increasingly help individuals with daily activities. As such, virtual agents will interact with users of various ages and experience levels. Facial expressions are often used to facilitate social interaction between agents and humans. However, older and younger adults do not label human or virtual agent facial expressions in the same way, with older adults commonly mislabeling certain expressions. The dynamic formation of a facial expression, or motion, may provide additional facial information, potentially making emotions less ambiguous. This study examined how motion affects younger and older adults' recognition of various intensities of emotion displayed by a virtual agent. Contrary to the dynamic advantage found in emotion recognition for human faces, older adults recognized emotions more accurately for static virtual agent faces than for dynamic ones. Motion condition did not influence younger adults' emotion recognition. Younger adults recognized the emotions of anger, disgust, fear, happiness, and sadness more accurately than older adults. Low-intensity expressions were recognized less accurately than medium- and high-intensity expressions.
  • Item
    Recognizing facial expression of virtual agents, synthetic faces, and human faces: the effects of age and character type on emotion recognition
    (Georgia Institute of Technology, 2010-04-08) Beer, Jenay Michelle
    An agent's facial expression may communicate emotive state to users both young and old. The ability to recognize emotions has been shown to differ with age, with older adults more commonly misidentifying the facial emotions of anger, fear, and sadness. This research study examined whether emotion recognition of facial expressions differed between different types of on-screen agents and between age groups. Three on-screen characters were compared: a human, a synthetic human, and a virtual agent. In this study, 42 younger (age 18-28) and 42 older (age 65-85) adults completed an emotion recognition task with static pictures of the characters demonstrating four basic emotions (anger, fear, happiness, and sadness) and neutral. The human face resulted in the highest proportion match, followed by the synthetic human, with the virtual agent showing the lowest proportion match. Both the human and synthetic human faces showed age-related differences for the emotions anger, fear, sadness, and neutral, with younger adults showing higher proportion match. The virtual agent showed age-related differences for the emotions anger, fear, happiness, and neutral, again with younger adults showing higher proportion match. The data analysis and interpretation of the present study differed from previous work by utilizing two approaches to understanding emotion recognition. First, the misattributions participants made when identifying emotions were investigated. Second, a similarity index of the feature placement between any two virtual agent emotions was calculated, suggesting that emotions were commonly misattributed as other emotions similar in appearance. Overall, these results suggest that age-related differences extend beyond human faces to other types of on-screen characters, and that differences between older and younger adults in emotion recognition may be further explained by perceptual discrimination between two emotions of similar feature appearance.
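
Note on the similarity index mentioned in the second abstract: the abstract does not describe how the index was computed, so the Python sketch below is only a hypothetical illustration of the general idea. It assumes each virtual agent emotion is summarized by a few 2-D facial feature placements and scores pairwise similarity as one minus a normalized mean Euclidean distance between matched features; the feature names, coordinate values, and normalizing scale are all invented for demonstration and are not taken from the thesis.

# Hypothetical illustration only: assumes each emotion is described by a small
# set of 2-D facial feature placements (e.g., brow, eyelid, mouth-corner
# positions) and scores similarity as 1 minus the normalized mean Euclidean
# distance between matched features. All values below are made up.

from math import dist

# Assumed feature placements (x, y) in arbitrary screen units.
EMOTIONS = {
    "anger":   {"brow": (40, 60), "eyelid": (42, 52), "mouth_corner": (45, 20)},
    "fear":    {"brow": (40, 68), "eyelid": (42, 56), "mouth_corner": (44, 22)},
    "sadness": {"brow": (40, 63), "eyelid": (42, 50), "mouth_corner": (43, 18)},
}

def similarity_index(a: dict, b: dict, scale: float = 20.0) -> float:
    """Return a 0-1 similarity score between two feature-placement dicts.

    `scale` is an assumed normalizing distance: feature pairs separated by
    `scale` units or more contribute zero similarity.
    """
    scores = []
    for feature, pos_a in a.items():
        pos_b = b[feature]
        d = dist(pos_a, pos_b)
        scores.append(max(0.0, 1.0 - d / scale))
    return sum(scores) / len(scores)

# Pairwise comparison of every emotion pair, mirroring the idea that emotions
# closer in feature placement are more likely to be confused with one another.
if __name__ == "__main__":
    names = list(EMOTIONS)
    for i, e1 in enumerate(names):
        for e2 in names[i + 1:]:
            s = similarity_index(EMOTIONS[e1], EMOTIONS[e2])
            print(f"{e1} vs {e2}: similarity = {s:.2f}")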