Title:
Spatiotemporal patterns of parietofrontal activity and eye movements underlying the visual perception of complex human tool use

dc.contributor.author Natraj, Nikhilesh
dc.contributor.corporatename School of Applied Physiology
dc.contributor.corporatename School of Biological Sciences
dc.contributor.corporatename College of Sciences
dc.contributor.department Applied Physiology
dc.date.accessioned 2017-01-11T14:00:19Z
dc.date.available 2017-01-11T14:00:19Z
dc.date.created 2015-12
dc.date.issued 2015-11-16
dc.date.submitted December 2015
dc.date.updated 2017-01-11T14:00:19Z
dc.description.abstract When watching a child learning to use a spoon, a mother can immediately recognize the error when the child grabs the bowl rather than the stem, or uses the spoon to try to scoop paper. Recognizing proper tool grasp-postures and use-contexts is an ability vital to daily life and can be lost due to brain injury. A better understanding of how the brain encodes contextual and grasp-specific tool use not only furthers basic neuroscience but also has strong relevance to deficits arising from neural pathologies. However, the majority of research to date has studied the neural response to viewing tools in isolation or to viewing simple tool grasps. These studies have shown the recognition of tools to be a complex visuomotor process: not only was the visual cortex engaged, but also the parietal and frontal regions that underlie actual tool use. The recognition of tools therefore involves automatically recalling their motor information (graspability and manipulability) via activation of parietofrontal motor regions, a property called action affordances. Yet it is still unclear how parietofrontal regions encode the combination of contextual and grasp-specific tool-use scenes. In addition, parietofrontal regions are multifaceted and also underlie visuospatial attention and eye movements, so a relationship might exist between eye movements, attention, and tool-use understanding over parietofrontal regions. The overall goal of this thesis was therefore to understand the spatiotemporal patterns of parietofrontal activity and eye movements underlying the perception of contextual and grasp-specific static tool-use images. Electroencephalography (EEG) was used to measure neural activity, combined with eye tracking to measure fixations and saccades.
Overall, results from this thesis present evidence that the affordances of non-functional grasp-postures perturbed an observer's understanding of the contextual uses of tools, with corresponding unique patterns of parietofrontal activity and eye movements. This effect was most robust when the tool was placed in contexts that afforded a certain degree of tool use. Results also revealed a relationship between attention, eye movements, and action perception over parietofrontal regions. Specifically, saccades perturbed activity over frontal regions during the perception of non-functional grasp postures; in addition, there was greater engagement of the left precuneus in the superior parietal lobe when the observer had to quickly parse the scene information using peripheral vision and rely on short-term memory. In contrast, there was greater engagement of the left middle temporal gyrus when the observer could parse scene information continuously using foveal attention. Results in this thesis shed light on the neural and visual mechanisms underlying the understanding of the affordances of non-functional grasp postures, and on the relation between the two mechanisms. The automatic sensitivity to the intent of non-functional grasp-postures may reflect a lifetime of learning the affordances of grasp-specific action outcomes with tools. Such cognitive motor knowledge may be vital in navigating a human environment almost entirely constructed on advanced tool-use knowledge, and findings from this thesis have many potential applications in the field of neuro-rehabilitation.
dc.description.degree Ph.D.
dc.format.mimetype application/pdf
dc.identifier.uri http://hdl.handle.net/1853/56207
dc.language.iso en_US
dc.publisher Georgia Institute of Technology
dc.subject Electroencephalography
dc.subject Eye tracking
dc.subject Parietal
dc.subject Frontal
dc.subject Perception
dc.subject Vision
dc.subject Pattern recognition
dc.title Spatiotemporal patterns of parietofrontal activity and eye movements underlying the visual perception of complex human tool use
dc.type Text
dc.type.genre Dissertation
dspace.entity.type Publication
local.contributor.corporatename College of Sciences
local.contributor.corporatename School of Biological Sciences
local.relation.ispartofseries Doctor of Philosophy with a Major in Applied Physiology
relation.isOrgUnitOfPublication 85042be6-2d68-4e07-b384-e1f908fae48a
relation.isOrgUnitOfPublication c8b3bd08-9989-40d3-afe3-e0ad8d5c72b5
relation.isSeriesOfPublication ead85f7a-56bd-4216-a2d8-a66530e2e8b9
thesis.degree.level Doctoral
Files
Original bundle (showing 1 - 5 of 10)
Name: NATRAJ-DISSERTATION-2015.pdf
Size: 22.8 MB
Format: Adobe Portable Document Format
Name: Video9_Correct vs Spatial 500msExp.avi
Size: 15.02 MB
Format: Microsoft Audio/Video Interleaved
Name: Video8_Correct vs Incorrect 500msExp.avi
Size: 16.38 MB
Format: Microsoft Audio/Video Interleaved
Name: Video7__Correct vs Spatial 100msExp.avi
Size: 12.11 MB
Format: Microsoft Audio/Video Interleaved
Name: Video6_Correct vs Incorrect 100msExp.avi
Size: 12.16 MB
Format: Microsoft Audio/Video Interleaved
License bundle (showing 1 - 1 of 1)
Name: LICENSE.txt
Size: 3.87 KB
Format: Plain Text