Research Symposium

24th annual Undergraduate Research Symposium, April 3, 2024

Jonathan (Evan) Chisolm, Poster Session 2: 10:45 am - 11:45 am / 228



BIO


Hi, my name is Evan Chisolm, and I'm from Pensacola, Florida. Currently, I am a senior at Florida State University, majoring in Behavioral Neuroscience. My future career goal is to become a pediatric neurosurgeon. Beyond the clinical aspect of neuroscience, I also enjoy the research aspect. The research project I'm working on examines whether measures of object perception, collected with tracking tools, are sensitive enough to detect early cognitive decline and subtle brain changes. In my free time, I enjoy hanging out with friends, listening to music, going on nature walks, watching movies, playing football or soccer, and working out.

Hand and eye movements during object categorization discriminate between younger and older adults

Authors: Jonathan (Evan) Chisolm, Chris Martin
Student Major: Behavioral Neuroscience
Mentor: Chris Martin
Mentor's Department: Department of Psychology
Mentor's College: College of Arts and Sciences
Co-Presenters: Valery Sastoque

Abstract


The ability to flexibly categorize objects is an essential aspect of adaptive behavior. In complex environments with rapidly changing task demands, accurate categorization requires the resolution of feature-based interference. Recent neuroimaging and neuropsychological evidence suggests that perirhinal cortex allows us to group objects based on either their semantic or visual features when faced with cross-modal interference. We build on these findings by asking whether hand and eye movements made in the context of categorization tasks with cross-modal interference discriminate between younger and older adults. We additionally examined whether these behavioral indices track overall cognitive status in older adults. Three objects were presented on each trial: a referent, a target, and a distractor. Targets in the visual categorization task were visually similar to the referent, whereas distractors were semantically similar to the referent. Targets in the semantic categorization task were semantically similar to the referent, whereas distractors were visually similar to the referent. Categorization decisions were made by touching targets in our motion-tracking experiment and with a button press in our eye-tracking experiment. We found that reach trajectory and gaze, which are continuous measures of decision making, reliably discriminated between younger and older adults. In both cases, older adults were influenced by the distractors to a greater degree than were younger adults. Most interestingly, reach and gaze were significant predictors of overall cognitive function in the older adult group. These findings suggest that hand and eye movements may reveal subtle age-related changes in cognitive functions supported by perirhinal cortex.
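
For readers unfamiliar with continuous reach-tracking measures, one common way to quantify distractor "pull" is the maximum signed deviation of the reach path from the straight start-to-target line, signed toward the distractor. The abstract does not specify the authors' exact analysis, so the minimal Python sketch below is illustrative only; the function name, variable names, and example coordinates are assumptions, not the study's pipeline.

    # Illustrative only: quantify distractor interference as the maximum signed
    # deviation of a reach trajectory from the straight start-to-target path,
    # with positive values meaning deviation toward the distractor.
    import numpy as np

    def max_signed_deviation(trajectory, start, target, distractor):
        """trajectory: (n, 2) x/y samples; start, target, distractor: (2,) points."""
        line = target - start
        line_len = np.linalg.norm(line)
        rel = trajectory - start
        # Perpendicular (signed) distance of each sample from the start-target line.
        cross = line[0] * rel[:, 1] - line[1] * rel[:, 0]
        signed_dist = cross / line_len
        # Flip the sign so that positive means "toward the distractor's side".
        d_rel = distractor - start
        distractor_side = np.sign(line[0] * d_rel[1] - line[1] * d_rel[0])
        return (signed_dist * distractor_side).max()

    # Hypothetical example: a reach that bows slightly toward a distractor on the left.
    traj = np.array([[0.0, 0.0], [-0.5, 2.0], [-0.8, 5.0], [0.0, 10.0]])
    pull = max_signed_deviation(traj, np.array([0.0, 0.0]),
                                np.array([0.0, 10.0]), np.array([-5.0, 5.0]))
    print(round(pull, 2))  # larger values = greater distractor interference

Analogous measures (e.g., cumulative area between the reach path and the direct path, or dwell time of gaze on the distractor) follow the same logic of comparing the observed response against an interference-free baseline.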


Keywords: Motion Tracking, Eye-Tracking, Memory, Alzheimer's Disease, Object Categorization