At the Intersection of Head and Eye Control in Patients with Blindness and Low Vision
- John-Ross (JR) Rizzo, Health System Director, Disability Inclusion, Rusk Rehabilitation, NYU Langone Health
- Todd E. Hudson
- William Seiple
- Mahya Beheshti
Understanding the role of eye movements and their coupling with other body movements when acquiring navigation-relevant visual information is critical, especially for individuals with navigation challenges. This project will deploy a head-mounted eye-tracker to examine eye-head synergies in sighted individuals and people with blindness/low vision, assessing the presence and nature of any differences in the saccadic main sequence and the extended eye-head saccadic main sequence. We predict that the degree of decoupling will correlate with the degree and type of visual impairment (central/peripheral). Results will advance our knowledge of integrated motor control for the visually impaired in urban navigation.
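To make the "saccadic main sequence" concrete: it is the stereotyped relationship between a saccade's amplitude and its peak velocity, often modeled with a soft-saturation curve. The sketch below fits one common parameterization, V_peak = V_max · (1 − exp(−A/C)), to synthetic data; the model choice, parameter values, and data are illustrative assumptions, not this project's actual pipeline.

```python
# Hypothetical main-sequence fit on synthetic saccade data.
import numpy as np
from scipy.optimize import curve_fit

def main_sequence(amplitude_deg, v_max, c):
    """Peak saccadic velocity (deg/s) as a function of amplitude (deg)."""
    return v_max * (1.0 - np.exp(-amplitude_deg / c))

# Synthetic amplitudes (deg) and noisy peak velocities (deg/s).
rng = np.random.default_rng(0)
amps = rng.uniform(1, 30, 200)
v_peaks = main_sequence(amps, 500.0, 12.0) + rng.normal(0, 20, amps.size)

# Fit the two free parameters; deviations from the normal-sighted curve
# are one way group differences could be quantified.
(v_max_hat, c_hat), _ = curve_fit(main_sequence, amps, v_peaks, p0=(400.0, 10.0))
print(f"fitted V_max = {v_max_hat:.0f} deg/s, C = {c_hat:.1f} deg")
```

An "extended" eye-head main sequence would apply the same kind of fit to combined gaze (eye + head) displacement rather than eye-in-head amplitude alone.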
Category: Urban Health
Project Description & Overview
Advanced eye-tracking technology has radically changed the way we can assess behavior during navigation, and particularly the role of eye movements and their coupling with other body movements when acquiring navigation-relevant visual information. Understanding these interrelationships is especially important for individuals with navigation challenges, because our ability to maintain healthy levels of physical activity depends critically on our comfort and proficiency in navigating through our home and larger urban environments.
To understand the navigation challenges of people with blindness and low vision (pBLV), we have collected eye and head movement data during locomotion in normally sighted and BLV individuals. We hypothesize that eye and head movements will be decoupled during navigation in pBLV, as compared to healthy controls, and predict that the coupling dynamics of eye and head movements correlate with the degree and type of vision loss. We will analyze the frequency and kinematics of eye and head movements recorded with a new head-mounted eye-tracking system. Signal processing techniques that we developed previously will be used to analyze the acquired data and test our hypotheses. Finally, statistical analyses will be performed to offer reliable and scalable conclusions.
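A typical first signal-processing step on such recordings is segmenting saccades from fixations. The sketch below uses a simple velocity-threshold detector on a synthetic one-dimensional gaze trace; the sampling rate, threshold, and trace are illustrative assumptions, not the lab's actual methods.

```python
# Hypothetical velocity-threshold saccade detector (I-VT style).
import numpy as np

def detect_saccades(gaze_deg, fs=250.0, v_thresh=30.0):
    """Return (start, end) sample indices where gaze speed exceeds v_thresh (deg/s)."""
    velocity = np.abs(np.gradient(gaze_deg) * fs)  # deg/s
    fast = velocity > v_thresh
    edges = np.diff(fast.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    return list(zip(starts, ends))

# Synthetic trace: fixation at 0 deg, a 10-deg saccade, fixation at 10 deg.
t = np.arange(0, 1.0, 1 / 250.0)
gaze = np.zeros_like(t)
gaze[125:] = 10.0
gaze[120:125] = np.linspace(0, 10, 5)  # ~20 ms movement
print(detect_saccades(gaze))
```

Real pipelines typically add noise filtering, minimum-duration criteria, and adaptive thresholds, but the segmentation logic is the same.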
Existing data from our head-mounted eye-tracking system will be provided.
- Signal processing and data visualization (Python or Matlab)
- Programming (C++, Python, or Matlab)
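As one example of the kind of signal processing involved, eye-head coupling can be summarized by the lag at which the cross-correlation between eye and head velocity traces peaks. The sketch below estimates that lag on synthetic signals; the function name, signals, and approach are illustrative assumptions, not the project's established analysis.

```python
# Hypothetical eye-head coupling estimate via cross-correlation lag.
import numpy as np

def coupling_lag(eye_vel, head_vel, fs):
    """Lag (s) of head relative to eye at the cross-correlation peak."""
    eye = eye_vel - eye_vel.mean()
    head = head_vel - head_vel.mean()
    xcorr = np.correlate(head, eye, mode="full")
    lag_samples = np.argmax(xcorr) - (len(eye) - 1)
    return lag_samples / fs

fs = 250.0
t = np.arange(0, 2.0, 1 / fs)
eye = np.sin(2 * np.pi * 2 * t)  # synthetic eye-velocity trace
head = np.roll(eye, 10)          # head lags eye by 10 samples (40 ms)
print(f"estimated lag: {coupling_lag(eye, head, fs) * 1000:.0f} ms")
```

Under the decoupling hypothesis above, one would expect this lag (or the peak correlation strength) to differ systematically between pBLV and control groups.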
Learning Outcomes & Deliverables
1. Students will learn data collection techniques and pre-processing methods.
2. Students will be trained on experimental design and hypothesis testing.
3. Students will learn data modeling and analysis techniques.