
'Eye' on Health: AI Detects Dizziness and Balance Disorders Remotely

FAU researchers and collaborators have developed a cost-effective, AI-powered system for diagnosing nystagmus – a condition causing involuntary eye movements – using smartphone videos and cloud-based analysis.


FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real-time.


Artificial intelligence is playing an increasingly vital role in modern medicine, particularly in interpreting medical images to help clinicians assess disease severity, guide treatment decisions and monitor disease progression. Despite these advancements, most current AI models are based on static datasets, limiting their adaptability and real-time diagnostic potential.

To address this gap, researchers from Florida Atlantic University and collaborators have developed a novel proof-of-concept deep learning model that leverages real-time data to assist in diagnosing nystagmus – a condition characterized by involuntary, rhythmic eye movements often linked to vestibular or neurological disorders.

Gold-standard diagnostic tools such as videonystagmography (VNG) and electronystagmography have long been used to detect nystagmus. However, these methods come with notable drawbacks: high costs (with VNG equipment often exceeding $100,000), bulky setups, and inconvenience for patients during testing. FAU’s AI-driven system offers a cost-effective, patient-friendly alternative that provides quick and reliable screening for balance disorders and abnormal eye movements.

The platform allows patients to record their eye movements using a smartphone, securely upload the video to a cloud-based system, and receive remote diagnostic analysis from vestibular and balance experts – all without leaving their home.

At the heart of this innovation is a deep learning framework that uses real-time facial landmark tracking to analyze eye movements. The AI system automatically maps 468 facial landmarks and evaluates slow-phase velocity – a key metric for identifying nystagmus intensity, duration and direction. It then generates intuitive graphs and reports that can easily be interpreted by audiologists and other clinicians during virtual consultations.
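
The article does not name the tracking library, but the 468-landmark count matches Google’s open-source MediaPipe Face Mesh. The following minimal Python sketch illustrates the general approach described above – tracking eye landmarks frame by frame, then estimating slow-phase velocity – under that assumption; the velocity heuristic shown here is illustrative, not the authors’ published pipeline.

```python
# Minimal sketch: per-frame eye tracking with MediaPipe Face Mesh
# (a 468-landmark mesh) and a crude slow-phase velocity (SPV) estimate.
# Assumptions: MediaPipe as the landmark tracker, and a simple
# percentile filter to discard fast (saccadic) phases.
import cv2
import mediapipe as mp
import numpy as np

LEFT_EYE_OUTER, LEFT_EYE_INNER = 33, 133  # Face Mesh eye-corner indices

face_mesh = mp.solutions.face_mesh.FaceMesh(
    static_image_mode=False, max_num_faces=1)

def horizontal_eye_positions(video_path):
    """Return normalized horizontal eye-center positions and the frame rate."""
    cap = cv2.VideoCapture(video_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    xs = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if result.multi_face_landmarks:
            lm = result.multi_face_landmarks[0].landmark
            xs.append((lm[LEFT_EYE_OUTER].x + lm[LEFT_EYE_INNER].x) / 2)
    cap.release()
    return np.array(xs), fps

def slow_phase_velocity(xs, fps):
    """Crude SPV: median frame-to-frame velocity after discarding the
    fastest quartile of movements (treated here as fast-phase beats)."""
    v = np.diff(xs) * fps  # normalized position units per second
    slow = v[np.abs(v) < np.percentile(np.abs(v), 75)]
    return np.median(slow)
```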

Results of a pilot study involving 20 participants, published in Cureus (part of Springer Nature), demonstrated that the AI system’s assessments closely mirrored those obtained through traditional medical devices. This early success underscores the model’s accuracy and potential for clinical reliability, even in its initial stages.

“Our AI model offers a promising tool that can partially supplement – or, in some cases, replace – conventional diagnostic methods, especially in telehealth environments where access to specialized care is limited,” said Ali Danesh, Ph.D., principal investigator of the study, senior author, a professor in the Department of Communication Sciences and Disorders within FAU’s College of Education and a professor of biomedical science within FAU’s Charles E. Schmidt College of Medicine. “By integrating deep learning, cloud computing and telemedicine, we’re making diagnosis more flexible, affordable and accessible – particularly for low-income rural and remote communities.”  

The team trained their algorithm on more than 15,000 video frames, using a structured 70:20:10 split for training, testing and validation. This rigorous approach ensured the model’s robustness and adaptability across varied patient populations. The AI also employs intelligent filtering to eliminate artifacts such as eye blinks, ensuring accurate and consistent readings.
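
Neither the exact split procedure nor the blink filter is spelled out in the article, so the short Python sketch below is only one plausible reading of those two steps: a shuffled 70:20:10 frame split and an eye-aspect-ratio blink test (the 0.2 threshold is a common heuristic, not a value from the study).

```python
# Illustrative preprocessing sketch: 70:20:10 train/test/validation
# split of extracted frames, plus a simple blink filter. The
# eye-aspect-ratio (EAR) threshold below is an assumed heuristic.
import numpy as np

def split_frames(frames, seed=0):
    """Shuffle, then split 70% training / 20% testing / 10% validation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(frames))
    n_train = int(0.7 * len(frames))
    n_test = int(0.2 * len(frames))
    train = [frames[i] for i in idx[:n_train]]
    test = [frames[i] for i in idx[n_train:n_train + n_test]]
    val = [frames[i] for i in idx[n_train + n_test:]]
    return train, test, val

def is_blink(outer, inner, top, bottom, threshold=0.2):
    """Flag a frame as a blink when the eye aspect ratio (lid opening
    divided by eye width) collapses below the threshold; flagged
    frames would be dropped before velocity analysis."""
    width = np.hypot(inner[0] - outer[0], inner[1] - outer[1])
    height = np.hypot(bottom[0] - top[0], bottom[1] - top[1])
    return height / width < threshold
```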

Beyond diagnostics, the system is designed to streamline clinical workflows. Physicians and audiologists can access AI-generated reports via telehealth platforms, compare them with patients’ electronic health records, and develop personalized treatment plans. Patients, in turn, benefit from reduced travel, lower costs and the convenience of conducting follow-up assessments by simply uploading new videos from home – enabling clinicians to track disorder progression over time.

In parallel, FAU researchers are also experimenting with a wearable headset equipped with deep learning capabilities to detect nystagmus in real-time. Early tests in controlled environments have shown promise, though improvements are still needed to address challenges such as sensor noise and variability among individual users.

“While still in its early stages, our technology holds the potential to transform care for patients with vestibular and neurological disorders,” said Harshal Sanghvi, Ph.D., first author, an FAU electrical engineering and computer science graduate, and a postdoctoral fellow at FAU’s College of Medicine and College of Business. “With its ability to provide non-invasive, real-time analysis, our platform could be deployed widely – in clinics, emergency rooms, audiology centers and even at home.”

Sanghvi worked closely with his mentors and co-authors on this project, including Abhijit S. Pandya, Ph.D., of the FAU Department of Electrical Engineering and Computer Science and the FAU Department of Biomedical Engineering, and B. Sue Graves, Ed.D., of the Department of Exercise Science and Health Promotion in FAU’s Charles E. Schmidt College of Science.

This interdisciplinary initiative includes collaborators from FAU’s College of Business, College of Medicine, College of Engineering and Computer Science, College of Science, and partners from Advanced Research, Marcus Neuroscience Institute – part of Baptist Health – at Boca Raton Regional Hospital, Loma Linda University Medical Center, and Broward Health North. Together, they are working to enhance the model’s accuracy, expand testing across diverse patient populations, and move toward FDA approval for broader clinical adoption.

“As telemedicine becomes an increasingly integral part of health care delivery, AI-powered diagnostic tools like this one are poised to improve early detection, streamline specialist referrals, and reduce the burden on health care providers,” said Danesh. “Ultimately, this innovation promises better outcomes for patients – regardless of where they live.”

Along with Pandya and Graves, study co-authors are Jilene Moxam, Advanced Research LLC; Sandeep K. Reddy, Ph.D., FAU College of Engineering and Computer Science; Gurnoor S. Gill, FAU College of Medicine; Sajeel A. Chowdhary, M.D., Marcus Neuroscience Institute – part of Baptist Health – at Boca Raton Regional Hospital; Kakarla Chalam, M.D., Ph.D., Loma Linda University; and Shailesh Gupta, M.D., Broward Health North.  


A smartphone was used to record eye movements of a research participant in response to optokinetic visual stimuli. Parallel recordings were also obtained using an infrared camera through a videonystagmography (VNG) system. An AI algorithm analyzing the smartphone video successfully replicated the results of the VNG recordings with high accuracy. This method shows promise for remote assessment of abnormal eye movements and could support clinicians in diagnosing patients who are unable to attend in-person appointments.

-FAU-
