Engineers Bring Sign Language to 'Life' Using AI
For millions of deaf and hard-of-hearing individuals around the world, communication barriers can make everyday interactions challenging. Traditional solutions, like sign language interpreters, are often scarce, expensive and dependent on human availability. In an increasingly digital world, demand is growing for smart assistive technologies that offer real-time, accurate and accessible communication and can bridge this critical gap.
American Sign Language (ASL) is one of the most widely used sign languages, consisting of distinct hand gestures that represent letters, words and phrases. Existing ASL recognition systems often struggle with real-time performance, accuracy and robustness across diverse environments.
A major challenge for ASL recognition systems lies in distinguishing visually similar gestures such as “A” and “T” or “M” and “N,” which often leads to misclassifications. Dataset quality presents further obstacles, including poor image resolution, motion blur, inconsistent lighting, and variations in hand sizes, skin tones and backgrounds. These factors introduce bias and reduce a model’s ability to generalize across different users and environments.
To tackle these challenges, researchers from the College of Engineering and Computer Science at Florida Atlantic University have developed an innovative real-time ASL interpretation system. Combining the object detection power of YOLOv11 with MediaPipe’s precise hand tracking, the system can accurately recognize ASL alphabet letters in real time. Using advanced deep learning and key hand point tracking, it translates ASL gestures into text, enabling users to interactively spell names, locations and more with remarkable accuracy.
At its core, a built-in webcam serves as a contact-free sensor, capturing live visual data that is converted into digital frames for gesture analysis. MediaPipe identifies 21 keypoints on each hand to create a skeletal map, while YOLOv11 uses these points to detect and classify ASL letters with high precision.
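The article does not publish the team’s code, but the pipeline it describes can be sketched with off-the-shelf tools. The illustrative Python snippet below uses the public mediapipe and ultralytics packages; the weights file "asl_letters.pt", the single-hand configuration, and the way the keypoints gate classification are our assumptions, not details from the study, whose actual pipeline is more sophisticated.

```python
import cv2
import mediapipe as mp
from ultralytics import YOLO

model = YOLO("asl_letters.pt")  # hypothetical YOLOv11 weights fine-tuned on ASL letters
hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)

cap = cv2.VideoCapture(0)  # the built-in webcam acts as a contact-free sensor
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB input; OpenCV captures BGR frames.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # 21 normalized (x, y) keypoints form the skeletal map of the hand;
        # here they simply gate classification to frames containing a hand.
        keypoints = [(lm.x, lm.y) for lm in result.multi_hand_landmarks[0].landmark]
        pred = model(frame, verbose=False)[0]  # detect and classify the letter
        if len(pred.boxes) > 0:
            letter = pred.names[int(pred.boxes.cls[0])]
            cv2.putText(frame, letter, (30, 60),
                        cv2.FONT_HERSHEY_SIMPLEX, 2, (0, 255, 0), 3)
    cv2.imshow("ASL", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```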
“What makes this system especially notable is that the entire recognition pipeline – from capturing the gesture to classifying it – operates seamlessly in real time, regardless of varying lighting conditions or backgrounds,” said Bader Alsharif, the first author and a Ph.D. candidate in the FAU Department of Electrical Engineering and Computer Science. “And all of this is achieved using standard, off-the-shelf hardware. This underscores the system’s practical potential as a highly accessible and scalable assistive technology, making it a viable solution for real-world applications.”
Results of the study, published in the journal Sensors, confirm the system’s effectiveness: it achieved 98.2% mean Average Precision (mAP@0.5) with minimal latency. This finding highlights the system’s ability to deliver high precision in real time, making it well suited to applications that require fast and reliable performance, such as live video processing and interactive technologies.
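For context, mAP@0.5 scores a detection as correct only when its predicted bounding box overlaps the ground-truth box with an intersection-over-union (IoU) of at least 0.5, with precision then averaged across the letter classes. A minimal IoU computation, for illustration only:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# At mAP@0.5, a predicted letter box counts as a true positive when
# iou(prediction, ground_truth) >= 0.5 and the class label matches.
```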
With 130,000 images, the ASL Alphabet Hand Gesture Dataset includes a wide variety of hand gestures captured under different conditions to help models generalize better. These conditions cover diverse lighting environments (bright, dim and shadowed), a range of backgrounds (both outdoor and indoor scenes), and various hand angles and orientations to ensure robustness.
Each image is carefully annotated with 21 keypoints, which highlight essential hand structures such as fingertips, knuckles and the wrist. These annotations provide a skeletal map of the hand, allowing models to distinguish between similar gestures with exceptional accuracy.
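To illustrate why such annotations help, simple geometric features derived from the 21 keypoints, such as how far each fingertip sits from the wrist, already begin to separate look-alike letters like “M” and “N,” which differ mainly in thumb placement. The indices below follow MediaPipe’s hand-landmark numbering; the feature itself is our illustrative choice, not a detail from the study.

```python
import math

# MediaPipe hand-landmark indices: 0 = wrist; 4, 8, 12, 16, 20 = the
# thumb, index, middle, ring and pinky fingertips, respectively.
WRIST = 0
FINGERTIPS = (4, 8, 12, 16, 20)

def fingertip_distances(keypoints):
    """keypoints: 21 (x, y) pairs -> distance of each fingertip from the wrist."""
    wx, wy = keypoints[WRIST]
    return [math.hypot(x - wx, y - wy)
            for x, y in (keypoints[t] for t in FINGERTIPS)]
```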
“This project is a great example of how cutting-edge AI can be applied to serve humanity,” said Imad Mahgoub, Ph.D., co-author and Tecore Professor in the FAU Department of Electrical Engineering and Computer Science. “By fusing deep learning with hand landmark detection, our team created a system that not only achieves high accuracy but also remains accessible and practical for everyday use. It’s a strong step toward inclusive communication technologies.”
Approximately 11 million people in the U.S., or 3.6% of the population, are deaf, and about 15% of American adults (37.5 million) experience hearing difficulties.
“The significance of this research lies in its potential to transform communication for the deaf community by providing an AI-driven tool that translates American Sign Language gestures into text, enabling smoother interactions across education, workplaces, health care and social settings,” said Mohammad Ilyas, Ph.D., co-author and a professor in the FAU Department of Electrical Engineering and Computer Science. “By developing a robust and accessible ASL interpretation system, our study contributes to the advancement of assistive technologies to break down barriers for the deaf and hard of hearing population.”
Future work will focus on expanding the system’s capabilities from recognizing individual ASL letters to interpreting full ASL sentences. This would enable more natural and fluid communication, allowing users to convey entire thoughts and phrases seamlessly.
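The article does not say how per-frame letter predictions are assembled into text, but a simple debouncing buffer, sketched below under our own assumptions, shows one way repeated frame-level predictions could become spelled words, a first step toward the sentence-level interpretation the team is pursuing.

```python
from collections import deque

class LetterSpeller:
    """Hypothetical accumulator: emit a letter only after the model has
    predicted it for `hold` consecutive frames, so a held gesture is
    typed once rather than on every video frame."""

    def __init__(self, hold=15):
        self.hold = hold
        self.recent = deque(maxlen=hold)
        self.text = ""

    def update(self, letter):
        self.recent.append(letter)
        if len(self.recent) == self.hold and len(set(self.recent)) == 1:
            self.text += letter
            self.recent.clear()  # require a fresh hold before the next letter
        return self.text
```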
“This research highlights the transformative power of AI-driven assistive technologies in empowering the deaf community,” said Stella Batalama, Ph.D., dean of the College of Engineering and Computer Science. “By bridging the communication gap through real-time ASL recognition, this system plays a key role in fostering a more inclusive society. It allows individuals with hearing impairments to interact more seamlessly with the world around them, whether they are introducing themselves, navigating their environment, or simply engaging in everyday conversations. This technology not only enhances accessibility but also supports greater social integration, helping create a more connected and empathetic community for everyone.”
Study co-authors are Easa Alalwany, Ph.D., a recent Ph.D. graduate of the FAU College of Engineering and Computer Science and an assistant professor at Taibah University in Saudi Arabia; and Ali Ibrahim, Ph.D., a Ph.D. graduate of the FAU College of Engineering and Computer Science.
Caption: Bader Alsharif, the first author, demonstrates how the system spells out names and locations in real time using American Sign Language.
-FAU-