An Instant Sign Language Translation App

HandsTalk

HandsTalk is a mobile app that uses advanced computer vision and generative AI to provide real-time translation from sign language to English. It relies only on a device's built-in camera, with no specialized hardware, making it accessible to a wider audience.

Real-Time AI Translation

HandsTalk employs advanced computer vision to deliver precise real-time translation from sign language to English, and generative AI refines the detected signs into coherent sentences.
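
As a rough illustration of this refinement step, the sketch below passes a buffer of detected sign glosses to a generative model and asks for one fluent English sentence. The gloss example, the OpenAI client, the model name, and the prompt wording are illustrative assumptions; the document does not say which generative AI backend HandsTalk uses.

# Sketch: refining detected sign glosses into a coherent English sentence.
# The OpenAI model name and prompt are placeholders, not HandsTalk's actual backend.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def refine_glosses(glosses: list[str]) -> str:
    """Ask a generative model to rewrite raw sign glosses as one natural sentence."""
    prompt = ("Rewrite this sequence of sign-language glosses as one natural "
              "English sentence: " + " ".join(glosses))
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content.strip()

print(refine_glosses(["ME", "GO", "SCHOOL", "TOMORROW"]))
# e.g. "I am going to school tomorrow."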

Seamless Platform

We provide a seamless platform with a user-friendly interface, helping sign language learners improve their skills and sign language users communicate.

Data-driven Insights

A vast amount of sign language data was gathered from various institutions to train the sign language detection model using deep learning.
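
As a minimal sketch of what that training could look like, the snippet below fits a small recurrent network on fixed-length keypoint sequences, assuming each clip has already been converted to a sequence of pose and hand keypoints with one integer sign label. The file names, sequence shape, and layer sizes are illustrative assumptions, not the project's actual configuration.

# Sketch: training a sign classifier on keypoint sequences with deep learning.
# Assumes X has shape (num_clips, frames_per_clip, features_per_frame) and
# y holds integer sign labels; file names and layer sizes are illustrative.
import numpy as np
import tensorflow as tf

X = np.load("keypoint_sequences.npy")   # hypothetical pre-extracted dataset
y = np.load("sign_labels.npy")
num_signs = int(y.max()) + 1

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(X.shape[1], X.shape[2])),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(num_signs, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=50, validation_split=0.1)
model.save("sign_classifier.keras")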


Motivation

The shortage of sign language translators in Hong Kong is concerning: only one is available for every 3,000 deaf individuals. With 50,000 deaf or hearing-impaired people in Hong Kong, the demand for real-time sign language translation is high. HandsTalk is a mobile app that offers real-time translation from sign language to English using only built-in cameras, making it accessible to a wider audience.


System Architecture

HandsTalk is composed of four main components that work together to facilitate effective communication: 1) Seamless User Interface, 2) Real-time Body Posture Detection, 3) Sign Language Deep Learning, and 4) AI-Driven Sentence Completion.
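
The sketch below shows one plausible way components 2) to 4) could be chained for each short camera clip: MediaPipe Holistic for body posture detection, the Keras classifier from the training sketch above for the deep-learning step, and a stand-in for AI-driven sentence completion. MediaPipe, OpenCV, and the label set are assumptions for illustration; the document names only the components, not the libraries, and the seamless user interface is the surrounding mobile app, so it is not shown.

# Sketch: chaining posture detection, sign classification, and sentence
# completion in one per-clip loop. Libraries and label set are assumptions.
import cv2
import numpy as np
import mediapipe as mp
import tensorflow as tf

holistic = mp.solutions.holistic.Holistic()                        # 2) body posture detection
classifier = tf.keras.models.load_model("sign_classifier.keras")   # 3) sign language deep learning
SIGNS = ["HELLO", "THANK_YOU", "ME", "GO"]                         # hypothetical label set

def keypoints_from_frame(frame_bgr):
    """Flatten pose + hand landmarks into one feature vector (zeros when unseen).
    The layout must match whatever the classifier was trained on."""
    res = holistic.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    def flat(lms, n):
        return (np.array([[p.x, p.y, p.z] for p in lms.landmark]).flatten()
                if lms else np.zeros(n * 3))
    return np.concatenate([flat(res.pose_landmarks, 33),
                           flat(res.left_hand_landmarks, 21),
                           flat(res.right_hand_landmarks, 21)])

def complete_sentence(glosses):
    # 4) AI-driven sentence completion would call a generative model here
    # (see the gloss-refinement sketch above); joining glosses is a stand-in.
    return " ".join(glosses)

cap = cv2.VideoCapture(0)        # built-in camera, no specialized hardware
window, glosses = [], []
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    window.append(keypoints_from_frame(frame))
    if len(window) == 30:        # one fixed-length clip of 30 frames
        probs = classifier.predict(np.expand_dims(window, axis=0))[0]
        glosses.append(SIGNS[int(np.argmax(probs))])
        window.clear()
        print(complete_sentence(glosses))
cap.release()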


Feedback from Sign Language Users:

Friendly All-in-One User Interface:
"The application is very user-friendly and easy to use."

Benefiting a Wider Audience with Built-In Cameras:
"It's amazing that your application doesn't require specialized devices like leap motion or haptic gloves. This makes it more accessible to more people and allows for greater ease of use."


Sign Language Communication App for the Deaf:
"This app has the potential to make communication easier for sign language users."


Learning Platform for Sign Language Learners:
"It is a great tool for those who want to learn sign language or improve their skills.


Immediate and Accurate Translation:
"I was impressed with how quickly and accurately the application detected changes in my pose or distance from the camera. It was able to recognize and translate the correct sign language most of the time, which made communication much easier for me."


Real-Time Adaptive Translation:
"Great job! The application is highly accurate and able to quickly detect changes in pose or distance from the camera. It can successfully recognize and translate the correct sign language most of the time."

Special Thanks:

We would like to express our deepest gratitude to our esteemed project advisor, Prof. Kenneth Wai-Ting LEUNG, for his invaluable guidance and advice throughout the project.


In addition, we would like to extend our heartfelt thanks to sign language users Wan Yongrui and Liao Yong for their invaluable feedback that significantly contributed to enhancing our app.

Team Members

LEE Cheuk Sum
SO Ho Mang Marcus
WONG Ho Leong

Computer Engineering Program, Hong Kong University of Science and Technology

HKUST © 2024. Designed by KWT2
