
Assistive Technology for Human Computer Interface

Our team is developing an innovative two-handed gesture control suite aimed at enhancing accessibility. The system uses a webcam to interpret hand gestures, providing a seamless, disability-friendly interface: without traditional input devices such as a keyboard or mouse, users gain intuitive, contactless control over computer functions including cursor navigation and air writing. The project spans two main areas of research: eye tracking and multi-frame gestures.
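To make the control loop concrete, the sketch below maps a webcam-tracked index fingertip to the OS cursor. The library choices (MediaPipe Hands, OpenCV, pyautogui) and the fingertip-to-cursor mapping are assumptions for illustration, not necessarily the components the project uses.

```python
# Sketch: webcam hand tracking driving the OS cursor. MediaPipe Hands and
# pyautogui are assumed here for illustration; the project's actual pipeline
# may use different components.
import cv2
import mediapipe as mp
import pyautogui

pyautogui.FAILSAFE = False           # avoid the corner fail-safe while tracking
SCREEN_W, SCREEN_H = pyautogui.size()
hands = mp.solutions.hands.Hands(max_num_hands=2, min_detection_confidence=0.7)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    frame = cv2.flip(frame, 1)       # mirror so cursor motion matches hand motion
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Landmark 8 is the index fingertip; coordinates are normalized to [0, 1].
        tip = results.multi_hand_landmarks[0].landmark[8]
        pyautogui.moveTo(tip.x * SCREEN_W, tip.y * SCREEN_H)
    cv2.imshow("hand control preview", frame)
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```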

Eye Tracking

Our research is focused on integrating eye-tracking methods using advanced landmarks and deep neural networks (DNNs) for improved accuracy. This approach aims to overcome challenges like varying eye shapes and distances from the screen, paving the way for more precise cursor control through eye movements.
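As a minimal sketch of the landmark-based idea, the example below reads iris landmarks from MediaPipe Face Mesh and maps their averaged position to the cursor with simple smoothing. The landmark indices and the naive linear mapping are illustrative assumptions standing in for the calibrated DNN approach described above.

```python
# Sketch: iris landmarks from MediaPipe Face Mesh (refine_landmarks=True adds
# iris points; 468 and 473 are the iris centres) mapped to the cursor with
# exponential smoothing. The linear mapping is an illustrative stand-in for a
# calibrated, DNN-based gaze model.
import cv2
import mediapipe as mp
import pyautogui

pyautogui.FAILSAFE = False
SCREEN_W, SCREEN_H = pyautogui.size()
face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True, max_num_faces=1)
smooth_x, smooth_y, ALPHA = 0.5, 0.5, 0.2    # smoothing state and gain

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        gx = (lm[468].x + lm[473].x) / 2     # average the two iris centres
        gy = (lm[468].y + lm[473].y) / 2
        smooth_x = (1 - ALPHA) * smooth_x + ALPHA * gx   # damp jitter from blinks
        smooth_y = (1 - ALPHA) * smooth_y + ALPHA * gy   # and small head motion
        pyautogui.moveTo(smooth_x * SCREEN_W, smooth_y * SCREEN_H)
    cv2.imshow("gaze preview", frame)
    if cv2.waitKey(1) & 0xFF == 27:          # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```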

Multi-Frame Gestures

This project delves into multi-frame gesture technology. Using drag-based gestures as a foundational element, we aim to emulate complex functionality, such as 3D-touch-style interactions, on traditional desktop interfaces. This approach marks a significant step in our ongoing effort to refine and expand human-computer interaction by making gestures simpler and more intuitive.
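The sketch below shows one way a drag-based multi-frame gesture can be recognized: a pinch that persists across several consecutive frames starts a drag, and releasing it ends the drag. The pinch heuristic, thresholds, and event names are illustrative assumptions rather than the project's actual gesture set.

```python
# Sketch of a multi-frame drag gesture: a pinch (thumb tip close to index tip)
# must persist across several consecutive frames before a drag starts, which
# filters out momentary false positives. Thresholds and event names are
# illustrative assumptions.
import math

class DragGesture:
    PINCH_DIST = 0.05    # normalized fingertip distance that counts as a pinch
    HOLD_FRAMES = 5      # frames the pinch must persist before dragging starts

    def __init__(self):
        self.pinch_frames = 0
        self.dragging = False

    def update(self, thumb_tip, index_tip):
        """Feed one frame of normalized (x, y) fingertip coordinates.

        Returns 'drag_start', 'drag_move', 'drag_end', or None.
        """
        pinched = math.dist(thumb_tip, index_tip) < self.PINCH_DIST
        if pinched:
            self.pinch_frames += 1
            if not self.dragging and self.pinch_frames >= self.HOLD_FRAMES:
                self.dragging = True
                return "drag_start"
            if self.dragging:
                return "drag_move"
        else:
            self.pinch_frames = 0
            if self.dragging:
                self.dragging = False
                return "drag_end"
        return None

# Example with synthetic frames; a real caller would pass MediaPipe hand
# landmarks 4 (thumb tip) and 8 (index tip) on every camera frame.
gesture = DragGesture()
for frame in range(8):
    print(frame, gesture.update((0.50, 0.50), (0.52, 0.50)))   # pinched
print("release:", gesture.update((0.50, 0.50), (0.70, 0.50)))  # fingers apart
```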
