Abstract
The aim of this project is to develop an intelligent, hands-free wheelchair control system driven by eye gaze direction. The system uses computer vision and machine learning to detect eye movements (looking left, right, up, or down, or holding a steady gaze) and map them to directional commands. Using a webcam and facial landmark detection (via Dlib and Haar cascades), it isolates the eye regions, extracts Histogram of Oriented Gradients (HOG) features, and classifies them in real time with a K-Nearest Neighbors (KNN) algorithm.
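As a rough illustration of how such a pipeline might fit together, the sketch below combines dlib's 68-point landmark model with scikit-image for HOG and scikit-learn for KNN. The crop padding, eye window size, and function names are assumptions made for illustration, and dlib's frontal face detector stands in for the Haar-cascade face step mentioned above; the project's actual parameters may differ.

```python
import cv2
import dlib
from skimage.feature import hog
from sklearn.neighbors import KNeighborsClassifier

# Assumed model path and label set for illustration only.
PREDICTOR_PATH = "shape_predictor_68_face_landmarks.dat"  # standard dlib model
LABELS = ["left", "right", "up", "down", "steady"]

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor(PREDICTOR_PATH)

def extract_eye_hog(gray, landmarks, points=range(36, 42), size=(64, 32)):
    """Crop one eye (points 36-41 = left eye in the 68-point model)
    from the landmarks and return its HOG descriptor."""
    xs = [landmarks.part(i).x for i in points]
    ys = [landmarks.part(i).y for i in points]
    pad = 5  # illustrative margin around the eye
    eye = gray[max(0, min(ys) - pad):max(ys) + pad,
               max(0, min(xs) - pad):max(xs) + pad]
    eye = cv2.resize(eye, size)
    return hog(eye, orientations=9, pixels_per_cell=(8, 8),
               cells_per_block=(2, 2))

def classify_frame(frame, knn):
    """Return a gaze-direction label for the first detected face, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray)
    if not faces:
        return None
    landmarks = predictor(gray, faces[0])
    features = extract_eye_hog(gray, landmarks)
    return knn.predict([features])[0]

# Training: X is a list of HOG vectors, y the matching direction labels.
# knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)
```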
The interface is built with PyQt5, offering an intuitive GUI for capturing datasets, training the model, and monitoring detection. The system also includes optional serial communication to send movement commands to external hardware, such as a microcontroller-driven wheelchair. The goal is to enhance mobility for people with physical disabilities by providing a low-cost, non-invasive alternative to traditional control methods.
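The serial link could look like the following sketch using pySerial. The port name, baud rate, and one-byte command mapping are assumptions for illustration, not the project's documented protocol; the microcontroller firmware would need to interpret whatever bytes are agreed upon.

```python
import serial

# Hypothetical one-byte command protocol; adjust to match the firmware.
COMMANDS = {"left": b"L", "right": b"R", "up": b"F", "down": b"B", "steady": b"S"}

def send_command(port, direction):
    """Send a single-byte movement command to the wheelchair controller."""
    cmd = COMMANDS.get(direction)
    if cmd:
        port.write(cmd)

# Usage (port name and baud rate are placeholders):
# port = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)
# send_command(port, classify_frame(frame, knn))
```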