This project implements real-time hand gesture recognition using OpenCV, NumPy, and PyAutoGUI. It tracks hand movements through a webcam, detects convexity defects between fingers, and maps gestures to keyboard actions. Specifically, when an open-hand gesture (four or more defects) is detected, the system simulates a spacebar press, making it handy for controlling games or applications without touching the keyboard. Pretty cool, right?
Develop a real-time, gesture-based control system that replaces conventional input methods using hand movements.
Hand shapes and finger positions can be tracked using contours and convexity defects, allowing gesture-based input control.
Video input is captured from a webcam and processed using OpenCV.
- Convert the Region of Interest (ROI) to HSV color space.
- Apply a skin color mask to isolate the hand.
- Perform thresholding and contour detection.
- Compute the convex hull and convexity defects of the hand contour.
- If four or more defects are detected → Simulate a spacebar press using PyAutoGUI.
The system detects hand gestures reliably in well-lit conditions and provides real-time on-screen feedback on recognized gestures.
The project successfully maps hand gestures to keyboard inputs, enabling a hands-free interaction method for various applications.
To run this project, install the required dependencies:
```bash
pip install opencv-python numpy pyautogui
```
- Run the script:
```bash
python hand_gestures.py
```
- Show your hand inside the ROI (displayed on screen).
- When four or more fingers are detected, the system presses the spacebar automatically.
- Press "ESC" to exit.
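One practical detail when wiring the detection result to PyAutoGUI: without a cooldown, an open hand held in the ROI would press spacebar on every frame. A small debounce wrapper like the one below handles that; the class name, the 1-second cooldown, and the injectable `press` callback are illustrative assumptions, not part of the current script.

```python
import time

class SpacebarTrigger:
    """Fire a key press at most once per cooldown window, so an open hand
    held in the ROI does not spam spacebar on every frame.

    The 1-second default cooldown is an illustrative choice."""

    def __init__(self, press=None, cooldown=1.0, clock=time.monotonic):
        if press is None:
            # Default action: simulate a spacebar press with PyAutoGUI.
            import pyautogui
            press = lambda: pyautogui.press("space")
        self.press = press
        self.cooldown = cooldown
        self.clock = clock
        self._last = float("-inf")

    def update(self, defect_count):
        """Call once per frame; returns True if a press was fired."""
        now = self.clock()
        if defect_count >= 4 and now - self._last >= self.cooldown:
            self.press()
            self._last = now
            return True
        return False
```

Injecting `press` and `clock` also makes the trigger easy to unit-test without a display server.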
- Expand gesture controls to map additional actions (e.g., volume control, scrolling).
- Improve robustness in low-light conditions.
- Optimize hand segmentation for different skin tones and backgrounds.
This project is open source and available under the MIT License.
Enjoy! 🤖✋ P.S. If it doesn't work or there's a bug, I'll get back to this project, I promise, but right now it's Feb 20-tariff-25 and life be lifin', so...