This repository contains a Hand Gesture Detection project built with Python, OpenCV, and YOLOv8. It covers real-time data recording, annotation, and model training for recognizing specific hand gestures.
- Hand Gesture Detection:
  - Captures hand gesture images from a webcam feed.
  - Saves images in a specified folder for each gesture.
  - Key-activated controls to start and stop image saving.
- Dataset Preparation:
  - Provides a script to record hand gesture data.
  - Supports annotation of the recorded data using Roboflow.
- Model Training and Deployment:
  - Includes a Jupyter Notebook for training, evaluating, and deploying the YOLOv8 hand gesture detection model.
- Real-time Detection:
  - Includes a separate Python script that detects and displays hand gestures from a live camera feed.
- `data_record.py`: Python script to record webcam feed data for hand gesture detection.
- `model_training.ipynb`: Jupyter Notebook for training, testing, and deploying the hand gesture detection model.
- `requirements.txt`: List of required Python packages.
- `testing.py`: Python script for real-time hand gesture detection.
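The real-time detection loop in `testing.py` might look roughly like the sketch below, assuming the `ultralytics` package and trained weights saved at `runs/detect/train/weights/best.pt` (the exact weights path, window title, and quit key are assumptions, not the repository's confirmed implementation):

```python
def detect_gestures(weights="runs/detect/train/weights/best.pt", cam_index=0):
    """Run YOLOv8 gesture detection on a live camera feed (sketch)."""
    # Imported inside the function so the module loads even without
    # these heavy dependencies installed.
    import cv2
    from ultralytics import YOLO

    model = YOLO(weights)
    cap = cv2.VideoCapture(cam_index)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame, verbose=False)  # run inference on the frame
        annotated = results[0].plot()          # draw boxes and labels
        cv2.imshow("Hand Gesture Detection", annotated)
        if cv2.waitKey(1) & 0xFF == ord('p'):  # assumed quit key
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    detect_gestures()
```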
- Python 3.x
- OpenCV
- YOLOv8
- Roboflow (for data annotation)
- Clone the repository:

  ```shell
  git clone https://github.com/Pushtogithub23/yolo-hand-gestures-detection.git
  cd yolo-hand-gestures-detection
  ```
- Install the required libraries:

  ```shell
  pip install -r requirements.txt
  ```
- Train your custom YOLOv8 model weights. Typically, the weights are saved in `runs/detect/train/weights/`.
The `data_record.py` script captures images from a webcam feed to create a dataset of hand gestures.
- Run the script:

  ```shell
  python data_record.py
  ```
- Controls:
  - Press 's' to start saving images.
  - Press 'p' to stop and close the program.
This will save images in the `DATA/CallMe` directory, or in any folder specified in `capture_hand_images()`.
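The recording behaviour described above can be sketched as follows. The function name `capture_hand_images()` comes from the repository, but its body, parameters, and file-naming scheme here are assumptions:

```python
import os

def capture_hand_images(gesture_dir="DATA/CallMe", max_images=200):
    """Record webcam frames into gesture_dir: 's' starts saving, 'p' quits."""
    import cv2  # imported lazily so the module loads without OpenCV installed

    os.makedirs(gesture_dir, exist_ok=True)
    cap = cv2.VideoCapture(0)
    saving, count = False, 0
    while cap.isOpened() and count < max_images:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Recording", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == ord('s'):    # start saving frames
            saving = True
        elif key == ord('p'):  # stop and close the program
            break
        if saving:
            cv2.imwrite(os.path.join(gesture_dir, f"img_{count:04d}.jpg"), frame)
            count += 1
    cap.release()
    cv2.destroyAllWindows()
```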
After recording the hand gesture data, the images were annotated on Roboflow. Annotation is essential for training the YOLOv8 model, since it provides the labelled gesture data the model learns from.
The `model_training.ipynb` notebook provides:
- Training and testing steps for the hand gesture detection model.
- Code to evaluate model performance.
- Guidance for deploying the trained model.
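The core training and evaluation steps in the notebook can be sketched with the `ultralytics` Python API. The dataset YAML path, base checkpoint, and hyperparameters below are assumptions, not values taken from the notebook:

```python
def train_gesture_model(data_yaml="data.yaml", epochs=50, imgsz=640):
    """Fine-tune a YOLOv8 model on the annotated gesture dataset (sketch)."""
    from ultralytics import YOLO  # imported lazily; installed via requirements.txt

    model = YOLO("yolov8n.pt")  # start from a pretrained nano checkpoint
    model.train(data=data_yaml, epochs=epochs, imgsz=imgsz)
    metrics = model.val()       # evaluate on the validation split
    return model, metrics
```

Deployment then typically uses the saved `best.pt` weights directly, or converts the model to another format via `model.export()`.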