
Hand Gesture Detection using YOLOv8

This repository contains a Hand Gesture Detection project built with Python, OpenCV, and YOLOv8. It supports real-time data recording, annotation, and model development for recognizing specific hand gestures.

*(Example detection result: detection_result_2)*

Features

  • Hand Gesture Detection:
    • Captures hand gesture images from a webcam feed.
    • Images are saved in specified folders for each gesture.
    • Key-activated controls to start and stop image saving.
  • Dataset Preparation:
    • Provides a script to record hand gesture data.
    • Supports annotation of the recorded data using Roboflow.
  • Model Training and Deployment:
    • Includes a Jupyter Notebook for training, evaluating, and deploying the YOLOv8 hand gesture detection model.
  • Real-Time Detection Script:
    • Includes a standalone Python script that detects and displays hand gestures from a live camera feed.

Project Structure

  • data_record.py: Python script to record webcam feed data for hand gesture detection.
  • model_training.ipynb: Jupyter Notebook for training, testing, and deploying the hand gesture detection model.
  • requirements.txt: List of required Python packages.
  • testing.py: Python script for real-time hand gesture detection.

Setup and Requirements

Prerequisites

  • Python 3.x
  • OpenCV
  • YOLOv8
  • Roboflow (for data annotation)

Installation

  1. Clone the repository:

    git clone https://github.com/Pushtogithub23/yolo-hand-gestures-detection.git
    cd yolo-hand-gestures-detection
  2. Install required libraries:

    pip install -r requirements.txt
  3. Train a custom YOLOv8 model on your annotated dataset. The trained weights are typically saved in runs/detect/train/weights/.
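The training step above can be sketched with the `ultralytics` Python API. The dataset YAML path, epoch count, and image size below are illustrative assumptions, not values taken from this repository:

```python
def train_gesture_model(data_yaml="data.yaml", epochs=50, imgsz=640):
    """Train a YOLOv8 detector on the annotated gesture dataset.

    `data_yaml`, `epochs`, and `imgsz` are illustrative defaults,
    not values from this repository.
    """
    from ultralytics import YOLO  # listed in requirements.txt

    model = YOLO("yolov8n.pt")    # start from pretrained nano weights
    model.train(data=data_yaml, epochs=epochs, imgsz=imgsz)
    return model


def best_weights(run_dir="runs/detect/train"):
    """YOLOv8 saves checkpoints under <run_dir>/weights/ by convention."""
    return f"{run_dir}/weights/best.pt"
```

After training completes, `best_weights()` gives the conventional path of the best checkpoint, which the detection script can then load.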

Usage

Hand Gesture Data Recording

The data_record.py script captures images from a webcam feed to create a dataset of hand gestures.

  1. Run the script:

    python data_record.py
  2. Controls:

    • Press 's' to start saving images.
    • Press 'p' to stop and close the program.

    This will save images in the DATA/CallMe directory or any folder specified in capture_hand_images().
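The controls above can be sketched as follows. The window name, image filename pattern, and helper names are assumptions, since the internals of data_record.py are not shown here; only the key bindings ('s' to start saving, 'p' to stop and close) come from the description above:

```python
import os


def handle_key(key, saving):
    """Map a pressed key to recorder state per the controls above.

    Returns (saving, quit_requested): 's' starts saving images,
    'p' stops and closes the program.
    """
    if key == ord("s"):
        return True, False
    if key == ord("p"):
        return saving, True
    return saving, False


def capture_hand_images(folder="DATA/CallMe"):
    """Sketch of the capture loop; requires OpenCV and a webcam."""
    import cv2  # imported here so the key logic above needs no OpenCV

    os.makedirs(folder, exist_ok=True)
    cap = cv2.VideoCapture(0)
    saving, count = False, 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("Recording", frame)
        key = cv2.waitKey(1) & 0xFF
        saving, quit_requested = handle_key(key, saving)
        if quit_requested:
            break
        if saving:
            cv2.imwrite(os.path.join(folder, f"img_{count}.jpg"), frame)
            count += 1
    cap.release()
    cv2.destroyAllWindows()
```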

Data Annotation

After recording the hand gesture data, the images were annotated on Roboflow. Annotation provides the labelled gesture data required to train the YOLOv8 model.

Hand Gesture Detection Notebook

The model_training.ipynb notebook provides:

  • Training and testing steps for the hand gesture detection model.
  • Code to evaluate model performance.
  • Guidance for deploying the trained model.
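A minimal real-time detection loop along the lines of testing.py might look like this. The weights path, confidence threshold, and quit key are assumptions, not details confirmed by the repository:

```python
def label_text(name, conf):
    """Format a detection label, e.g. 'CallMe 0.91'."""
    return f"{name} {conf:.2f}"


def run_live_detection(weights="runs/detect/train/weights/best.pt", conf=0.5):
    """Run the trained detector on a live webcam feed and display results."""
    import cv2
    from ultralytics import YOLO

    model = YOLO(weights)
    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        results = model(frame, conf=conf)      # detect gestures in the frame
        annotated = results[0].plot()          # draw boxes and class labels
        cv2.imshow("Hand Gesture Detection", annotated)
        if cv2.waitKey(1) & 0xFF == ord("q"):  # 'q' to quit (assumed key)
            break
    cap.release()
    cv2.destroyAllWindows()
```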

Project Links

  • The project is available on Roboflow.
  • The training logs are available on Weights & Biases (wandb).
