
Gesture Controller

Thumbnail Image of Gesture Controller

A deep-learning-powered multimedia controller operated by hand gestures.

The application is built with Python, Flask, OpenCV, Mediapipe, and TailwindCSS. Gesture recognition uses a TensorFlow-trained model, with landmark data collected via Mediapipe.

About

The model consists of two dense layers with ReLU activation, followed by a dense output layer with softmax activation. It is trained with the Adam optimizer and sparse categorical cross-entropy loss, and achieves 97% validation accuracy.
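The architecture above could be defined in Keras roughly as follows. This is a minimal sketch: the input size (63 = 21 Mediapipe hand landmarks × 3 coordinates), the hidden-layer widths (128 and 64), and the class count are illustrative assumptions, not the repository's exact values.

```python
import tensorflow as tf

NUM_FEATURES = 63  # 21 landmarks x (x, y, z); assumed input size
NUM_CLASSES = 18   # HaGRID defines 18 gesture classes

# Two ReLU dense layers followed by a softmax output layer.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(NUM_FEATURES,)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

# Adam optimizer and sparse categorical cross-entropy, as described above.
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```

Sparse categorical cross-entropy lets the training labels stay as integer class IDs rather than one-hot vectors.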

Landmark data was collected from the HaGRID (512px) dataset.
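Mediapipe's hand tracker emits 21 (x, y, z) landmarks per hand, which must be flattened into a fixed-length feature vector before being fed to the model. A plausible sketch of that step is below; the wrist-relative normalization is an assumption, and the repository's exact preprocessing may differ.

```python
import numpy as np

def landmarks_to_features(landmarks):
    """Flatten 21 (x, y, z) hand landmarks into a 63-dimensional feature
    vector, translated so the wrist (landmark 0) sits at the origin."""
    pts = np.asarray(landmarks, dtype=np.float32)  # shape (21, 3)
    pts = pts - pts[0]                             # wrist-relative coordinates
    return pts.flatten()                           # shape (63,)
```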

Setup

  1. Clone the repository:

    git clone https://github.com/siddhp1/Gesture-Controller.git
    cd Gesture-Controller/app
  2. Create environment and install dependencies:

    python -m venv venv
    source venv/bin/activate  # Windows: venv\Scripts\activate
    pip install -r requirements.txt

Usage

  1. Run application:

    python -m main
  2. Open GUI:

    Go to http://localhost:5000 in your web browser.

License

This project is licensed under the MIT License.
