Hand-Tracking Paint Application

Abstract

The Hand-Tracking Paint Application takes an interactive approach to digital art creation built on real-time computer vision and hand tracking. Users draw on a virtual canvas with natural hand gestures captured live by a webcam, select colors intuitively, and clear the canvas with a dedicated gesture, keeping the experience seamless and engaging. The Mediapipe library provides precise detection and tracking of hand landmarks, so creative input flows directly from hand movement to the canvas.

Key Features

  • Color Selection: Users can choose from a palette of four colors (blue, green, red, and yellow) by positioning their hands in designated areas on the canvas, enhancing the fluidity of the creative process.

  • Canvas Clearing Gesture: Bringing the tip of the index finger close to the thumb resets the canvas, giving users an easy way to start a new drawing (see the sketch after this list).

  • Responsive Hand Tracking: The robust hand-tracking algorithm ensures accurate detection of gestures, contributing to the application's overall responsiveness and user experience.
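
A minimal sketch of how such a clearing gesture could be detected, assuming the Mediapipe landmark positions have already been converted to pixel coordinates; the helper name is_clear_gesture and the 40-pixel threshold are illustrative, not taken from the repository code:

import math

# index_tip and thumb_tip are (x, y) pixel positions of Mediapipe landmarks
# 8 (index fingertip) and 4 (thumb tip). The threshold value is an assumption.
def is_clear_gesture(index_tip, thumb_tip, threshold=40):
    """Return True when the index fingertip and thumb tip are close together."""
    return math.hypot(index_tip[0] - thumb_tip[0],
                      index_tip[1] - thumb_tip[1]) < threshold

When this check fires in the main loop, the stored point deques can be re-initialized and the drawing area repainted white, which is the clearing behavior described above.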

Introduction

Evolution of Digital Art

The Hand-Tracking Paint Application represents a dynamic exploration at the intersection of computer vision and interactive technology. In an era where traditional drawing tools meet advanced computing capabilities, this project introduces an immersive approach to digital art creation.

Technological Foundation

The project relies on two key technological pillars: OpenCV and Mediapipe. OpenCV, an open-source computer vision library, handles real-time video capture and image processing, while Mediapipe provides hand tracking, enabling hand gestures to be interpreted as interactions with the virtual canvas.

User-Centric Design

The application prioritizes the user's creative experience by leveraging the natural dexterity of human hands. Removing barriers imposed by conventional input devices, users can explore the digital canvas with the fluidity of their hand movements, fostering a direct connection between the artist and the artwork.

Inclusive Digital Creativity

This project democratizes digital creativity, making it accessible to individuals without specialized artistic tools or skills. The interactive nature of the application ensures inclusivity, inviting both seasoned artists and novices to explore the limitless possibilities of digital creation.

Requirements

Software

  • Python: Primary programming language.
  • OpenCV: Open-source computer vision library.
  • NumPy: Numerical operations in Python.
  • Mediapipe: Hand tracking library by Google.

Hardware

  • Webcam: Necessary for capturing real-time video feed.
  • Computer: Sufficient processing power for real-time video processing.
  • Operating System: Platform-independent (Windows, Linux, macOS).

Project Execution

Importing Libraries

The necessary libraries include OpenCV, NumPy, and Mediapipe.

import cv2
import numpy as np
import mediapipe as mp
from collections import deque

Color Point Handling

Points for blue, green, red, and yellow are stored in lists of deques, and index counters keep track of the deque currently being appended to for each color.

bpoints = [deque(maxlen=1024)]
gpoints = [deque(maxlen=1024)]
rpoints = [deque(maxlen=1024)]
ypoints = [deque(maxlen=1024)]
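
The index counters mentioned above do not appear in the snippet; a minimal completion, with variable names assumed to mirror the deque names, might look like this:

# One counter per color: the position of the deque currently being appended to.
blue_index = 0
green_index = 0
red_index = 0
yellow_index = 0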

Kernel for Dilation

A 5x5 kernel of ones is defined for morphological dilation.

kernel = np.ones((5,5), np.uint8)
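
The snippet defines the kernel but does not show where it is applied; one plausible use, assuming a binary mask variable named mask, is to thicken detected regions before further processing:

# Hypothetical usage: expand white regions in a binary mask by one dilation pass.
mask = cv2.dilate(mask, kernel, iterations=1)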

Color Information

BGR values for blue, green, red, and yellow are defined, along with an index for color selection.

colors = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (0, 255, 255)]  # BGR: blue, green, red, yellow
colorIndex = 0  # index into colors for the currently selected color

Canvas Setup

The canvas is initialized, and color selection buttons are drawn on it.

paintWindow = np.zeros((471, 636, 3), np.uint8) + 255  # white canvas
# (Rectangles and Text Drawing Code)
cv2.namedWindow('Paint', cv2.WINDOW_AUTOSIZE)
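
The placeholder above stands in for the button-drawing code; a hedged sketch of what it might contain follows, with button positions, sizes, and the 65-pixel toolbar height all assumed for illustration:

# Hypothetical layout: four color buttons along a 65-pixel-high toolbar.
paintWindow = cv2.rectangle(paintWindow, (40, 1), (140, 65), colors[0], -1)
paintWindow = cv2.rectangle(paintWindow, (160, 1), (255, 65), colors[1], -1)
paintWindow = cv2.rectangle(paintWindow, (275, 1), (370, 65), colors[2], -1)
paintWindow = cv2.rectangle(paintWindow, (390, 1), (485, 65), colors[3], -1)
cv2.putText(paintWindow, "BLUE", (55, 33), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 2, cv2.LINE_AA)
cv2.putText(paintWindow, "GREEN", (175, 33), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 2, cv2.LINE_AA)
cv2.putText(paintWindow, "RED", (300, 33), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 2, cv2.LINE_AA)
cv2.putText(paintWindow, "YELLOW", (400, 33), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 2, cv2.LINE_AA)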

Mediapipe Hand Tracking Initialization

Mediapipe is initialized for hand tracking.

mpHands = mp.solutions.hands
hands = mpHands.Hands(max_num_hands=1, min_detection_confidence=0.7)
mpDraw = mp.solutions.drawing_utils

Webcam Initialization

OpenCV's VideoCapture is used to capture video from the default camera.

cap = cv2.VideoCapture(0)

Main Loop

The program continuously captures frames from the webcam, detects hand landmarks, processes user actions, and draws on the canvas until the user presses 'q'.

ret = True
while ret:
    # (Frame Capture and Processing Code)
    # (Landmark Detection and Processing Code)
    # (User Action Processing Code)
    # (Drawing Code)
    if cv2.waitKey(1) == ord('q'):
        break
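
A hedged expansion of the placeholder steps follows, continuing the earlier snippets (it uses the assumed index counters and the is_clear_gesture helper from the Key Features sketch). Landmark indices 8 and 4 are Mediapipe's index fingertip and thumb tip; the toolbar regions and line thickness are illustrative choices, not the repository's exact values:

ret = True
while ret:
    # Frame capture and processing: read a frame, mirror it, and convert to RGB.
    ret, frame = cap.read()
    if not ret:
        break
    frame = cv2.flip(frame, 1)
    framergb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)

    # Landmark detection: run the Mediapipe hands model on the RGB frame.
    result = hands.process(framergb)
    if result.multi_hand_landmarks:
        landmarks = []
        for handslms in result.multi_hand_landmarks:
            for lm in handslms.landmark:
                # Scale normalized landmark coordinates to pixel positions.
                landmarks.append((int(lm.x * frame.shape[1]), int(lm.y * frame.shape[0])))
            mpDraw.draw_landmarks(frame, handslms, mpHands.HAND_CONNECTIONS)

        fore_finger = landmarks[8]  # index fingertip (Mediapipe landmark 8)
        thumb = landmarks[4]        # thumb tip (Mediapipe landmark 4)

        # User action processing: clearing gesture, toolbar selection, or drawing.
        if is_clear_gesture(fore_finger, thumb):
            bpoints = [deque(maxlen=1024)]
            gpoints = [deque(maxlen=1024)]
            rpoints = [deque(maxlen=1024)]
            ypoints = [deque(maxlen=1024)]
            blue_index = green_index = red_index = yellow_index = 0
            paintWindow[67:, :, :] = 255  # keep the toolbar, clear the drawing area
        elif fore_finger[1] <= 65:
            # Fingertip inside the toolbar: choose a color by x position.
            if 40 <= fore_finger[0] <= 140:
                colorIndex = 0
            elif 160 <= fore_finger[0] <= 255:
                colorIndex = 1
            elif 275 <= fore_finger[0] <= 370:
                colorIndex = 2
            elif 390 <= fore_finger[0] <= 485:
                colorIndex = 3
        else:
            # Drawing: append the fingertip to the deque of the selected color.
            if colorIndex == 0:
                bpoints[blue_index].appendleft(fore_finger)
            elif colorIndex == 1:
                gpoints[green_index].appendleft(fore_finger)
            elif colorIndex == 2:
                rpoints[red_index].appendleft(fore_finger)
            elif colorIndex == 3:
                ypoints[yellow_index].appendleft(fore_finger)

    # Drawing: connect consecutive points on the camera frame and the canvas.
    points = [bpoints, gpoints, rpoints, ypoints]
    for i, color_points in enumerate(points):
        for stroke in color_points:
            for k in range(1, len(stroke)):
                cv2.line(frame, stroke[k - 1], stroke[k], colors[i], 2)
                cv2.line(paintWindow, stroke[k - 1], stroke[k], colors[i], 2)

    cv2.imshow("Tracking", frame)
    cv2.imshow("Paint", paintWindow)
    if cv2.waitKey(1) == ord('q'):
        break

A fuller implementation would also start a new deque whenever the fingertip leaves the drawing area, so separate strokes are not joined by a straight line; that bookkeeping is omitted here to keep the sketch short.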

Cleanup

The webcam is released, and OpenCV windows are destroyed upon program exit.

cap.release()
cv2.destroyAllWindows()

Future Development

Potential future improvements include variable brush sizes, more accurate and robust hand tracking, richer on-screen user feedback, and support for multi-user interaction.


