Overview
Hand Sync uses OpenCV, MediaPipe, and PyAutoGUI to turn hand movements into mouse cursor control. The system captures live video from a webcam, extracts hand landmarks, and maps specific hand gestures to mouse actions. With real-time processing and on-screen visual feedback, Hand Sync offers an efficient and engaging way to interact with your computer.
Features
Hand Detection and Tracking
Uses MediaPipe to detect and track hand landmarks in real time with high accuracy.
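As a minimal sketch (not the project's exact code), querying the MediaPipe Hands solution for landmarks on a single webcam frame looks roughly like this; the confidence settings are assumed values:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
# Assumed configuration: track a single hand with moderate confidence thresholds.
hands = mp_hands.Hands(max_num_hands=1,
                       min_detection_confidence=0.7,
                       min_tracking_confidence=0.7)

cap = cv2.VideoCapture(0)
success, frame = cap.read()
if success:
    # MediaPipe expects RGB input; OpenCV captures frames in BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        # Each detected hand carries 21 landmarks with x/y normalized to [0, 1].
        print(len(results.multi_hand_landmarks[0].landmark))
cap.release()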
Mouse Movement Control
Maps the position of the index fingertip to the on-screen mouse cursor, so the cursor follows your finger across the screen.
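A sketch of that mapping, assuming MediaPipe's normalized landmark coordinates and PyAutoGUI's screen-size query (the helper name move_cursor is illustrative):

import pyautogui

screen_w, screen_h = pyautogui.size()

def move_cursor(hand_landmarks, mp_hands):
    # Landmark 8 is the index fingertip; x/y are normalized to the camera frame.
    tip = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    # Scale the normalized position up to screen pixels and move the cursor there.
    pyautogui.moveTo(tip.x * screen_w, tip.y * screen_h)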
Mouse Dragging
Initiates a drag operation when the thumb and index finger are brought close together, mimicking a natural pinch gesture.
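One way this can be implemented, shown here as a sketch rather than the project's tuned logic (the 0.05 pinch threshold and the update_drag helper are assumptions):

import math
import pyautogui

PINCH_THRESHOLD = 0.05  # assumed threshold, in normalized landmark units
dragging = False

def update_drag(hand_landmarks, mp_hands):
    global dragging
    thumb = hand_landmarks.landmark[mp_hands.HandLandmark.THUMB_TIP]
    index = hand_landmarks.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
    # Distance between thumb tip and index fingertip in normalized coordinates.
    pinch = math.hypot(thumb.x - index.x, thumb.y - index.y)
    if pinch < PINCH_THRESHOLD and not dragging:
        pyautogui.mouseDown()   # pinch closed: press and hold to start the drag
        dragging = True
    elif pinch >= PINCH_THRESHOLD and dragging:
        pyautogui.mouseUp()     # pinch opened: release to end the drag
        dragging = False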
Real-time Processing
Processes the video stream in real time, keeping latency low for smooth and responsive interaction.
Visual Feedback
Displays the live video feed with hand landmarks and gesture interactions highlighted, which helps both with understanding the system and with debugging.
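MediaPipe ships drawing utilities for exactly this kind of overlay. Here is a sketch of annotating a frame; the project renders through Streamlit, so the actual display call may differ, and the window name is illustrative:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils

def annotate(frame, results):
    # Draw the 21 landmarks and their connections directly onto the BGR frame.
    if results.multi_hand_landmarks:
        for hand_landmarks in results.multi_hand_landmarks:
            mp_drawing.draw_landmarks(frame, hand_landmarks,
                                      mp_hands.HAND_CONNECTIONS)
    cv2.imshow("Hand Sync", frame)
    cv2.waitKey(1)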
Getting Started
Follow these steps to set up and run Hand Sync:
Clone the Repository
Clone the project repository to your local machine.
Install Dependencies
Open a terminal and navigate to the project folder.
Run the following command to install the required packages:
pip install -r requirements.txt
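If you prefer to install the core libraries by hand, the stack described above corresponds roughly to the following packages (the repository's requirements.txt remains the authoritative list):
pip install opencv-python mediapipe pyautogui streamlit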
Run the Application
In the terminal, execute the following command from the root directory of the project:
streamlit run track.py
How It Works
The system captures a live video feed from the webcam.
MediaPipe detects hand landmarks and provides coordinates for various hand points.
OpenCV captures and processes the video frames, while PyAutoGUI translates the tracked hand movements into mouse cursor actions.
Specific gestures, such as pinching with the thumb and index finger, trigger mouse dragging operations.
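Put together, the processing loop looks roughly like the sketch below. It is a simplified illustration, not the contents of track.py: it tracks a single hand, applies no smoothing, uses an assumed pinch threshold, and displays frames with cv2.imshow instead of the Streamlit front end.

import math
import cv2
import mediapipe as mp
import pyautogui

mp_hands = mp.solutions.hands
mp_drawing = mp.solutions.drawing_utils
screen_w, screen_h = pyautogui.size()
PINCH_THRESHOLD = 0.05  # assumed value
dragging = False

cap = cv2.VideoCapture(0)
with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            hand = results.multi_hand_landmarks[0]
            index = hand.landmark[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            thumb = hand.landmark[mp_hands.HandLandmark.THUMB_TIP]
            # Move the cursor to the index fingertip, scaled to screen pixels.
            pyautogui.moveTo(index.x * screen_w, index.y * screen_h)
            # A pinch (thumb close to index fingertip) starts or sustains a drag.
            pinch = math.hypot(thumb.x - index.x, thumb.y - index.y)
            if pinch < PINCH_THRESHOLD and not dragging:
                pyautogui.mouseDown()
                dragging = True
            elif pinch >= PINCH_THRESHOLD and dragging:
                pyautogui.mouseUp()
                dragging = False
            mp_drawing.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("Hand Sync", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
cap.release()
cv2.destroyAllWindows()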
Applications
Accessibility: Enhances computer interaction for individuals with limited mobility.
Gaming: Introduces gesture-based controls for an immersive experience.
Presentations: Enables hands-free navigation through slides or applications.
Future Enhancements
Integration with additional gestures for advanced functionalities.
Customizable gesture mappings for personalized user experiences.
Optimization for low-resource devices to expand accessibility.
Conclusion
Hand Sync demonstrates the potential of combining computer vision and machine learning for intuitive human-computer interaction. With its robust features and real-time capabilities, it paves the way for innovative applications across various domains. Start exploring Hand Sync today and experience the future of gesture-based computing!