This project allows you to control a TurtleBot in Gazebo using hand gestures, captured via your webcam and interpreted using MediaPipe in a Jupyter Notebook. This repository is only responsible for detecting gestures and publishing direction commands, which are then received by a ROS 2 node that moves the TurtleBot accordingly.
- MediaPipe is used to detect and classify hand gestures in real-time using your webcam.
- Based on specific hand gestures (like open palm, fist, etc.), the notebook publishes direction commands such as `forward`, `backward`, `left`, `right`, and `stop`.
- These commands are published to a ROS 2 topic (e.g., `/gesture_direction`).
- A separate ROS 2 Python script subscribes to that topic and moves the TurtleBot in Gazebo accordingly.
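The subscriber side described above might look like the following minimal sketch, assuming the notebook publishes `std_msgs/String` messages on `/gesture_direction` and the TurtleBot listens on the standard `/cmd_vel` topic (the velocity values are illustrative):

```python
# Illustrative ROS 2 subscriber sketch; assumes std_msgs/String commands
# on /gesture_direction and a TurtleBot driven via /cmd_vel.
import rclpy
from rclpy.node import Node
from std_msgs.msg import String
from geometry_msgs.msg import Twist

class GestureDriver(Node):
    def __init__(self):
        super().__init__("gesture_driver")
        self.sub = self.create_subscription(
            String, "/gesture_direction", self.on_command, 10)
        self.pub = self.create_publisher(Twist, "/cmd_vel", 10)

    def on_command(self, msg):
        twist = Twist()
        if msg.data == "forward":
            twist.linear.x = 0.2       # m/s, example value
        elif msg.data == "backward":
            twist.linear.x = -0.2
        elif msg.data == "left":
            twist.angular.z = 0.5      # rad/s, example value
        elif msg.data == "right":
            twist.angular.z = -0.5
        # "stop" (or any unknown command) leaves the Twist at zero
        self.pub.publish(twist)

def main():
    rclpy.init()
    node = GestureDriver()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()

if __name__ == "__main__":
    main()
```

Because an unhandled command publishes a zero-velocity `Twist`, the robot stops rather than drifting when the gesture stream is ambiguous.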
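The mapping from gestures to direction commands could be sketched as a plain lookup, which keeps the notebook logic easy to test. The gesture labels below are illustrative assumptions, not the exact names this repository uses:

```python
# Assumed mapping from recognized gesture labels to direction commands.
# The gesture names here are hypothetical examples.
GESTURE_TO_COMMAND = {
    "open_palm": "forward",
    "fist": "stop",
    "thumb_left": "left",
    "thumb_right": "right",
    "two_fingers": "backward",
}

def gesture_to_command(gesture):
    """Translate a detected gesture label into a direction command.

    Unrecognized gestures default to "stop" as a safety fallback,
    so the robot never keeps moving on a misclassification.
    """
    return GESTURE_TO_COMMAND.get(gesture, "stop")
```

Defaulting to `stop` is a deliberate safety choice: any gesture the classifier cannot confidently label halts the robot rather than continuing the last motion.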