This project demonstrates how to enhance YOLO-based pose detection by incorporating hand position detection. The integration yields more robust and precise pose estimation by adding hand-specific keypoints.
- YOLO Pose Detection: Detects human poses efficiently.
- Hand Position Integration: Adds hand detection, improving pose estimation accuracy by incorporating hand-specific keypoints.
- Visualization: Displays detected poses with hand positions and confidence levels overlaid on the input frames.
- Python 3.8 or later
- Required Python libraries:
  - ultralytics
  - opencv-python
  - numpy
  - imutils
 
Install the dependencies using:
`pip install ultralytics opencv-python numpy imutils`
- Trained models:
  - `hand.pt` for hand detection
  - `yolo11n-pose.pt` for pose detection
- Download these models and place them in the project directory.
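
A minimal sketch of loading both models with the standard `ultralytics` API (assuming the weight files sit in the project directory):

```python
from ultralytics import YOLO

# Both weight files are expected in the current working directory.
hand_model = YOLO("hand.pt")          # fine-tuned hand detector
pose_model = YOLO("yolo11n-pose.pt")  # pretrained pose model
```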
 
- Clone the repository: `git clone <repository-url>` then `cd <repository-directory>`
- Ensure the required models (`hand.pt` and `yolo11n-pose.pt`) are in the working directory.
- Connect a webcam or provide a video input stream.
- Run the main script: `python inference.py`
- The program starts capturing frames, detecting poses, and adding hand positions (see the sketch below).
- Key functionalities:
  - Press `q` to quit the application.
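
The full loop is implemented in `inference.py`; the following is only a rough sketch of the capture-and-quit cycle, with illustrative variable names and without the hand-matching and drawing details:

```python
import cv2
from ultralytics import YOLO

hand_model = YOLO("hand.pt")
pose_model = YOLO("yolo11n-pose.pt")

cap = cv2.VideoCapture(0)  # 0 = default webcam; pass a file path for a video stream
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # Run pose and hand detection on the current frame
    pose_results = pose_model(frame, verbose=False)
    hand_results = hand_model(frame, verbose=False)

    # ... match hand detections to wrist keypoints and draw the results here ...
    cv2.imshow("Pose + hands", frame)

    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```
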
The script builds upon YOLO's pose detection capabilities by adding hand keypoints as additional markers. This is achieved through the following steps:
- Hand Detection:
  - Using a fine-tuned YOLO model (`hand.pt`), hand positions are detected from input frames.
- Pose Extension:
  - Detected hand positions are matched to existing wrist keypoints from the pose detection model (`yolo11n-pose.pt`).
  - The wrist keypoints are extended to include additional hand markers (indices 17 and 18 in the keypoints array); a sketch of this step follows the list.
- Visualization:
  - Detected poses with hands are drawn on the frame, including confidence scores for hand detection.
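
The project's own matching code lives in its helpers (listed below); the sketch here is only an illustration of the idea, assuming COCO keypoint ordering (indices 9 and 10 are the left and right wrists) and hypothetical array shapes:

```python
import numpy as np

def extend_with_hands(keypoints, hand_centers, hand_confs, max_dist=80.0):
    """Append two hand keypoints (indices 17 and 18) to a (17, 3) pose array.

    keypoints    : (17, 3) array of (x, y, conf) COCO keypoints for one person
    hand_centers : (N, 2) array of detected hand box centres
    hand_confs   : (N,) array of hand detection confidences
    max_dist     : maximum pixel distance from a wrist for a hand to be matched
    """
    extended = np.vstack([keypoints, np.zeros((2, 3))])     # rows 17 and 18 start empty
    wrists = {17: keypoints[9, :2], 18: keypoints[10, :2]}  # 9/10 = left/right wrist

    for idx, wrist in wrists.items():
        if len(hand_centers) == 0:
            continue
        dists = np.linalg.norm(hand_centers - wrist, axis=1)
        best = int(np.argmin(dists))
        if dists[best] <= max_dist:
            extended[idx] = [*hand_centers[best], hand_confs[best]]
    return extended
```

A real implementation also has to deal with multiple people and ambiguous matches; in this project that work is done by the helpers below.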
 
- `adjust_hand_data`: Prepares hand detection results for integration.
- `match_hand`: Matches detected hands with pose keypoints, extending pose keypoints with hand data.
- `plot_poses`: Visualizes the poses with extended hand keypoints and their confidence scores.
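
For orientation, a drawing step in the spirit of `plot_poses` might look roughly like the following; the function name, signature, and styling here are illustrative rather than taken from the project:

```python
import cv2

def draw_extended_keypoints(frame, keypoints, conf_threshold=0.3):
    """Draw the 17 body keypoints plus the two extra hand markers (indices 17 and 18)."""
    for i, (x, y, conf) in enumerate(keypoints):
        if conf < conf_threshold:
            continue
        color = (0, 0, 255) if i >= 17 else (0, 255, 0)  # hands in red, body in green
        cv2.circle(frame, (int(x), int(y)), 4, color, -1)
        if i >= 17:
            # Annotate hand markers with their detection confidence
            cv2.putText(frame, f"{conf:.2f}", (int(x) + 5, int(y) - 5),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.5, color, 1)
    return frame
```
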
I am making this project public for anyone who might benefit from this model and approach. If you have any feedback or questions, feel free to reach out.