A powerful and extensible hand gesture detection framework for visionOS, providing declarative gesture definition and real-time hand tracking capabilities.
- 🎯 Declarative Gesture Definition - Define gestures using simple, composable conditions
- 🚀 High Performance - Optimized detection with early-return patterns and priority-based searching
- 🔧 Extensible Architecture - Protocol-based design for easy custom gesture creation
- 📱 visionOS 2.0+ Support - Built for Apple Vision Pro with ARKit hand tracking
- 🎨 Built-in Gesture Library - Common gestures like peace sign, thumbs up, pointing, etc.
- visionOS 2.0+
- Xcode 15.0+
- Swift 5.9+
Add HandGestureKit to your project through Xcode:
- File → Add Package Dependencies
- Enter the repository URL
- Select version and add to your target
Or add to your `Package.swift`:
```swift
dependencies: [
    .package(url: "https://github.com/yourusername/HandGestureKit.git", from: "1.0.0")
]
```
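Then list HandGestureKit as a dependency of the target that uses it (the target name below is illustrative):

```swift
targets: [
    .target(
        name: "MyApp",
        dependencies: ["HandGestureKit"]
    )
]
```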
```swift
import HandGestureKit
import RealityKit

struct PeaceSignGesture: SingleHandGestureProtocol {
    var id: String { "peace_sign" }
    var displayName: String { "Peace Sign" }
    var gestureName: String { "PeaceSign" }
    var description: String { "Index and middle fingers extended" }
    var priority: Int { 100 }

    func matches(_ gestureData: SingleHandGestureData) -> Bool {
        // Check if the index and middle fingers are straight
        guard gestureData.isFingerStraight(.index),
              gestureData.isFingerStraight(.middle) else {
            return false
        }
        // Check if the other fingers are bent
        return gestureData.isFingerBent(.thumb) &&
            gestureData.isFingerBent(.ring) &&
            gestureData.isFingerBent(.little)
    }
}
```
```swift
import ARKit
import RealityKit
import SwiftUI

struct HandTrackingView: View {
    var body: some View {
        RealityView { content in
            // Register the tracking system
            HandGestureTrackingSystem.registerSystem()
            // Set up hand tracking entities
            await setupHandTracking(content)
        }
    }

    private func setupHandTracking(_ content: RealityViewContent) async {
        // Request hand tracking permission
        let session = ARKitSession()
        _ = await session.requestAuthorization(for: [.handTracking])
        // Start spatial tracking
        let spatialSession = SpatialTrackingSession()
        let config = SpatialTrackingSession.Configuration(tracking: [.hand])
        _ = await spatialSession.run(config)
        // Create hand anchor entities...
    }
}
```
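How the hand anchor entities are built depends on how you attach HandGestureKit's components, so that step is left open above. As a minimal sketch, you can create anchors that follow each palm with RealityKit's standard `AnchorEntity(.hand(_:location:))` API; how `HandTrackingComponent` is initialized is framework-specific and omitted here:

```swift
// A minimal sketch: anchor entities that track each palm.
// AnchorEntity(.hand(_:location:)) is standard RealityKit API on visionOS.
// Attach HandGestureKit's HandTrackingComponent to these entities as the
// framework's documentation describes.
let leftPalm = AnchorEntity(.hand(.left, location: .palm))
let rightPalm = AnchorEntity(.hand(.right, location: .palm))
content.add(leftPalm)
content.add(rightPalm)
```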
```swift
import RealityKit

struct GestureDetectionSystem: System {
    static let query = EntityQuery(where: .has(HandTrackingComponent.self))

    // RealityKit requires every System to provide this initializer.
    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard let handComponent = entity.components[HandTrackingComponent.self] else { continue }
            // Create gesture data from hand tracking
            let gestureData = SingleHandGestureData(from: handComponent)
            // Check for gestures
            let peaceSign = PeaceSignGesture()
            if peaceSign.matches(gestureData) {
                print("Peace sign detected!")
            }
        }
    }
}
```
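Like any RealityKit `System`, a custom detection system must be registered once before it runs; `registerSystem()` is provided by RealityKit itself:

```swift
// Call once, e.g. at app launch, before the RealityView appears.
GestureDetectionSystem.registerSystem()
```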
- `BaseGestureProtocol` - Base protocol for all gesture types
- `SingleHandGestureProtocol` - Protocol for single-hand gestures
- `TwoHandsGestureProtocol` - Protocol for two-hand gestures
- `SignLanguageProtocol` - Protocol for sign language gestures
- `SerialGestureProtocol` - Protocol for sequential gesture patterns
- `SingleHandGestureData` - Encapsulates hand tracking data with convenience methods
- `HandTrackingComponent` - RealityKit component for hand tracking entities
- `GestureDetector` - Main gesture detection engine with priority-based matching
`SingleHandGestureData` provides numerous helper methods:
```swift
// Finger state checks
gestureData.isFingerStraight(.index)   // Check if the index finger is straight
gestureData.isFingerBent(.thumb)       // Check if the thumb is bent
gestureData.areAllFingersStraight()    // Check if all fingers are straight

// Palm direction
gestureData.isPalmFacing(.forward)     // Check palm orientation
gestureData.isPalmFacing(.up)

// Finger direction
gestureData.isFingerPointing(.index, direction: .forward)

// Complex conditions
gestureData.areAllFingersExceptBent([.index, .middle])  // All except the specified fingers are bent
```
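Combining these helpers, a thumbs-up check might look like the sketch below. This is purely illustrative (the framework ships its own thumbs-up gesture), and the `.up` pointing direction is an assumption inferred from the `.forward` and `.up` cases shown above:

```swift
// Hypothetical thumbs-up built only from the helpers shown above:
// the thumb is straight and points up, while every other finger is bent.
struct ThumbsUpGesture: SingleHandGestureProtocol {
    var id: String { "thumbs_up" }
    var displayName: String { "Thumbs Up" }
    var gestureName: String { "ThumbsUp" }
    var description: String { "Thumb extended upward, other fingers bent" }
    var priority: Int { 100 }

    func matches(_ gestureData: SingleHandGestureData) -> Bool {
        gestureData.isFingerStraight(.thumb)
            && gestureData.isFingerPointing(.thumb, direction: .up)  // assumed direction case
            && gestureData.areAllFingersExceptBent([.thumb])
    }
}
```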
Gestures are evaluated in priority order (lower values = higher priority):
```swift
struct HighPriorityGesture: SingleHandGestureProtocol {
    var priority: Int { 10 }    // Evaluated first
    // ...
}

struct LowPriorityGesture: SingleHandGestureProtocol {
    var priority: Int { 1000 }  // Evaluated last
    // ...
}
```
```swift
struct ClapGesture: TwoHandsGestureProtocol {
    // Metadata requirements (id, priority, etc.) omitted for brevity.
    func matches(_ gestureData: HandsGestureData) -> Bool {
        // Check if the palms are facing each other
        guard gestureData.arePalmsFacingEachOther else { return false }
        // Check the distance between the palms
        return gestureData.palmDistance < 0.1  // Within 10 cm
    }
}
```
The framework includes several optimization strategies:
- Early Return Pattern - Conditions are checked in order of selectivity
- Gesture Indexing - Gestures are pre-sorted by priority
- Lazy Evaluation - Complex calculations only when necessary
- Caching - Results are cached within frame updates
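Concretely, priority-ordered matching with early exit can be pictured like this. This is a simplified sketch using only the types shown above, not the actual `GestureDetector` implementation:

```swift
// Simplified sketch of priority-based, early-exit detection.
// Not the real GestureDetector; it only illustrates the strategy.
struct NaiveDetector {
    // Sorted once at construction: lower priority value = checked first.
    private let gestures: [any SingleHandGestureProtocol]

    init(gestures: [any SingleHandGestureProtocol]) {
        self.gestures = gestures.sorted { $0.priority < $1.priority }
    }

    // Returns the first (highest-priority) match, stopping the scan early.
    func firstMatch(in data: SingleHandGestureData) -> (any SingleHandGestureProtocol)? {
        gestures.first { $0.matches(data) }
    }
}
```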
Check out the `/Example` directory for a complete visionOS app demonstrating:
- Basic gesture detection
- Custom gesture creation
- Real-time visualization
- Performance monitoring
To run the example:
```bash
cd Example
open Package.swift  # Opens in Xcode
# Build and run for the visionOS Simulator or a device
```
We welcome contributions! Please see our Contributing Guidelines for details.
- Clone the repository
- Open `Package.swift` in Xcode
- Build and test on a visionOS simulator or device
```bash
swift test
```
HandGestureKit is available under the MIT license. See the LICENSE file for more info.
- Built for Apple Vision Pro and visionOS
- Inspired by natural human-computer interaction research
- Thanks to all contributors and the visionOS developer community
- 📧 Email: [email protected]
- 🐛 Issues: GitHub Issues
- 💬 Discussions: GitHub Discussions