The mouse and keyboard remain the dominant interface for most computer systems. Hand gestures offer an intuitive, effective, and touchless alternative, yet gesture-based systems have seen low adoption among end-users, primarily because accurately detecting in-air gestures poses numerous technical hurdles. This paper presents Hand Gesture Detection for American Sign Language using K-Nearest Neighbor with MediaPipe, a framework developed to bridge this gap. The framework learns to detect gestures from demonstrations, is customizable by end-users, and enables real-time gesture interaction with computers equipped with only RGB cameras.
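MediaPipe's hand tracker yields 21 landmarks per detected hand, which can be flattened into a feature vector and classified against stored demonstrations with k-nearest neighbors. The abstract does not specify the paper's exact feature encoding or distance metric, so the following is only an illustrative sketch: a plain Euclidean-distance KNN with majority voting over synthetic 42-dimensional (21 landmarks x 2 coordinates) vectors standing in for real MediaPipe output.

```python
import numpy as np

def knn_classify(query, examples, labels, k=3):
    """Label a flattened landmark vector by majority vote among
    its k nearest stored demonstrations (Euclidean distance).
    This is an illustrative sketch, not the paper's exact method."""
    dists = np.linalg.norm(examples - query, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = [labels[i] for i in nearest]
    return max(set(votes), key=votes.count)

# Toy demonstrations: 42-dim vectors (21 landmarks x 2 coords),
# clustered around two synthetic "gestures" A and B.
rng = np.random.default_rng(0)
demos_a = rng.normal(0.2, 0.01, size=(5, 42))
demos_b = rng.normal(0.8, 0.01, size=(5, 42))
examples = np.vstack([demos_a, demos_b])
labels = ["A"] * 5 + ["B"] * 5

query = rng.normal(0.2, 0.01, size=42)  # sample near gesture "A"
print(knn_classify(query, examples, labels))  # prints A
```

Because KNN requires no training beyond storing examples, adding a new end-user gesture amounts to appending a few demonstration vectors and a label, which matches the customizability the abstract describes.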
Copyright © 2021