Google has announced an app that reads hand gestures. The underlying solution is also available as a separate algorithm, so other developers can use it in their own apps.
The algorithm was implemented in MediaPipe, a framework for building multimodal applied ML pipelines. According to Google, the MediaPipe approach “provides high-fidelity hand and finger tracking by employing machine learning (ML) to infer 21 3D keypoints of a hand from just a single frame”. The solution works in real time and can detect and interpret two hands at the same time. Thanks to MediaPipe's modular components, called Calculators, the app handles model inference, media processing algorithms, and data transformations across a wide variety of devices and platforms.
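To make the “21 3D keypoints” concrete: the points cover the wrist plus four joints on each of the five fingers. The sketch below is an illustrative, simplified index layout in that spirit, not Google's published enum; the names and joint labels are assumptions for the example.

```python
# Simplified layout of 21 hand keypoints: one wrist point plus four
# joints per finger, ordered base to tip. (Illustrative naming only --
# not MediaPipe's actual landmark enum, which uses anatomical labels.)
HAND_LANDMARKS = ["WRIST"] + [
    f"{finger}_{joint}"
    for finger in ("THUMB", "INDEX", "MIDDLE", "RING", "PINKY")
    for joint in ("BASE", "PIP", "DIP", "TIP")
]

# Each keypoint carries 3D coordinates, so a single detected hand is
# 21 (x, y, z) triples inferred from one camera frame.
NUM_KEYPOINTS = len(HAND_LANDMARKS)   # 21
TIP_INDICES = [i for i, name in enumerate(HAND_LANDMARKS) if name.endswith("_TIP")]
```

A gesture classifier can then operate on these 21 triples directly, e.g. by comparing fingertip positions against the wrist.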
The current version improves on earlier AI hand-detection apps. Earlier approaches lost track of parts of the hand when fingers bent or the hand moved in unusual ways. The app presented by Google instead overlays a graph of 21 keypoints across the fingers, palm, and back of the hand. This provides comprehensive tracking of the whole hand, even while it is moving.
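One reason the keypoint graph handles finger bending is that bending becomes measurable geometry: the angle at a finger's middle joint shrinks as the finger curls. The following is a hedged sketch of that idea, not Google's actual classifier; the function names and the 160° threshold are assumptions chosen for illustration.

```python
import math

def angle_at(p_prev, p_joint, p_next):
    """Angle in degrees at p_joint, formed by its two neighboring keypoints."""
    v1 = [a - b for a, b in zip(p_prev, p_joint)]
    v2 = [a - b for a, b in zip(p_next, p_joint)]
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(a * a for a in v2))
    # Clamp to guard against floating-point drift outside [-1, 1].
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def finger_is_bent(base, mid, tip, threshold_deg=160.0):
    """Treat a finger as bent when its middle-joint angle drops below ~160 degrees."""
    return angle_at(base, mid, tip) < threshold_deg

# Straight finger: three collinear keypoints give an angle near 180 degrees.
straight = finger_is_bent((0, 0, 0), (0, 1, 0), (0, 2, 0))   # False
# Curled finger: the tip folds back toward the base, giving a sharp angle.
curled = finger_is_bent((0, 0, 0), (0, 1, 0), (0.8, 1.2, 0))  # True
```

Because each keypoint is a 3D coordinate, the same angle test works regardless of how the hand is rotated relative to the camera.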
Google representatives are excited about future improvements to the app. The decision to make the algorithm available to other developers will benefit the wider technology community and could help deaf people. It will allow other experts to improve the algorithm and extend it with more robust solutions.