Our SDK has the following specifications.
We offer a powerful camera-based hand tracking and gesture recognition SDK for free evaluation. Technical support is also provided free of charge.
The evaluation SDK supports third-party RGB, NIR, and/or depth cameras in front-facing and egocentric positions. It tracks hands and recognizes gestures using a 21-joint hand skeleton, and provides positional coordinates for all skeletal joints in real time. Each hand is assigned a unique ID along with face, side, angle, size, and (pointing) direction values. The SDK can track overlapping hands while retaining hand IDs.
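To make the per-hand data concrete, here is a minimal sketch of what the tracking output described above might look like as a C++ structure. All type and field names here are hypothetical illustrations; the SDK's actual headers and layout will differ.

```cpp
#include <array>
#include <cstdint>

// Hypothetical sketch of the per-hand output: 21 joints plus the
// per-hand attributes (ID, face, side, angle, size, direction).
struct Joint3D {
    float x, y, z;  // positional coordinates, reported in real time
};

enum class HandSide { Left, Right };
enum class HandFace { Palm, Back };

struct TrackedHand {
    std::uint32_t id;                // unique ID, retained even when hands overlap
    HandSide side;
    HandFace face;
    float angle;                     // hand rotation
    float size;                      // apparent hand size
    Joint3D direction;               // pointing-direction vector
    std::array<Joint3D, 21> joints;  // 21-joint hand skeleton
};
```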
Sample gestures are included in four categories: static, dynamic, trigger, and drawing. In the static category, the SDK supports pointing, thumbs, OK, victory, open-palm, and fist gestures; in the dynamic category, it supports tapping, swiping, and zooming gestures. Drawing gestures in the form of letters, numbers, symbols, shapes, and characters are also provided. Each supported gesture can be performed in a variety of ways.
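The gesture taxonomy above can be sketched as a simple category mapping. The enum values and gesture identifiers below are illustrative only, not the SDK's actual API names.

```cpp
#include <map>
#include <string>

// Hypothetical taxonomy of the sample gestures listed above.
enum class GestureCategory { Static, Dynamic, Trigger, Drawing };

// Map each sample gesture (illustrative names) to its category.
const std::map<std::string, GestureCategory> kSampleGestures = {
    {"pointing",  GestureCategory::Static},
    {"thumbs",    GestureCategory::Static},
    {"ok",        GestureCategory::Static},
    {"victory",   GestureCategory::Static},
    {"open_palm", GestureCategory::Static},
    {"fist",      GestureCategory::Static},
    {"tap",       GestureCategory::Dynamic},
    {"swipe",     GestureCategory::Dynamic},
    {"zoom",      GestureCategory::Dynamic},
};
```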
The evaluation SDK also supports body tracking using a joint-skeleton approach. The body skeleton covers 13 key upper-body joints, including the eyes, ears, nose, shoulders, elbows, wrists, and palm centers. It is linked to a 21-joint skeleton for each hand attached to the body, thereby allowing hand movements and gestures to be attributed to a specific individual in a group.
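A minimal sketch of how the 13-joint body skeleton could be linked to per-hand skeletons, so that a gesture attaches to a specific person, is shown below. Again, all names and the exact linkage are assumptions for illustration, not the SDK's real types.

```cpp
#include <array>
#include <cstdint>
#include <optional>

// Hypothetical linkage between a 13-joint upper-body skeleton and
// the 21-joint skeletons of the hands attached to that body.
struct Joint3D { float x, y, z; };

struct HandSkeleton {
    std::uint32_t hand_id;
    std::array<Joint3D, 21> joints;
};

struct BodySkeleton {
    std::uint32_t person_id;
    std::array<Joint3D, 13> joints;         // eyes, ears, nose, shoulders,
                                            // elbows, wrists, palm centers
    std::optional<HandSkeleton> left_hand;  // absent when not in view
    std::optional<HandSkeleton> right_hand;
};
```

Linking hands to a body this way is what lets a multi-person scene report, for example, "person 2's right hand performed a swipe" rather than an anonymous gesture.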
The evaluation SDK can be deployed on ARM- or Intel-based platforms running Windows, Linux, macOS, iOS, and Android. It is written in C++ but comes with plugins for C#, Java, Python, and Unity. The maximum supported camera distance for hand tracking is a function of the camera's imager resolution (PPI) and lens field of view (FOV).
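The resolution/FOV dependence can be illustrated with a simple pinhole-camera estimate: a hand must span some minimum number of pixels to be trackable, and that pixel footprint shrinks with distance and with wider lenses. The nominal hand width and minimum pixel count below are hypothetical placeholders, not the SDK's actual model.

```cpp
#include <cmath>

// Rough pinhole-camera estimate (not the SDK's actual formula) of the
// maximum hand tracking distance. Assumes a nominal hand width and a
// hypothetical minimum pixel footprint the tracker needs.
double maxTrackingDistanceMeters(int horizontal_resolution_px,
                                 double horizontal_fov_deg,
                                 double hand_width_m = 0.09,
                                 double min_hand_px = 40.0) {
    const double pi = std::acos(-1.0);
    const double half_fov_rad = horizontal_fov_deg * pi / 180.0 / 2.0;
    // Field width at distance d is 2 * d * tan(FOV/2), so the hand spans
    // (hand_width / field_width) * resolution pixels. Solving
    // pixels >= min_hand_px for d gives the maximum distance.
    return hand_width_m * horizontal_resolution_px /
           (min_hand_px * 2.0 * std::tan(half_fov_rad));
}
```

Under these assumptions, a 1280-pixel-wide imager with a 90-degree horizontal FOV tracks out to roughly 1.4 m, while narrowing the lens to 60 degrees at the same resolution extends the range, matching the spec's point that range is set jointly by imager resolution and lens.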