GestureRecognitionToolkit  Version: 0.1.0
Gesture Recognition Toolkit

Introduction

The Gesture Recognition Toolkit (GRT) is a cross-platform, open-source, C++ machine learning library for real-time gesture recognition. The GRT has been designed to:

  • be easy to use and integrate into your existing C++ projects
  • be compatible with any type of sensor or data input
  • be easy to rapidly train with your own gestures (a short sketch of this workflow follows this list)
  • be easy to extend and adapt with your own custom processing or feature extraction algorithms (if needed)
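
Below is a minimal sketch of what that workflow can look like. It assumes GRT has been built and is on your include path; the class names (ClassificationData, GestureRecognitionPipeline, ANBC, VectorFloat) follow the GRT pipeline API as documented on the wiki and may differ slightly in older versions (for example, VectorDouble instead of VectorFloat). The sensor values are placeholders, so treat this as an outline rather than a drop-in example.

    #include <GRT/GRT.h>   // include path may differ depending on how GRT was installed
    #include <iostream>
    #include <cstdlib>
    using namespace GRT;

    int main(){
        // Training data: 3-dimensional samples (e.g. x/y/z from an accelerometer)
        ClassificationData trainingData;
        trainingData.setNumDimensions( 3 );

        // In a real project you would record many samples per gesture from your sensor;
        // here two postures are faked with slightly jittered placeholder values
        for( UINT i=0; i<20; i++ ){
            VectorFloat flat(3);
            flat[0] = 0.01*i; flat[1] = -0.01*i; flat[2] = 1.0 + 0.005*i;
            trainingData.addSample( 1, flat );       // class 1: device held flat

            VectorFloat upright(3);
            upright[0] = 1.0 + 0.005*i; upright[1] = 0.01*i; upright[2] = -0.01*i;
            trainingData.addSample( 2, upright );    // class 2: device held upright
        }

        // Plug a classifier into a pipeline and train it on the recorded samples
        GestureRecognitionPipeline pipeline;
        pipeline.setClassifier( ANBC() );
        if( !pipeline.train( trainingData ) ) return EXIT_FAILURE;

        // Real-time prediction on a new sensor reading
        VectorFloat input(3);
        input[0] = 0.05; input[1] = 0.0; input[2] = 0.95;
        if( pipeline.predict( input ) ){
            std::cout << "Predicted class: " << pipeline.getPredictedClassLabel() << std::endl;
        }
        return EXIT_SUCCESS;
    }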

The GRT features a large number of algorithms that can be used to:

  • recognize static postures (such as if a user has their hands in a specific posture or if a device fitted with an accelerometer is being held in a distinct orientation)
  • recognize dynamic temporal gestures (such as a swipe or tap gesture)
  • perform regression (i.e. continuously map an input signal to an output signal, such as mapping the angle of a user's hands to the angle by which a steering wheel should be turned in a driving game); a sketch of how these three tasks map onto the GRT pipeline follows this list
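
The same pipeline abstraction covers all three tasks; broadly speaking, only the training-data container and the algorithm plugged into the pipeline change. The sketch below illustrates this, again using class names from the GRT API (ANBC, DTW, LinearRegression, TimeSeriesClassificationData, RegressionData) that may vary across versions; it is an assumption-laden outline, not a complete program.

    #include <GRT/GRT.h>
    using namespace GRT;

    void configureExamplePipelines(){
        // 1) Static postures: one feature vector per sample, any classifier (e.g. ANBC)
        GestureRecognitionPipeline posturePipeline;
        posturePipeline.setClassifier( ANBC() );
        // ...train with ClassificationData (class label + feature vector per sample)

        // 2) Dynamic temporal gestures: a whole time series per sample,
        //    e.g. Dynamic Time Warping
        GestureRecognitionPipeline gesturePipeline;
        gesturePipeline.setClassifier( DTW() );
        // ...train with TimeSeriesClassificationData (class label + matrix of frames per sample)

        // 3) Regression: continuously map an input vector to an output vector
        GestureRecognitionPipeline regressionPipeline;
        regressionPipeline.setRegressifier( LinearRegression() );
        // ...train with RegressionData (input vector + target vector per sample)
    }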

You can access the main GRT wiki here.