A group of scientists led by Prof. Anil Prabhakar of the Assistive Technologies Laboratory is pioneering a gesture recognition device to aid child development.

In short: iGest is a kinematic-sensor-based system designed to learn a child's existing motor capacity through a gesture recognition algorithm. Movement is captured by kinematic sensors (accelerometers, gyroscopes) built into a wearable device. Through these sensors, gross movements – arm movements, head movements and gestures – are recognized and recorded. The system then learns gesture models that are natural to the child. These are associated with a dictionary of sentences or actions, and the data can be transmitted to a phone. The data can be used to help the child operate a particular device, or to reinforce motor skills by encouraging arm movement to play video games. The unique contribution of this system is that it harnesses the existing motor capacity and movement of children to enable communication.
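The article does not specify the recognition algorithm, but the idea of learning a child's natural movements and mapping them onto a phrase dictionary can be illustrated with a small, hypothetical template-matching sketch. Dynamic time warping (DTW), the gesture names, the phrases and the threshold below are all assumptions made for illustration, not iGest internals.

```python
# Hypothetical sketch: match an incoming movement trace against gesture
# templates learned from the child's own motion, then look up a phrase.
# DTW, the template names, the phrases and the threshold are illustrative
# assumptions, not details taken from iGest.
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Dynamic time warping distance between two (N, 3) sensor traces."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # deletion
                                 cost[i, j - 1],      # insertion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])

# Templates would be recorded from the child's natural movements; random
# placeholders are used here only so the sketch runs end to end.
rng = np.random.default_rng(0)
templates = {
    "raise_arm": rng.standard_normal((40, 3)),
    "head_nod": rng.standard_normal((30, 3)),
}
phrase_dictionary = {"raise_arm": "I am hungry", "head_nod": "Yes"}

def recognise(trace: np.ndarray, threshold: float = 25.0) -> str | None:
    """Return the phrase for the best-matching template, or None."""
    scores = {name: dtw_distance(trace, tpl) for name, tpl in templates.items()}
    best = min(scores, key=scores.get)
    return phrase_dictionary[best] if scores[best] < threshold else None

# A noisy repetition of the recorded "head_nod" movement is recognised.
print(recognise(templates["head_nod"] + 0.05 * rng.standard_normal((30, 3))))
```

Because the templates come from the child's own repertoire rather than a fixed gesture set, the same matching step adapts to whatever movements the child can already produce.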

Details:
iGest is a wearable gesture-to-speech device. It tracks the gestures of people with speech impairments and speaks for them, helping those whose speech is affected by conditions such as cerebral palsy, dysarthria and cluttering. Typical operation of such devices requires controlled motor interaction, which presumes a level of consistency that cannot be achieved with severe motor impairments, particularly athetosis (constant writhing movements) and ataxia (excessive involuntary movements at the end of a gesture). The problem is compounded when the child is also visually impaired, a condition found in 60% of children with CP, since the lack of visual feedback further limits precise motor control.

[Figure: Android API for iGest]

[Figure: iGest Unit]

[Figure: Successful gesture recognition]

iGest also helps physiotherapists monitor occupational therapy. Monitoring the treatment gives the therapist a better picture of the subject's motor skills, which helps them judge the effectiveness of the procedure. Subjects can undergo occupational therapy without the therapist being present; the data collected during the sessions is reviewed by the therapist at their convenience to assess how well the prescribed treatment is working and to adjust it accordingly.
iGest is equipped with a three-axis accelerometer, gyroscope and magnetometer to record the subject's inertial data across all three spatial dimensions. The differential raw data from the sensors is used to build a direction cosine matrix that gives the orientation of the limb. The resulting angular value, along with a time-stamp, is transmitted and/or stored in memory. The Android application on the phone recognizes the gestures and converts them to predefined speech. In physiotherapy, the tracked body movements are stored and later reviewed by the therapist to assess the therapy performed. The device also monitors movement in real time and gives the patient feedback on their training exercises.
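As a rough illustration of the orientation step described above, the sketch below integrates gyroscope rates into a direction cosine matrix (DCM) and emits a time-stamped limb angle. The sampling rate, axis conventions, the elevation-angle readout and the packet format are assumptions made for illustration, not taken from the iGest firmware; the actual device also fuses accelerometer and magnetometer data.

```python
# Minimal sketch of DCM-based orientation tracking with a time-stamped
# output packet. Sampling rate, axes and packet layout are assumptions.
import time
import numpy as np

def dcm_update(R: np.ndarray, gyro_rad_s: np.ndarray, dt: float) -> np.ndarray:
    """Propagate the DCM by one step using the small-angle rotation
    implied by the gyroscope rates (rad/s) over the interval dt (s)."""
    wx, wy, wz = gyro_rad_s * dt
    omega = np.array([[0.0, -wz,  wy],
                      [ wz, 0.0, -wx],
                      [-wy,  wx, 0.0]])
    R = R @ (np.eye(3) + omega)           # first-order integration
    u, _, vt = np.linalg.svd(R)           # re-orthonormalise to limit drift
    return u @ vt

def limb_elevation_deg(R: np.ndarray) -> float:
    """Angle of the limb's x-axis above the horizontal plane (degrees)."""
    return float(np.degrees(np.arcsin(-R[2, 0])))

R = np.eye(3)
dt = 0.01                                  # assumed 100 Hz sampling
gyro_sample = np.array([0.0, 0.5, 0.0])    # placeholder sensor reading, rad/s
R = dcm_update(R, gyro_sample, dt)

# Time-stamped angular value, ready to be stored or sent to the phone app.
packet = {"t": time.time(), "angle_deg": limb_elevation_deg(R)}
print(packet)
```

In a complete system this loop would run continuously on the wearable unit, with the stream of time-stamped angles feeding both the gesture recogniser on the phone and the therapy logs reviewed later by the therapist.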