A system that can recognize human gestures could provide a new way for people with physical disabilities to interact with computers. A related system for able-bodied users could also make virtual worlds more realistic. The system is described in detail in a forthcoming issue of the International Journal of Arts and Technology.
Manolya Kavakli of the Virtual and Interactive Simulations of Reality Research Group at Macquarie University, Sydney, Australia, explains that standard input devices, the keyboard and computer mouse, do not closely mimic natural hand motions such as drawing and sketching. Moreover, these devices were developed neither for ergonomic use nor for people with disabilities.
She and her colleagues have developed a computer system architecture that can carry out "gesture recognition". In this system, the person wears "datagloves" fitted with LEDs that are tracked by two pairs of computer webcams working together to produce an all-round binocular view. This allows the computer to monitor the person's hand and shoulder movements. The input can then be fed to a program, such as a game or simulator, or used to control a character, an avatar, in a 3D virtual environment.
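To make the tracking step concrete, the following minimal Python sketch (not the researchers' code; the stereo-rig parameters and function names are assumptions for illustration) shows how one LED seen by a calibrated pair of webcams could be triangulated into a 3D position that then drives an avatar's hand.

```python
# Minimal sketch of stereo triangulation for a single tracked LED.
# Assumes an idealised parallel camera pair: focal length in pixels,
# baseline in metres, and matching pixel coordinates from each view.

from dataclasses import dataclass

@dataclass
class StereoRig:
    focal_px: float    # focal length of each webcam, in pixels
    baseline_m: float  # distance between the two cameras, in metres

def triangulate(rig: StereoRig, left_xy: tuple, right_xy: tuple) -> tuple:
    """Recover an (x, y, z) position in metres for one LED seen by both cameras."""
    xl, yl = left_xy
    xr, _ = right_xy
    disparity = xl - xr                    # pixel offset between the two views
    if disparity <= 0:
        raise ValueError("LED must be in front of both cameras")
    z = rig.focal_px * rig.baseline_m / disparity
    x = xl * z / rig.focal_px
    y = yl * z / rig.focal_px
    return (x, y, z)

if __name__ == "__main__":
    rig = StereoRig(focal_px=800.0, baseline_m=0.12)
    # Pixel coordinates of the same LED in the left and right images
    hand_pos = triangulate(rig, left_xy=(210.0, 95.0), right_xy=(186.0, 95.0))
    print("estimated hand position (m):", hand_pos)  # would drive the avatar's hand
```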
"We developed two gesture recognition systems: DESigning In virtual Reality (DesIRe) and DRiving for disabled (DRive). DesIRe allows any user to control dynamically in real-time simulators or other programs. DRive allows a quadriplegic person to control a car interface using input from just two LEDs on an over-shoulder garment. For more precise gestures, a DataGlove user can gesture using their fingers.
The system architecture includes the following components: the Vizard Virtual Reality Toolkit, an immersive projection system (VISOR), an optical tracking system (the Precision Position Tracker, or PPT) and a data input system, Kavakli explains. The DataGlove input is quite simple at the moment, but future work will increase its sensitivity to specific gestures, such as grasping, strumming, stroking, and other hand movements.
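The pipeline might be wired together roughly as in the sketch below; the class names and the crude grasp rule are illustrative stand-ins, not the actual PPT, DataGlove or Vizard APIs.

```python
# Schematic wiring of the described components: an optical tracker and a glove
# supply input, a gesture recogniser interprets it, and the result would be
# passed on to the VR scene. All names and values here are placeholders.

class OpticalTracker:          # stands in for the PPT optical tracking system
    def read_led_positions(self):
        return {"left_shoulder": (0.3, 1.4, 0.9), "right_shoulder": (-0.3, 1.4, 0.9)}

class DataGloveInput:          # stands in for the glove's finger-flex data
    def read_finger_flex(self):
        return [0.1, 0.8, 0.8, 0.7, 0.2]   # thumb..little finger, 0 = open, 1 = bent

class GestureRecogniser:
    def classify(self, flex):
        # Very crude rule: three or more bent fingers reads as a grasp
        return "grasp" if sum(f > 0.5 for f in flex) >= 3 else "open_hand"

def main():
    tracker, glove, recogniser = OpticalTracker(), DataGloveInput(), GestureRecogniser()
    gesture = recogniser.classify(glove.read_finger_flex())
    markers = tracker.read_led_positions()
    print(f"gesture={gesture}, markers={markers}")  # would be fed to the VR scene

main()
```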
Source: Inderscience Publishers