
COGAIN – Gaming Without Your Hands

Stephen Vickers of De Montfort University, Leicester, UK, has developed a new technology for people with severe motor disabilities, allowing them to play 3D computer games using only their eyes. The new technology brings hope to disabled gaming enthusiasts who usually cannot enjoy new games that use advanced, three-dimensional graphics.
Playing “Breakout” without hands (Credit: cogain.org)

Since the 1990s, people with conditions such as Motor Neuron Disease (MND) have used gaze technology to control 2D desktop environments and to communicate using visual keyboards. Typically, users guide a cursor with their eyes, staring at objects for a set time to emulate a mouse click. However, this approach is laborious and cannot match the speed and accuracy demanded by real-time 3D games, according to Vickers, the lead researcher of the project.

Virtual worlds require gamers to perform a whole suite of commands, including moving their character or avatar, altering their viewpoint on the scene, manipulating objects, and communicating with other players. The new technology, developed by the team as part of the EU-funded project Communication by Gaze Interaction (COGAIN), enables disabled gamers to accomplish these commands more easily than before. Furthermore, according to Vickers, the system also addresses a privacy concern: some users might prefer not to reveal their disability in the virtual world. “Even though a user in, say, Second Life, might look as if they are able-bodied, if they can’t operate and communicate as fast as everyone else, they could be perceived as having a disability,” he said.
Eye-gaze systems bounce infrared light from LEDs at the bottom of a computer monitor and track a person’s eye movements using stereo infrared cameras. The new system can calculate where on the screen the user is looking with an accuracy of about 5 mm. Vickers’ software retains the traditional point-and-click interface but adds extra functions to speed up certain commands.
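As a rough illustration of how dwell-based selection of this kind can be emulated in software, the sketch below turns a stream of gaze coordinates into click events once the gaze has stayed within a small radius for a fixed dwell time. The class name, radius, and dwell threshold are illustrative assumptions, not details of the COGAIN software.

```python
import math

# Hypothetical dwell-click detector; thresholds are illustrative only.
DWELL_RADIUS_PX = 20   # gaze must stay within this radius...
DWELL_TIME_S = 0.8     # ...for this long to count as a "click"

class DwellClicker:
    def __init__(self, radius=DWELL_RADIUS_PX, dwell=DWELL_TIME_S):
        self.radius = radius
        self.dwell = dwell
        self._anchor = None   # (x, y) where the current fixation started
        self._start = None    # timestamp of the fixation start

    def update(self, x, y, t):
        """Feed one gaze sample; return the (x, y) of a click, or None."""
        if self._anchor is None:
            self._anchor, self._start = (x, y), t
            return None
        ax, ay = self._anchor
        if math.hypot(x - ax, y - ay) > self.radius:
            # Gaze moved away: start a new fixation.
            self._anchor, self._start = (x, y), t
            return None
        if t - self._start >= self.dwell:
            click_at = self._anchor
            self._anchor = self._start = None   # reset after the click
            return click_at
        return None

# Example: simulate a user staring at (300, 200) for one second.
clicker = DwellClicker()
for i in range(11):
    result = clicker.update(300, 200, i * 0.1)
    if result:
        print("dwell click at", result)
```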
To switch between functions, the user glances momentarily off-screen in a particular direction. This lets the user move between modes, such as rotating the avatar or changing the viewpoint. To avoid unintentionally selecting an item while looking around the screen, the team implemented a “gaze gesture” that temporarily turns off the eye-gaze functions altogether. “The eyes are perceptual organs, not designed for pointing and selecting,” explains Vickers. “You can’t turn them off, like you can lift your hand off the mouse.”
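The sketch below shows one way such off-screen glances and a gaze-gesture “off switch” could be wired up: glancing off one screen edge cycles the control mode, while glancing off another edge suspends gaze input entirely. The edge assignments, mode names, and screen resolution are assumptions for illustration, not the project’s actual scheme.

```python
# Hypothetical gaze mode switcher; screen size and modes are assumptions.
SCREEN_W, SCREEN_H = 1920, 1080
MODES = ["point-and-click", "rotate avatar", "change viewpoint"]

class GazeModeSwitcher:
    def __init__(self):
        self.mode_index = 0
        self.suspended = False       # True while gaze input is switched off
        self._was_offscreen = False  # edge-trigger so one glance = one switch

    def handle_sample(self, x, y):
        """Feed one gaze sample; return the active mode, or None if suspended."""
        offscreen = not (0 <= x < SCREEN_W and 0 <= y < SCREEN_H)
        if offscreen and not self._was_offscreen:
            if x >= SCREEN_W:
                # Glance off the right edge: cycle to the next control mode.
                self.mode_index = (self.mode_index + 1) % len(MODES)
            elif x < 0:
                # Glance off the left edge: the "gaze gesture" that pauses
                # gaze-driven input so the user can look around freely.
                self.suspended = not self.suspended
        self._was_offscreen = offscreen
        return None if self.suspended else MODES[self.mode_index]

switcher = GazeModeSwitcher()
for sample in [(500, 300), (2000, 300), (500, 300), (-50, 300), (500, 300)]:
    print(sample, "->", switcher.handle_sample(*sample))
```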
Playing ‘The Castle’ with no hands (Credit: cogain.org)

The developments are “hugely important”, according to Mick Donegan, a COGAIN partner who works with severely disabled children and adults at the Oxford-based charity the ACE Centre. He explains that enabling disabled people to express themselves and engage with others in ways they cannot experience in real life can have a positive effect on their self-esteem and motivation.

TFOT has also covered Ambient Corporation’s voiceless phone call, which can help the disabled, and the SIAFU, a concept PC specially designed for blind people. Other related TFOT stories cover the Tai-Chi, a futuristic interface that will allow users to convert virtually any tangible object into an interactive surface, and Microsoft’s LucidTouch, a mobile electronic unit that allows the user to control onscreen applications by touching the back of the see-through device.
Vickers hopes that the project will begin its software trials within the next year. For more information about this project, see COGAIN’s website.
