Commands Through Movement

The components of a PrimeSense controller. Source: PrimeSense

Researchers at PrimeSense in Tel Aviv, Israel, are creating new systems that allow television sets and other consumer electronics to interpret body movements as commands. Using cameras, infrared depth sensors, and specialized microchips, the systems track motion by correlating camera images with information about how light bounces within a projected infrared grid. The detected motions can then serve as commands for gaming systems, televisions, and other electronics.
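PrimeSense's actual depth pipeline is proprietary, but the underlying structured-light principle resembles stereo triangulation: a dot from the projected infrared grid shifts sideways in the camera image in proportion to one over its depth. The sketch below illustrates that relationship; the baseline and focal-length constants are hypothetical values chosen only for the example.

```python
# Illustrative sketch of structured-light depth estimation.
# The constants are assumptions, not PrimeSense specifications.

BASELINE_M = 0.075   # assumed projector-to-camera separation (meters)
FOCAL_PX = 580.0     # assumed camera focal length (pixels)

def depth_from_disparity(disparity_px: float) -> float:
    """Convert the observed sideways shift of a projected infrared
    dot (its disparity, in pixels) into a depth estimate in meters."""
    if disparity_px <= 0:
        raise ValueError("pattern dot not matched")
    return BASELINE_M * FOCAL_PX / disparity_px

# A dot that shifted 14.5 pixels from its reference position would be
# estimated at 0.075 * 580 / 14.5 = 3.0 meters from the sensor.
print(depth_from_disparity(14.5))
```

Repeating this calculation for every dot in the grid yields a per-pixel depth map, from which body and hand positions can be tracked frame to frame.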

The first commercial use of PrimeSense technology is the Microsoft Kinect gaming system, which creates and controls avatars based on the tracked movements of users. Following that successful test of their technology, PrimeSense is partnering with Asus to manufacture the WAVI Xtion television controller.

Cameras placed in front of a television detect when users wave their hands in front of the set. This launches an onscreen menu system that lets users make selections by pointing, move between options by gesturing left or right, and use other gestures to rewind video clips, adjust the volume, and perform other basic controller tasks.
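The mapping from recognized gestures to controller actions described above can be sketched as a simple dispatch table. The gesture names and command strings below are assumptions for illustration, not the WAVI Xtion's actual API.

```python
# Hypothetical gesture-to-command dispatch, loosely modeled on the
# WAVI Xtion behavior described in the article. All names are assumed.

COMMANDS = {
    "wave": "open_menu",          # waving launches the onscreen menu
    "point": "select",            # pointing makes a selection
    "swipe_left": "previous_option",
    "swipe_right": "next_option",
    "circle_back": "rewind_clip",
    "raise_hand": "volume_up",
}

def dispatch(gesture: str) -> str:
    """Translate a recognized gesture into a controller command,
    ignoring gestures the system does not know."""
    return COMMANDS.get(gesture, "ignore")

print(dispatch("swipe_left"))  # previous_option
print(dispatch("shrug"))       # ignore
```

Because the current controller distinguishes motions rather than hand shapes, a flat table like this covers the available command set; recognizing hand positions as well would multiply the number of usable keys.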

The system uses USB 2.0 for both power and data communications. It can detect movements in a direct line of sight between 0.8 and 3.5 meters from the device. The basic device is rectangular with curved edges and has a small footprint: 5.5 inches (14 centimeters) long, 1.4 inches (3.5 centimeters) wide, and 2 inches (5 centimeters) deep.

PrimeSense is also working on control systems for computers and plans to introduce a $200 prototyping kit for developers who wish to experiment with other potential uses of the technology or integrate gestural controls into their applications. Potential applications could extend well beyond gaming and consumer electronics once individual developers are free to integrate gestural controls into their own systems.

Right now, the PrimeSense controller mainly recognizes and distinguishes between different motions. It cannot identify different hand positions. PrimeSense hopes to develop another version of their system capable of detecting different hand positions, thus greatly increasing the number of commands available to users of the system.

TFOT has previously reported on other innovative input and control systems including the LucidTouch tablet that allows users to control applications by touching the back of the clear device, a “tangible acoustic interface” that allows users to turn almost any object into an input device, and the COGAIN system that tracks eye movements and translates them into commands.

Read more about the PrimeSense partnership with Asus in this press release. Learn more about the underlying technology on the PrimeSense product technology page.