Control your personal computer with your mind

CIOL Bureau

LAS VEGAS, USA: Tech visionaries have long dreamed of the day when PCs, TVs and phones can be controlled with a wave of the hand or even the blink of an eye.

"Natural user interface" technologies on display at last week's Consumer Electronics Show suggest that vision is inching closer to the mainstream -- tearing down barriers between user and device, and dispensing with unwieldy keyboards and remotes.

The technology, stoked in the public imagination by the sci-fi hit film "Minority Report", could be approaching a tipping point, empowering applications well beyond simple user commands, according to industry insiders.

"Control everything without touching it -- it's moving that way faster than ever," said Janine Kutliroff, CEO and founder of Omek Interactive, an Israeli company that makes software for gesture recognition through 3-D sensors, so you can play games or manipulate a TV just by moving your hands and body. (For a video of the technology click: bit.ly/fb9ZVi )

"Cameras are going to get smaller and cheaper. There's a lot of competitive technology out there," said Kutliroff, whose company was one of a handful showing off gesture control technologies at CES in Las Vegas.

The opportunities for using sensors, cameras and voice recognition to make everyday objects 'intelligent' are almost endless, promoters of the new technology say.

"We see a whole world where machines of any kind interact with you," said Uzi Breier, chief marketing officer for PrimeSense, an Israeli company that built some of the technology behind Microsoft's Kinect system, and has teamed up with Taiwan's Asustek to bring gesture-controlled TV and computer functions to the living room screen this year ( bit.ly/dHwAGq ).

His company is providing sensors for iRobot Corp's latest robots, including a new Roomba automatic vacuum cleaner that can actually 'see' dirt and head for it, rather than aimlessly trundling around the room.

He predicts sensors will soon be used in homes, so heating and cooling systems can recognize who is in a room and set the temperature to predetermined levels, and in cars to adjust settings to the driver.

Omek is helping pioneer 'digital signage' -- signs in stores that interact with shoppers, initiating conversations when people walk up close or linger in an area.

Kutliroff sees this technology reaching far into the commercial sphere, creating virtual salespeople, providing virtual clothes-fitting, or aiding physical rehabilitation by sending 3-D images of a patient in real time to a remote therapist. Security is also a potential avenue, as cameras learn to identify people based on the way their bodies move.

"These are the applications that are going to push this to be more than just another game console," said Kutliroff.

Microsoft in front?

Gaming has so far been the only mainstream use of gesture-recognition technology. Microsoft Corp's Kinect add-on for the Xbox -- which allows gamers to move avatars on screen just through body motions -- has already sold 8 million units in just over two months on the market. Its three demo booths were among CES's most popular destinations.

The logical step is for Microsoft to bring the same technology to its Windows operating system, allowing users to manipulate documents or move photos around a screen or projection, as Tom Cruise does in Minority Report. ( bit.ly/1mt24z )

But the world's largest software company is not talking about any plans it may have in that direction. For now, its ambition is to fine-tune Kinect and bring the avatar into social on-screen situations.

"The next step will be facial expressions, when you move your mouth for example," said Jose Pinero, an executive at Microsoft's Entertainment Services business, at CES. "Our intention is to bring this technology to games and into other entertainment options."

In the meantime, Microsoft's research labs are working on intriguing new possibilities, such as its 'Skinput' project, a way of controlling devices just by touching your arm or hand in different places ( bit.ly/btRdR6 ), and its LightSpace project for manipulating virtual documents, the closest it has come to the Minority Report scenario ( bit.ly/hLI6HH ).

Though neither of these is yet a product, and may never be, Microsoft has signaled its ambitions in gesture recognition with its October purchase of chipmaker and software company Canesta, which specializes in 3-D sensing technology.

Not so fast

Not everyone is convinced that the revolution is nigh.

Many still like touching things, and the mouse will likely remain the tool of choice for precise jobs, from desktop publishing to graphic design, said IDC analyst Al Hilwa.

"These user interfaces will continue to be the most effective for many areas," said Hilwa. "But overall, technology marches towards more diversity and alternatives."

Several other companies at CES showed off alternative uses for hands-free interaction with computers.

Belgium's Softkinetic-Optrima demonstrated a home video-conferencing system employing 3-D cameras, so users do not have to sit close to the camera, as well as a gesture-based TV and media control system. ( bit.ly/hvzoAS )

Softkinetic, like Omek, uses "time of flight" technology, which works out motion based on the time it takes light beams to travel to an object and back to a sensor. Kinect uses a "structured light" system, which projects a grid of dots onto a scene and works out an image from the distortion, while PrimeSense uses its chips to build an image from 'coded' light reflected from objects.
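As a rough illustration only -- none of these companies has published its actual algorithms, and the timing value below is invented for the example -- the core of the "time of flight" approach is simple arithmetic: distance is inferred from how long a light pulse takes to make the round trip to an object and back.

# Illustrative sketch of the "time of flight" idea described above; real sensors
# measure phase shifts or picosecond-scale timings for every pixel, but the
# underlying relationship between travel time and distance is just this.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_round_trip(round_trip_seconds: float) -> float:
    # The pulse travels out to the object and back, so the one-way
    # distance is half the total path covered in the elapsed time.
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0

# A hand held roughly 1.5 metres from the sensor returns the pulse
# in about 10 nanoseconds.
print(f"{depth_from_round_trip(10e-9):.2f} m")  # prints 1.50 m

Repeated across every pixel of the sensor, many times a second, this yields the depth maps from which body and hand movements are tracked.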

On a different track, Norway's Elliptic Labs demonstrated a way of manipulating an iPad with close-up hand gestures, based on reflected sound waves, or ultrasonics, rather than light. ( bit.ly/fXyjUH )

Many look forward to the day when computers and other devices have to learn about you, not the other way round.

"We will move from an era from thinking about computers as things with screens and keyboards to an era where the computer becomes invisible and pervasive and it works on your behalf and there's no learning curve, or very slight," said Peter Haynes, senior director, advanced strategies and research at Microsoft. "The next generation of computing has to be computing without the learning curve."
