Skinput turns your arm into a touchscreen!

March 1, 2010

BANGALORE, INDIA: Are you fed up with the tiny keypad on your cellphone or music player? Then this news is meant for you.

An international team has come up with just such a system, called ‘Skinput’, which turns the skin of your arm and hand into an input surface, according to the ‘New Scientist’.

The Skinput system, developed by Chris Harrison of Carnegie Mellon University, together with Desney Tan and Dan Morris, both of Microsoft Research, is based on two technologies: the ability to detect the ultralow-frequency sound produced by tapping the skin with a finger, and the microchip-sized "pico" projectors now found in some cellphones.

As the researchers explain, the motivation for Skinput comes from the increasingly small interactive spaces on today’s pocket-sized mobile devices.

Skinput beams a keyboard or menu onto the user’s forearm and hand from a projector housed in an armband. An acoustic detector, also in the armband, then calculates which part of the display you want to activate.
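To illustrate the idea behind that last step, here is a minimal, hypothetical sketch of how an acoustic detector might decide which on-skin "button" was tapped: compare the tap's acoustic feature vector against averages recorded during a calibration phase and pick the closest match. The location names, feature values, and nearest-centroid approach below are all illustrative assumptions, not a description of Skinput's actual classifier.

```python
import math

# Illustrative calibration data: an average acoustic feature vector recorded
# while the user repeatedly tapped each on-skin location. The three numbers
# per location stand in for whatever features a real system would extract.
TRAINED_CENTROIDS = {
    "forearm_upper": [0.82, 0.10, 0.35],
    "forearm_lower": [0.40, 0.55, 0.20],
    "palm":          [0.15, 0.30, 0.90],
}

def classify_tap(features):
    """Return the calibrated location whose centroid is nearest (Euclidean)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAINED_CENTROIDS, key=lambda loc: dist(features, TRAINED_CENTROIDS[loc]))

# A tap whose features fall close to the "palm" calibration centroid:
print(classify_tap([0.12, 0.28, 0.88]))  # -> palm
```

The projected display and the detector share the same layout, so resolving a tap to a location is enough to know which projected key or menu item the user meant.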

“Devices with significant computational power and capabilities can now be easily carried on our bodies. However, their small size typically leads to limited interaction space (e.g., diminutive screens, buttons, and jog wheels) and consequently diminishes their usability and functionality,” said Harrison on his website. “Since we cannot simply make buttons and screens larger without losing the primary benefit of small size, we consider alternative approaches that enhance interactions with small mobile systems.”

He said appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso).

“Furthermore, proprioception (our sense of how our body is configured in three-dimensional space) allows us to accurately interact with our bodies in an eyes-free manner. For example, we can readily flick each of our fingers, touch the tip of our nose, and clap our hands together without visual assistance. Few external input devices can claim this accurate, eyes-free input characteristic and provide such a large interaction area,” the researcher said.

The researchers plan to present their revolutionary work at the Computer-Human Interaction meeting in Atlanta, Georgia on April 12, 2010.
