NEW YORK, USA: Project Glass, the latest sci-fi concept to come out of Google's X Lab, has gotten a lot of attention online thanks to a clever demo video that shows a user donning a pair of augmented-reality eyeglasses which project a heads-up display of video chats, location check-ins, and appointment reminders.
But the video glosses over a basic question: how would the human visual system actually handle a display like this? I asked Mark Changizi, an evolutionary neurobiologist and author of The Vision Revolution, to weigh in.
"The graphics are not going to look like they're floating out in front of you, because it's only being displayed to one eye," Changizi explains. Instead, the experience would be similar to "seeing through" the image of your own nose, which hovers semi-transparently in the periphery of your visual field at all times (even though you rarely pay attention to it). "Having non-corresponding images coming from each eye is actually something we are very much used to already," Changizi says. "It's not uncomfortable." So Google's one-eyed screen design seems biologically savvy.
Then again, Changizi continues, "they're presenting text to you, and in order to discern that kind of detail, you need to have it in front of your fovea," the tiny, central part of the visual field. "That's typically not where we're used to 'seeing through' parts of our own bodies, like our noses." That means those crisp, instant-message-like alerts won't be as simple to render as the video makes it seem.
The more natural place to put such imagery, Changizi suggests, is in the periphery of the visual field, where the eye registers pattern rather than fine detail.
"There could be very broad geometrical or textural patterns that you could perceive vividly without having to literally 'look at' them," he says. This would also make the digital overlays "feel like part of your own body," rather than "pasted on" over the real world in an artificial or disorienting way.
But if Google really does plan to bring this product to market before the end of 2012, as it has claimed, it is exactly these psychological and perceptual details that will need close scrutiny.
(Source: technologyreview.com)