By ItsNotMagicItsScience
Image Source: http://www.sxc.hu/photo/1162951
Search company Google took the world by surprise in April when it teased Project Glass – a heads-up display (HUD) unit worn like a pair of glasses that would literally change your view of the world.
The concept video shows the glasses doing most things a smartphone can – taking pictures, checking the weather, getting directions, and holding video-call conversations. The big difference is that the information is displayed just a few centimeters from your eyes, rather than on a phone screen.
The augmented reality (AR) glasses project is said to be pursued by Google X Labs, which, according to the New York Times, is “a secret laboratory where engineers and scientists are also working on robots and space elevators.”
The problems of realising Google Glasses
As much as science-fiction can inspire new inventions, there are some things that just can't be done: time-travel, instant food replicators, and hoverboards, for instance, are still firmly stuck in the realm of fiction.
While realising something like Project Glass isn't as impossible as time-travel or faster-than-light travel – at least with our current understanding of physics – there are real limitations that mean that when Google Glasses becomes a product, it's unlikely to work as well as the concept video suggests.
Among the major issues that make Project Glass unrealistic is the way images are displayed over our eyes. “The small screen seen in the photos cannot give the experience the video is showing,” said Pranav Mistry, an MIT Media Lab researcher, in an interview with Wired.
“Current HUDs utilize a fixed lens distance of two feet,” he said. “For true augmented reality, the display would have to dynamically focus, which would require additional hardware on the glasses to read your eye.” Mistry added that even if Google was able to overcome technical barriers, you shouldn’t expect to see real AR glasses on the market for at least two years.
He knows a thing or two about the subject, having worked on the SixthSense project, a “wearable gestural interface” that lets people interact with digital information using hand gestures alone.
Another person putting a damper on the project is Blair MacIntyre, director of the Augmented Environments Lab at Georgia Tech, who criticised the difficulty of squeezing all that information into such a small screen: “The small field of view, and placement off to the side, would result in an experience where the content is rarely on the display and hard to discover and interact with. But it’s a fine size and structure for a small head-up display.”
Adjusting to varying levels of brightness is another point MacIntyre brought up. Suppose you were wearing a pair of Google Glasses and moved from the shady indoors to the bright outdoors – the contrast of the lenses would need to adjust rapidly and precisely so you could see both your real surroundings and the digital display. Viewing a smartphone in bright sunlight is already a challenge – let alone a pair of glasses you have to see through.
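To give a feel for the problem, the adaptation MacIntyre describes can be sketched as a mapping from an ambient-light sensor reading to a display brightness level. This is purely illustrative – the function name, sensor range, and thresholds below are assumptions, not anything Google has published:

```python
import math

def display_brightness(ambient_lux: float) -> float:
    """Map an ambient-light sensor reading (in lux) to a display
    brightness level in [0.1, 1.0].

    Hypothetical mapping: indoor shade (~100 lux) keeps the display
    dim so it doesn't wash out the real scene; direct sunlight
    (~100,000 lux) pushes it to full brightness so the overlay
    stays visible.
    """
    LOW, HIGH = 10.0, 100_000.0  # assumed usable sensor range, lux
    lux = min(max(ambient_lux, LOW), HIGH)
    # Perceived brightness is roughly logarithmic in luminance,
    # so interpolate on a log scale rather than linearly.
    t = (math.log10(lux) - math.log10(LOW)) / (math.log10(HIGH) - math.log10(LOW))
    return 0.1 + 0.9 * t
```

The hard part in real glasses isn't the mapping itself but doing it fast and smoothly enough that the wearer never notices the transition.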
AR technology concept broker/analyst Marianne Lindsell listed more reasons why the concept video is unrealistic. “The least realistic parts of the Google Glass video clip in my opinion are the field of view (a large FOV is needed – but can such a small device provide it?), the brightness (possibly – but there are some good techs out there), instant responsiveness, and to some extent the excellent registration (which many AR concepts depend on, but Google have cleverly side-stepped in the clip by largely avoiding such concepts),” she wrote.
But we're close... aren't we?
Despite the depressing evidence against it, there is plenty of hope that we could see people walking around with those cool-looking glasses in our lifetime. Just days after the Project Glass announcement, eyewear manufacturer Oakley said it was working on something similar in a collaborative project with Eye Safety Systems, which develops eyewear for military and government agencies.
“Obviously, you can think of many applications in the competitive field of sports,” Oakley CEO Colin Baden said. “That’s the halo point of where we would begin, but certainly you can transcend that into a variety of other applications.”
Baden said that early versions of it wouldn't be cheap and would be aimed at a specific market, but would have many features one would expect of a portable HUD, including voice command control and the ability to connect to a smartphone. “There’s a lot of interesting optical issues that come up when you’re trying to create a positive experience when interacting with these devices,” Baden said. “So the technology barrier to success is significant.”
Then there is Will Powell, an AR developer based in Oxford, who, within six days of seeing the video, built his own working version of Project Glass by piecing together currently available technology. He used a pair of Vuzix glasses, a pair of HD webcams to record the video, and a headset microphone connected to Dragon NaturallySpeaking voice-recognition software.
While the video was produced without adding in after-effects (that would be cheating!), the AR elements are added to the glasses' stereoscopic video stream – in other words, it's not what the wearer sees, but rather what the camera sees. Powell also doesn't mention what hardware the glasses ran on, so for all we know they could have been connected to a desktop PC.
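Compositing AR elements onto a camera stream, as Powell's demo does, comes down to per-pixel alpha blending of an overlay onto each video frame. A minimal sketch of that idea, using plain Python lists for a frame of RGB pixels (a real pipeline would do this on the GPU or with a library like OpenCV, and Powell's actual code is not public):

```python
def blend_overlay(frame, overlay, alpha_mask):
    """Alpha-blend an AR overlay onto a camera frame.

    frame, overlay: H x W lists of (r, g, b) tuples.
    alpha_mask:     H x W list of floats in [0, 1]; 0 keeps the
                    camera pixel, 1 replaces it with the overlay.
    Returns a new blended frame of the same shape.
    """
    out = []
    for frow, orow, arow in zip(frame, overlay, alpha_mask):
        out.append([
            tuple(round((1 - a) * f + a * o) for f, o in zip(fpix, opix))
            for fpix, opix, a in zip(frow, orow, arow)
        ])
    return out

# One-pixel "frame": camera sees mid grey, overlay draws pure green
frame   = [[(128, 128, 128)]]
overlay = [[(0, 255, 0)]]
mask    = [[0.5]]  # overlay drawn at 50% opacity
print(blend_overlay(frame, overlay, mask))  # [[(64, 192, 64)]]
```

This also makes the article's distinction concrete: the blend happens on the camera's frames, not on the light reaching the wearer's eyes – which is exactly why Powell's result is "what the camera sees".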
Still, efforts like Powell's bring us one step closer to a sci-fi future that was once only a figment of our imagination. And therein lies the awesomeness of pushing the boundaries of science – the chance to make dreams come true.