How real is this thing, or is it just a fancy video mockup? There are some great answers here; I think it's the best coverage of this topic yet.
So Pavlus asked Mark Changizi, a neurobiologist and the author of The Vision Revolution, to discuss Google's concept video of its AR glasses, which are equipped with visual search, voice recognition, and speech-to-text. They recorded the commentary and laid it over the Project Glass concept video Google released yesterday. A reminder: this is a demo video, and clearly just that.
Project Glass, the latest sci-fi concept to come out of Google's X Lab, has gotten a lot of attention online in the past 24 hours thanks to a clever demo video that shows a user donning a pair of augmented-reality eyeglasses that project a heads-up display of video chats, location check-ins, and appointment reminders.
Reactions to the product design have ranged from skeptical to enthusiastic, but I was curious about the psychological and visual-cognitive aspects of the user experience. What would these "digital overlays" actually look and feel like? Would they really be as sharp and legible as the ones shown in the video? (I don't know about you, but I can't focus sharply on anything less than an inch away from my eyeball, which is where the eyeglasses' tiny screen would be dangling.) Would they obstruct my vision and make me motion-sick? How would my brain make perceptual and physical sense of the graphics: where would I "look," exactly, in order to "watch" the tiny picture-in-picture video chat shown at the conclusion of the clip?
The enterprising reporter had Mark Changizi, an evolutionary neurobiologist, talk through the realities and the likelihood of what Google shows in its concept video. The results will blow you away … MORE HERE
Bottom line: The reality will likely look a lot different from what the concept video shows. This we knew. This extraordinary examination of the Project Glass concept video, with the audio overlay from Changizi, shows why; check it out here.