Stanford is redefining AR with glasses that combine metasurface waveguides and AI-driven holography, blending digital imagery seamlessly into your view of the real world. Here’s what it all means.
One of the biggest criticisms of AR and VR, and especially Apple’s vision of what it calls “spatial computing,” is the bulk of the eyewear. There’s no doubt we’ve reached the point where some XR devices and experiences are amazing, but there’s a fairly high wall of annoyance to climb to make use of them.
Devices are heavy, ugly, and uncomfortable, and while the four-year-old Quest 2 is available for $200, prices go up and up, with the $3500 Apple Vision Pro causing wallets to implode.
Although we’ve long seen the promise of VR, and we all expect the technology to get better, we’ve mostly had to trust the historical pace of technological advancement as assurance of a more practical future. But now, we’re starting to see real science that shows how it all might actually be possible.
A team of researchers at Stanford University, led by engineering associate professor Gordon Wetzstein, has built a prototype of lightweight glasses that can show digital images in front of your eyes, blending them seamlessly with the real world. His team, which specializes in computational imaging and display technologies, has been working on integrating digital information into our visual perception of the real world.
“Our headset appears to the outside world just like an everyday pair of glasses, but what the wearer sees through the lenses is an enriched world overlaid with vibrant, full-color 3D computed imagery,” says Wetzstein. “Holographic displays have long been considered the ultimate 3D technique, but it’s never quite achieved that big commercial breakthrough… Maybe now they have the killer app they’ve been waiting for all these years.”
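To get a feel for what “computed imagery” involves: a holographic display steers light by controlling the phase of a wavefront, so the display driver has to compute a phase pattern whose diffracted light forms the desired image. The sketch below shows the classic Gerchberg-Saxton phase-retrieval loop, a textbook starting point for computer-generated holography. It is purely illustrative, not the Stanford method (their “AI-driven holography” uses learned wave-propagation models instead of this simple iteration), and the function name and parameters here are my own.

```python
import numpy as np

def gerchberg_saxton(target_intensity, iterations=100):
    """Illustrative Gerchberg-Saxton loop: estimate a phase-only hologram
    whose far-field diffraction pattern approximates target_intensity.
    (Textbook method, not Stanford's learned model.)"""
    target_amp = np.sqrt(target_intensity)
    # Start from a random phase guess on the hologram plane.
    rng = np.random.default_rng(0)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        # Propagate hologram -> image plane (Fraunhofer regime ~ a 2D FFT).
        image_field = np.fft.fft2(np.exp(1j * phase))
        # Keep the propagated phase, but enforce the target amplitude.
        image_field = target_amp * np.exp(1j * np.angle(image_field))
        # Propagate back, then enforce the phase-only constraint.
        phase = np.angle(np.fft.ifft2(image_field))
    return phase

# Toy target: a bright square on a dark background.
target = np.zeros((256, 256))
target[96:160, 96:160] = 1.0
holo_phase = gerchberg_saxton(target)
# The reconstruction approaches the target as iterations increase.
reconstruction = np.abs(np.fft.fft2(np.exp(1j * holo_phase))) ** 2
```

Real optics, with per-color channels and imperfect components, are far messier than this idealized math, which is presumably where the AI comes in: learned models can correct for the physical hardware in ways hand-tuned propagation formulas cannot.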
Content courtesy: ZDNET