Blind in 2040: Specs on Deck

Dylan Fox on 2022-11-30

A day in the life of you, a blind person in 2040, assisted by a pair of high-tech augmented reality glasses called Specs

Photo by Ameer Basheer on Unsplash

As you step out into the brisk autumn day, the smells and sounds of Oakland wash over you. The mouthwatering aroma of fried food from food stalls along Lake Merritt mixes with the less mouthwatering odor of the latest algal bloom’s fish victims. The buzz of cars and squawk of seagulls blend with the time and double-time footsteps of a passerby walking their dog. Underpinning this is a low but persistent rhythm: a tap-tap-tap seemingly emanating from a distant source. In fact, the sound feeds from your Specs straight to your inner ear via bone-conduction contacts in the temples, leaving your hearing unimpeded. As you move your head, the Specs smoothly recalibrate to keep the sound’s source stable: a grocery store about a mile away. As you unfold your white cane and start tapping towards it, the rhythm is joined by a timely ping and an occasional distance estimate to let you know you’re heading in the right direction.
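For the technically curious: the world-locked beacon described here is how today’s spatial-audio engines already work. The source stays fixed in world coordinates, and the device re-renders its direction and distance against the wearer’s head pose every frame. Below is a minimal sketch of that recalibration in Python; every name and number is hypothetical, not a real Specs API.

```python
import math

def beacon_cue(listener_xy, head_yaw_deg, source_xy):
    """Return the azimuth (degrees relative to the wearer's nose) and
    distance to a world-locked audio source.

    listener_xy, source_xy: (x, y) positions in meters, world frame.
    head_yaw_deg: compass heading of the wearer's head (0 = north).
    """
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    world_bearing = math.degrees(math.atan2(dx, dy))  # bearing from north
    # Subtracting head yaw keeps the cue anchored when the head turns.
    relative_azimuth = (world_bearing - head_yaw_deg + 180) % 360 - 180
    return relative_azimuth, math.hypot(dx, dy)

# As the head sweeps right, the rendered cue sweeps left by the same
# amount, so the grocery store's tapping seems to stay put in space.
for yaw in (0, 45, 90):
    az, dist = beacon_cue((0, 0), yaw, (300, 1300))
    print(f"head yaw {yaw:3d} deg -> cue at {az:6.1f} deg, {dist:.0f} m")
```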

You’re not sure if your other senses have actually gotten better since you started losing your vision or if you’ve just started paying more attention to them. Either way, you’ve certainly come to rely on them. As a kid, you sat in the living room gazing out at the apple tree, gauging each fruit’s ripeness by its color alone. Now, you’d be hard-pressed to differentiate an apple from a pineapple across the room. At first, it was just a bit of nearsightedness; you got some glasses and moved on. But as time went on, your peripheral vision started to close in, and things got blurrier and blurrier. You stopped driving after nearly running over a pedestrian who materialized out of your blind spot. Reading got harder, as did cooking, cleaning, and a number of other tasks. You came close to despair, wondering if you’d ever feel like more than a fraction of your former self.

But you adapted. You took orientation and mobility classes, learned to identify your environment and move safely without sight. You mastered using a screen reader to navigate websites and apps. You started listening to audiobooks and podcasts, overcame your shame of asking for large print versions of things, and made peace with being blind. After all, life isn’t just for those with 20/20 vision.

Photo by CDC on Unsplash

The Specs have also made things easier. The first augmented reality headsets to make it to market back in the late 2010s were big, bulky things with paltry battery life and tiny screens. Their claim to fame was the ability to project images anywhere into your environment, but their actual understanding of that environment was pretty limited. Rather than telling you where things were, the headsets often relied on users slowly walking around to fill in gaps in their map of the environment and build a clear picture of what was nearby. The experience was something between an interpretive dance and a self-checkout at the store, and capturing moving objects was impossible. They were useful in controlled indoor environments, sure, but it wasn’t until a few breakthroughs in miniaturization, lidar and machine vision in the early 2030s that AR headsets became all-terrain and affordable to the everyman.

Now, nearly everyone owns a pair of Specs; some in addition to a smartphone, others choosing to replace their rectangles with spectacles. The set currently resting on the bridge of your nose has a fairly subtle design in charcoal gray, though of course they come in shapes and colors that would give Elton John pause. In addition to connecting to the digital world and giving you a screen anywhere at any size, the Specs understand the physical world: they’re able to map the environment, interact seamlessly with nearby devices and project holographic images almost indistinguishable from real objects. It took some time for the accessibility applications to catch up to the new capabilities, but now most people rarely leave home without them.

One of those holographic images appears in front of you now: a bright yellow arrow, working in tandem with the rhythm to guide you towards your destination. You can’t make it out very clearly, but the broad, bright shape is bold enough to follow even when blurred. Periodically it veers left or right to guide you around construction, bending sharply at turns or blinking in sync with a crossing signal. It knows your cane is helping you keep track of cracks and curbs, so it doesn’t bother with those, but it warns you of the tree branches your cane goes under. At one point it flares bright red and sounds an alarm, warning you to jump back as a car charges out of a parking lot a hair’s breadth from your descending foot!
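That division of labor, cane low and Specs high, echoes a rule of thumb from real obstacle-avoidance research: the cane already sweeps the ground plane, so the glasses only need to flag hazards that intrude above it. A toy version of the check, with all thresholds invented for illustration:

```python
# Flag obstacles the cane will miss: anything whose underside hangs
# above the cane's sweep zone but below head height, in the walking
# path. Thresholds are illustrative, not from any real product.
CANE_SWEEP_MAX_M = 0.7   # the cane reliably finds things below this
HEAD_CLEARANCE_M = 1.9   # wearer height plus a safety margin
PATH_HALF_WIDTH_M = 0.5  # how far off-center still counts as "in path"

def needs_warning(bottom_height_m, lateral_offset_m):
    """True if an obstacle is in the path and above cane reach."""
    in_path = abs(lateral_offset_m) <= PATH_HALF_WIDTH_M
    overhead = CANE_SWEEP_MAX_M < bottom_height_m < HEAD_CLEARANCE_M
    return in_path and overhead

print(needs_warning(1.5, 0.1))  # True: a branch the cane passes under
print(needs_warning(0.0, 0.0))  # False: a curb; the cane handles it
```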

Photo by Scott Umstattd on Unsplash

The Specs have also helped you get to know your neighborhood since you moved in last year. Before you got the Specs, the storefronts you passed were anonymous, little more than flat slabs with a door that might suddenly swing open into you. Some stores had their cues, like the pungent smell of a spice shop or the roar of blow dryers at a hair salon, but many were unidentifiable short of a digital search or walking into buildings one by one and asking. Now, the Specs act as a virtual tour guide, quietly mentioning points of interest in your ear as you pass. Your Specs have gotten to know you as you’ve used them; rather than repeating the same descriptions every day, they now give you only the newest updates about sales and new openings. When you get curious, you can also ask them what’s around you. You find it nice to know whether the sounds of sports are from soccer or frisbee, or that the big splash in the lake was from a capsizing paddle boat, or even that the skaters who just blazed past you are wearing helmets and Metallica jackets.

Soon enough, you arrive at the store. Seamlessly, the Specs switch from navigation to scavenger hunt mode, delving into your notes to find your grocery list. (Of course, thanks to data privacy legislation passed after the Great Phishing Phiasco of ’34, the Specs need to ask you before they can access your data or upload anything they record.) At some stores, they’d be able to download the floor plan and lead you straight to what you need, but this grocer seems to be a little more old-fashioned. Here, the Specs rely on image recognition and language processing to help you search. They read the aisle descriptions as you pass, then the types of foods, and finally the specific labels of the foods you pick up. A quick scan of the barcode is enough to tell you whether a food contains dairy or any of your other allergens. As you fill your basket, they also alert you to other shoppers, differentiating with vivid, discernible colors those heading towards you from those idling in front of the tomatoes or speeding away towards the deli. Remembering the bruises you’ve gotten from inattentive people blithely shoving heavy carts around, you’re grateful for the heads-up.
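The barcode step is the most mechanical part of that scavenger hunt: a code resolves to an ingredient list, which is intersected with the wearer’s allergen profile. A toy sketch along those lines; the product database and every name here are invented for illustration:

```python
# Toy allergen screen: barcode -> ingredients -> match against the
# wearer's profile. The database and all names are hypothetical.
PRODUCTS = {
    "0123456789012": {"name": "Brioche buns",
                      "ingredients": {"wheat flour", "milk", "eggs", "butter"}},
    "0987654321098": {"name": "Beef patties",
                      "ingredients": {"beef", "salt", "pepper"}},
}
USER_ALLERGENS = {"milk", "butter"}  # a dairy-allergy profile

def scan(barcode):
    product = PRODUCTS.get(barcode)
    if product is None:
        return "Unknown product; try another scan or ask a clerk."
    hits = product["ingredients"] & USER_ALLERGENS
    if hits:
        return f"{product['name']}: contains {', '.join(sorted(hits))}; skip it."
    return f"{product['name']}: no flagged allergens."

print(scan("0123456789012"))  # Brioche buns: contains butter, milk; skip it.
print(scan("0987654321098"))  # Beef patties: no flagged allergens.
```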

Photo by Eduardo Soares on Unsplash

Your basket is full: time to get in line. You used to hate this part. Sighted people take standing in line for granted: you go to the end, you wait for the next person to move up, then you take a step to stay behind them. How hard could it be? But more than once you’ve found yourself standing behind a shopper instead of a queuer, accidentally whacking someone in the ankles with your cane as you tried to stay at the end of the line, or waiting very patiently only to find that the register you were waiting at had closed. Thankfully, the Specs can help here too. They examine the people waiting, find the shortest line and guide you to the end of it the same way they guided you to the store. Then, a short tone indicates when you should step forward, and soon enough, the cashier is greeting you. You’re not sure they’ve even realized you’re blind.
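Under the hood, line-finding is a small counting problem. Assuming an upstream vision model has already tagged each detected person with the register they appear to be queuing for (None for browsers), picking a line takes only a few lines of code; everything below is hypothetical:

```python
from collections import Counter

def pick_register(detections, open_registers):
    """detections: (person_id, register_or_None) pairs from the
    vision model; open_registers: registers currently staffed."""
    # Count only people queuing at open registers; browsers are
    # ignored, and so are lines at registers that have closed.
    counts = Counter(reg for _, reg in detections if reg in open_registers)
    # Shortest open line wins; ties go to the first register listed.
    return min(open_registers, key=lambda r: counts.get(r, 0))

detections = [(1, "reg2"), (2, "reg2"), (3, None), (4, "reg4"), (5, "reg3")]
print(pick_register(detections, ["reg2", "reg3", "reg4"]))  # reg3
```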

Soon you’re home and hungry. Since this burger won’t cook itself, you unpack the groceries and start chopping onions. You long ago mastered the recipe, so you don’t bother having the Specs recite it to you; instead, they look into the infrared to tell you when the stove is the right temperature to start cooking. As you ready the buns and condiments, they watch the burgers, chiming at a perfect medium rare.

As you put the Specs into their charging cradle and settle down to eat, you pause a moment between bites to reflect on how they have changed your life. They’re no miracle cure, that’s for sure: not the Star Trek VISOR you had once hoped they would be. But being able to borrow a pair of machine eyes, rather than ask a person or train a dog, has made you more willing to put yourself out there. You don’t have to be quite as cautious as you used to be, and it’s nice to let your mind wander on your walks instead of being constantly vigilant. They’ve given you a reason to explore, knowing that what was once hidden in photons can now be yours as well.

Photo by Javier Allegue Barros on Unsplash

UPDATE: You can now watch a presentation about the research this story was based on: Augmented Reality Obstacle Avoidance.

Author’s note: I would like to thank Emily Cooper, Shiri Azenkot, Yuhang Zhao, and the other scientists whose research into low-vision applications of augmented reality helped inspire this story. I would also like to thank the many blind people I interviewed in conducting my own research on low-vision AR, especially Jesse Anderson (@IllegallySighted) who reviewed this story, as well as those who have contributed their time and effort to improving accessibility technology. If you are blind or low-vision and have any feedback on your representation in this story, please don’t hesitate to reach out.

This piece is part of Immerse’s 2023 issue centering disability innovation in documentary and emerging tech — presenting perspectives from artists, activists, scholars and technologists at the vanguard of storytelling and disability justice. You can find other featured stories and more information about the issue here.

