by Steven Kotler
January 10, 2019
from NEO Website

Various prototypes of the Buzz on display at NeoSensory. Photo by Eric Ruby

Neuroscientist David Eagleman aims to give deaf people a new way to hear, and upgrade everyone else's senses too.




I'm listening to David Eagleman talk about the future, but I'm trying not to. I'm trying to feel his words.

I don't mean this in an empathetic kind of way. I mean that his words are being played on my wrist, in real time, like a sonically induced haptic braille.

That's because I'm wearing Buzz, a device that Eagleman's startup company, NeoSensory, plans to release in a couple of months.

Housed inside a wristband slightly bigger than a Fitbit, the Buzz has a microphone that picks up sound and a computer chip that breaks it into eight frequency ranges.

Each frequency range links to a built-in micromotor. When sound from a specific range activates the corresponding motor, it buzzes slightly. It's more than a tingle but less than a bee sting.

For example, when Eagleman says the word "touch," his baritone drone of the "uh" sound in the middle of the word buzzes the left side of my wrist, and then the higher-pitch "ch" that ends the word buzzes the right.
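
For the technically curious, here is a minimal Python sketch of that sound-to-touch mapping. The sample rate, the log-spaced band edges, and the normalization are my own assumptions for illustration; the article specifies only eight frequency ranges, one micromotor per range, and (as Eagleman notes later) a 1/16-second update cycle.

```python
import numpy as np

SAMPLE_RATE = 16_000   # assumed microphone sample rate; the article doesn't give one
NUM_MOTORS = 8         # the Buzz's eight micromotors
FRAME_SEC = 1 / 16     # the motors update every 1/16 of a second

def frame_to_motor_levels(frame: np.ndarray) -> np.ndarray:
    """Map one audio frame to eight motor intensities in [0, 1]."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1 / SAMPLE_RATE)
    # Eight log-spaced bands covering roughly the range of speech; how the
    # real device chooses its band edges isn't described in the article.
    edges = np.geomspace(50, 8_000, NUM_MOTORS + 1)
    levels = np.zeros(NUM_MOTORS)
    for i in range(NUM_MOTORS):
        band = spectrum[(freqs >= edges[i]) & (freqs < edges[i + 1])]
        if band.size:
            levels[i] = band.mean()
    peak = levels.max()
    return levels / peak if peak > 0 else levels

# One 1/16-second frame of a 300 Hz hum mostly drives a low-frequency motor.
t = np.arange(int(SAMPLE_RATE * FRAME_SEC)) / SAMPLE_RATE
print(frame_to_motor_levels(np.sin(2 * np.pi * 300 * t)).round(2))
```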

Over Eagleman's shoulder is a clothing rack with about 50 VESTs hanging off it.

Each VEST  -  short for Versatile Extra-Sensory Transducer  -  looks like a thin wetsuit vest with shiny black circles embedded in the fabric at regular intervals.

These work like the Buzz wristband, breaking sound into frequencies that are replayed on the skin. But instead of eight motors, there are 32, evenly spaced across the chest and back.

Because the VEST and Buzz turn sound into vibrations, you might suspect that they are new tools for the deaf. And you'd be right, but only to a point. That's because these devices can do much more. While they are capable of turning sound into touch, they actually can do the same with almost any data stream.

Eagleman has versions that work with images, the major difference being that the microphone capturing sound is replaced by a camera capturing video.

He's also built versions that can detect information that typically eludes human senses.

There are varieties that can see in infrared and ultraviolet, two parts of the spectrum that are invisible to the human eye. Others can take live Twitter feeds or real-time stock market data and translate them into haptic sensations.
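
To make that idea concrete, here is a toy Python encoder for such a stream, entirely my own construction rather than NeoSensory's actual compression scheme: it min-max normalizes a sliding window of any scalar feed, say price ticks, and averages it into eight motor intensities.

```python
import numpy as np
from collections import deque

NUM_MOTORS = 8  # the same eight-motor frame the Buzz uses for sound

def make_stream_encoder(window: int = 64):
    """Build an encoder mapping the recent history of a scalar stream
    (price ticks, tweets per minute) to eight motor intensities.
    A toy min-max scheme, purely illustrative."""
    history = deque(maxlen=window)

    def encode(sample: float) -> np.ndarray:
        history.append(sample)
        vals = np.asarray(history, dtype=float)
        lo, hi = vals.min(), vals.max()
        if hi == lo:                    # flat stream: nothing to feel
            return np.zeros(NUM_MOTORS)
        norm = (vals - lo) / (hi - lo)  # scale the window to [0, 1]
        # Average into eight chunks: oldest data drives one end of the
        # wrist, newest data the other.
        chunks = np.array_split(norm, NUM_MOTORS)
        return np.array([c.mean() if c.size else 0.0 for c in chunks])

    return encode

encode = make_stream_encoder()
for price in [101.0, 101.2, 100.8, 102.5, 102.4]:  # made-up ticks
    levels = encode(price)
print(levels.round(2))
```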


Researching synesthesia convinced Eagleman that anyone can develop new ways of perceiving the world. Photo by Eric Ruby


Here's the part that's really nutty:

With a few days of practice, anyone can learn to interpret these buzzes, resulting in either a prosthetic sense that replaces a missing one or a superpower that gives you an entirely new way to detect the world.


New peripherals

Eagleman, who has written several books about the brain, has devoted his career to understanding how people's perceptions of the world influence their experience and behavior.

For many years, he was a professor at Baylor College of Medicine, where he ran the only time-perception laboratory in the world. These days, he's an adjunct professor at Stanford and an entrepreneur, with his new startup, NeoSensory, located just a few blocks from campus.

His inspiration for NeoSensory grew out of a longtime interest in synesthesia, a neurological condition in which stimulating one sense involuntarily triggers another:

People with this condition smell colors in addition to seeing them, for example.

"Our perceptions shape our reality," says Eagleman.

"What interested me about synesthesia is it's not a disease or disorder. The people who have this condition simply live in a slightly different reality than most of us.

Synesthesia is an alternative form of consciousness, a different way of perceiving the world."

Scientists use the German word umwelt  -  pronounced "oom-velt"  -  to describe the world as perceived by the senses.

And not every animal's umwelt is the same.

 "We have this folk psychology idea that we perceive the world as it is," explains Eagleman.

 

"But that's not true at all. Our senses limit our view of reality. Ticks are blind and deaf. They live in a world completely shaped by temperature and odor.

That's their umwelt, their reality. Bats live in a different world  -  they have a different umwelt, one shaped by bouncing sound waves."


An early prototype for the VEST, sewn at NeoSensory. Photo by Eric Ruby



It's for this reason that the philosopher Thomas Nagel argued that it's impossible to know the mind of a bat.

But Eagleman wasn't so sure.

Synesthesia made him wonder if our senses were more pliable than assumed. Plus, there had been 50 years of research showing the brain is capable of sensory substitution  -  taking in information via one sense but experiencing it with another.

"Back in 1969," says Eagleman, "a neuroscientist named Paul Bach-y-Rita built a sensory substitution device for the blind. Video signals were captured by a camera, digitized and fed into a modified dentist chair.

The chair had an array of push pins that punched the image into the small of the person's back.

If Bach-y-Rita held a cup in front of the camera, the person would feel its shape in their back. And patients got pretty good at this. After a bit of practice, they could identify the objects being shown to them."

"If you feed the brain a pattern, eventually it'll figure out how to decode the information."

Although the interface is different, this is roughly similar to what cochlear implants do. They take an auditory signal, digitize it, and feed it into the brain.

When cochlear implants were invented, not everyone was certain they would work.

"The ear is built to turn sound waves into electrochemical signals, but cochlear implants speak a slightly different dialect," Eagleman says.

We now know that cochlear implants work just fine.

So do retinal implants, even though they don't give signals to the brain in the same way as biological retinas do.

Each of these Buzz wristbands picks up different kinds of signals. Photo by Eric Ruby



All of this led Eagleman to conclude that the brain is designed to handle all sorts of data streams from all sorts of input devices. In computing terms, it's built for multiple peripherals.

The eyes are photon detectors, the ears gather sound waves, but it's the brain that takes these signals and turns them into information. Normally, evolution takes millions of years to fine-tune these peripherals.

It took eons to shape the bat's echolocation system or the octopus's ability to taste through its tentacles.

But Eagleman suspected all that time wasn't required.

"The brain is a black box," he says.

"It's completely cut off from the world. There are all these cables coming in from the different senses, but the brain doesn't directly experience sight, sound, or touch.

All it ever gets is patterns played out in electrochemical signals.

The brain is built to turn patterns into meaning. What the cochlear implant and Bach-y-Rita's dental chair taught us was that it doesn't seem to matter what the input device is, the peripheral.

If you feed the brain a pattern, eventually it'll figure out how to decode the information."

As Eagleman explains all of this, Buzz is turning his words into touch and feeding that pattern through my skin and into my brain.

It's also gathering noise from the room. When one of NeoSensory's engineers slams a door, I can feel that on my wrist. Not that I'm consciously able to tell these signals apart.

The device's motors operate every 1/16 of a second, which is faster than I can consciously process. Instead, Eagleman says, most of the learning  -  meaning my brain's ability to decode these signals  -  takes place at a subconscious level.

Not everyone agrees that this is what's happening.

Some researchers claim that the tactile sensitivity of the skin isn't fine enough to tell signals apart. Others argue that this kind of subconscious learning isn't even possible.

But Eagleman says that in a not-yet-published experiment, deaf people learned how to "hear" with a VEST.

Though, as Eagleman also explains, "hear" is a relative term. In this case, it means that people could recognize and identify the feel of 50 different words, he says.

On average, it took four training sessions, each roughly two hours long.

The Buzz, meanwhile, has worked even faster  -  though with far lower resolution. In tests, Eagleman's team played a sound, and subjects wearing the Buzz had to decide whether it was a dog barking, a car passing or a door slamming.

Eagleman says 80 percent of them could do this almost right away. The team also showed deaf people a video of someone talking and played one of two different soundtracks. One matched the video and one didn't.

Eagleman says 95 percent of the subjects could quickly identify the correct soundtrack based on the signals from Buzz.

As far as Eagleman can tell, with the right kind of data compression in place, there are no real limits to what the device can detect.

Eagleman says this should massively improve the lip-reading abilities of deaf people and possibly give them the ability to decode raw speech without the visual input of lip reading.

That latter prospect would be a considerable breakthrough, says Harvard neuroscientist Amir Amedi.

"People have tried," he says, "but so far no one has been able to get full speech capabilities from haptic sensation."

Even if the full ability to decode speech is not possible  -  Eagleman doesn't know, as he has yet to run long-term tests on his devices  -  Buzz's ability to provide "low-resolution hearing" will be extremely valuable, he says.

"If you're blind and I get you to 80/20 vision, sure, it's not 20/20, but it's a lot better than what you had before," Eagleman says.

"That's what I think we're doing for hearing."

Compare that to another technological option for people with severe hearing loss, a cochlear implant. It requires an invasive surgery and six weeks of recovery time, and it costs tens of thousands of dollars.

It also can take about a year for people to actually learn to hear properly with cochlear implants.


The VEST can light up as well as vibrate in response to auditory stimuli. Photo by Eric Ruby



Buzz is expected to go on sale at the end of this year, with VEST following in 2020.

The wristband might cost around $600 and VEST around $1,000, but Eagleman says the prices aren't certain yet.

Eagleman believes restoring hearing to the deaf will merely be a first step. His real question isn't about sensory substitution. It's about sensory addition.

Can we use technology to expand our umwelts?


What is your dog saying?

Eagleman tells me a story about taking a walk through Santa Barbara with author and entrepreneur Rob Reid, who interviewed Eagleman on his "After On" podcast.

They were both wearing a version of the wristband that detects parts of the visual spectrum that humans can't see, including the infrared and ultraviolet frequencies.

As they strolled, neither seeing anything unusual, they both started getting really strong signals on their wrists.

They were being watched  -  but by what? Their wristbands allowed them to answer that question. They could track the signals back to their point of origin  -  an infrared camera attached to someone's home.

This may have been one of the very first times a human accidentally detected an infrared signal without the help of night vision goggles. In fact, it may be one of the first times humans accidentally expanded their umwelts.

It won't be the last. Eagleman sees dozens of uses for his devices.

"I can imagine building ones with molecular odor detectors so people can smell (what a dog smells), or a surgeon getting bio-data and not having to look up and check the monitors, or a drone pilot being able to feel pitch and yaw and learning to fly in the dark."

As far as Eagleman can tell, with the right kind of data compression in place, there are no real limits to what the device can detect.

Never before has the nature of reality been so pliable.


"I find it very intriguing," says UCSF neuroscientist Adam Gazzaley, who is familiar with the work, "in much the same way that I find VR [intriguing].

The potential to create truly unique experiences that can expand our perspectives, heal, and even enhance us as humans: that's the incredible potential of these new technologies."

To explore this potential further, Eagleman has also teamed up with Philip Rosedale, the creator of the virtual world 'Second Life'.

Rosedale's next iteration, known as 'High Fidelity', is designed for virtual reality.

Eagleman has a long-sleeve version of the VEST  -  known as the "exo-skin"  -  that's designed to work with it.

"If you're touched by another avatar," he explains, "you can feel their touch. Or it's raining in the VR world, you can feel the raindrops."

These devices are being released with an open API  -  meaning anyone can run their own experiments.
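
As a flavor of what such an experiment might look like, the short Python sketch below sweeps a single pulse across the wrist. BuzzClient and send_frame are hypothetical stand-ins invented for illustration; the article doesn't describe the real API's names or transport.

```python
import time

class BuzzClient:
    """Hypothetical stand-in for an open-API client. It prints motor
    frames instead of driving real hardware."""

    def send_frame(self, levels):
        # levels: eight intensities in [0, 1], one per micromotor
        print("motors:", ["%.2f" % v for v in levels])

buzz = BuzzClient()
# Sweep a pulse across the wrist, one motor per 1/16-second frame.
for i in range(8):
    buzz.send_frame([1.0 if j == i else 0.0 for j in range(8)])
    time.sleep(1 / 16)
```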

For example, my wife and I run a hospice care dog sanctuary and share our house with around 25 animals at a time. While the house is much quieter than many would suspect, there are definitely times when all our dogs start barking at once.

What the hell are they talking about?

This is more than an idle question. Recent research into decoding animal speech  -  mainly with dolphins and prairie dogs  -  has found that animals have far more sophisticated syntax and vocabulary than previously suspected.

Eagleman suspects that if I wore one of his devices around my pack of dogs for a while, sooner or later, I might be able to detect things that set them off.

For most of human existence, how our senses perceive reality and how our brain uses those perceptions to shape our world have been mysteries. We now may be at a turning point. We are entering an era where technology is not only replacing senses we've lost but giving us ones we've never had before.

Will we ever know what it's like to be a bat? Maybe, maybe not.

But never before has the nature of reality been so pliable and our ability to experiment with alternate umwelts been so powerful.