One day, computers may decode your dreams

Research is probing how technology can help people by ‘reading’ their minds

[Image: a man daydreaming]
Computers aren’t quite there yet, but they are homing in on the ability to decode the brain signaling that underlies unspoken thoughts. Credit: michaeljung/iStockphoto

Jan Scheuermann reclines in a wheelchair. Her hands lie motionless, crossed in her lap. She can speak and move her head. But because she’s paralyzed from the neck down, she normally cannot control the rest of her body. Yet one winter day in 2012, she lifted a chocolate bar to her mouth and took a bite. How? She used her mind to move a robotic arm! And, she now recalls, “It was the best chocolate ever.”

Every brain sparks with electrical activity. Tiny nerve cells inside it, called neurons, fire off little zaps of current. It’s how they send messages to one another.

Groups of neurons firing in specific patterns control the body. They also form thoughts and feelings. Researchers are working to capture and translate the messages encoded in these patterns of electrical energy. But the brain doesn’t like to give up its secrets. A hard skull covered with skin and hair protects the soft brain. It also muffles those electrical signals.

[Image: Jan Scheuermann smiles at a chocolate bar held by a robotic arm]
Jan Scheuermann has a rare disease that broke the connections between her brain and muscles. This left her paralyzed from the neck down. But eventually, using only thoughts, she was able to control a robotic arm to feed herself this chocolate bar. Credit: UPMC (http://www.upmc.com/media/media-kit/bci/Pages/images.aspx)

To get the clearest signals from neurons, doctors have to open up someone’s skull. Then, they can place sensors directly on top of the brain, or even inside it.

Once they get onto or into the brain, those sensors let scientists and engineers begin reading people’s minds.

The team that worked with Jan matched her brain signals to specific arm and hand movements. Other research teams are watching brain signals while test subjects listen to particular words, gaze at pictures or watch short movies. Later, these teams try to reverse the process. They scan recordings of brainwaves to figure out what someone had been hearing or seeing.

But what about words or images that exist only in the imagination? Can scientists learn to read our inner thoughts? Almost certainly they can, says Jack Gallant. He’s a neuroscientist at the University of California, Berkeley, who studies how the brain processes images.

“Someday we will be able to decode your dreams,” he predicts. “And someday,” he adds, “we will be able to decode your silent, internal speech.” The biggest obstacle, he says, is measuring brain activity precisely.

Exploring with Lewis, Clark — and Hector

A little less than a year before feeding herself that chocolate bar, Jan woke up from surgery. Doctors had embedded two tiny electrical sensors deep inside her brain. Each flat, metal rectangle was only about the size of a grain of rice. One hundred spikes jutted like a bed of nails from the surface of each. These spikes listened to the chatter of between 30 and 160 neurons.

“I named the implants Lewis and Clark,” Jan says, after the famous American explorers of the early 19th century. Hers, she recalls, “were going to lead an expedition into the brain and chart new territory.”

[Image: a tiny electrical sensor array]
When embedded in someone’s brain, this tiny sensor can track the electrical activity of up to 160 separate nerve cells. Two of these sensors allowed Jan Scheuermann to control a robotic arm called Hector. Credit: Nicho Hatsopoulos

Thin wires connected the sensors to metal caps that protruded from the top of Jan’s head. Researchers could connect the metal caps to a computer system. That let the computer read signals from her brain. The computer would then use this information to control a robotic arm.

Jan had volunteered to take part in a research project at the University of Pittsburgh in Pennsylvania. Before the surgery, she hadn’t moved anything below her neck for 12 years — not even a finger or a toe. The neurons in her brain could still produce electrical signals for movement. But due to a rare disease, the signals no longer reached her arms or legs. “Moving that robotic arm became my singular goal,” Jan remembers.

She nicknamed that arm Hector. With practice, Jan and Hector learned to pick up cubes, cones and other shapes. It was like learning to ride a bike, she says. Once Jan figured out how to control Hector, the process became automatic. She’d do what anyone does when they want to pick something up — reach for it. As she reached, the implants read her brain activity, and the computer program sent a message to Hector with instructions on which way to move.
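For readers curious about the computing side, here is a rough sketch, in the Python programming language, of what that read-and-respond loop might look like. Everything in it (the sensor reader, the decoder weights, the arm command) is a made-up stand-in for illustration, not the Pittsburgh team’s actual software.

```python
import numpy as np

# A made-up sketch of the read-decode-command loop. None of these names
# come from the real system; they only illustrate the idea.

def read_firing_rates(n_neurons=192):
    """Stand-in for sampling each neuron's recent firing rate (spikes/second)."""
    return np.random.poisson(lam=20.0, size=n_neurons).astype(float)

def decode_velocity(rates, weights):
    """Turn firing rates into a 3-D hand velocity using a linear decoder."""
    return weights @ rates  # shape (3,): velocity in x, y and z

# Decoder weights would normally be fit from training data (see the next
# section); random placeholders are used here.
weights = np.random.randn(3, 192) * 0.01

for step in range(100):           # real systems repeat this many times per second
    rates = read_firing_rates()
    velocity = decode_velocity(rates, weights)
    # send_to_arm(velocity)       # hypothetical command out to the robotic arm
```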

The math of movement

A team of neuroscientists had spent decades perfecting that mind-reading computer system. One was Andrew Schwartz at the University of Pittsburgh. He studies how the brain produces the signals that direct muscle movements. Before the experiment with Jan, he worked mostly with monkeys that had sensors implanted in their brains.

Schwartz would record brain activity as a monkey performed the same movement over and over. It might be something as simple as pressing a lever down to get a treat. Then, he would look for a pattern of electrical activity in the recordings. That pattern would reveal which neurons had fired when the monkey’s hand pressed down.

He repeated this research for many different motions. He also built a computer program to track these patterns. This computer model learned to match a pattern of firing neurons to a particular movement. The model can now interpret brain signals that control movement in both monkeys and people.
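In spirit, that model-building step is like fitting a simple equation to data. The toy Python sketch below fakes the idea with random numbers: it pairs simulated firing rates with the movements that produced them, then fits a linear decoder that can turn new firing rates back into movement estimates. The use of scikit-learn’s Ridge regression here is an illustrative choice, not necessarily what the lab used.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_samples, n_neurons = 5000, 100

# Simulated training session: each neuron "prefers" a movement direction,
# so firing rates are a noisy function of hand velocity.
true_tuning = rng.normal(size=(n_neurons, 3))
velocities = rng.normal(size=(n_samples, 3))          # hand velocity over time
rates = velocities @ true_tuning.T + rng.normal(scale=0.5,
                                                size=(n_samples, n_neurons))

# Fit the decoder: firing rates in, velocity estimates out.
decoder = Ridge(alpha=1.0).fit(rates, velocities)

# Later, new firing rates alone can be turned into movement commands.
print(decoder.predict(rates[:5]))
```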

However, the model still needs work, says Schwartz. It doesn’t always correctly predict the muscle movements from the brain signals it reads. But Jan’s brain was able to compensate for the mismatch. Her brain actually changed its firing patterns to fit the model better, he notes.

The research didn’t stop with Jan’s putting a piece of chocolate in her mouth. Two years after her brain surgery, Jan took control of two robotic prostheses at once — a right arm and a left arm. Then, she tried flying an airplane.

Well, it was really just a flight simulator, a computer program that mimicked the experience of flying. To control this virtual plane, Jan learned to imagine that she was working with Hector. The same brain activity patterns that had controlled her arm now directly controlled the simulated airplane. Again, the process quickly became automatic. She felt like she was really flying. “I was up there. I was out of my chair and out of my broken body. I still get chills when I think about it,” Jan says.

The woman loved working with Schwartz and his team. But the human body doesn’t appreciate unwelcome intruders, such as Lewis and Clark. Over time, scar tissue may wall off the sensors. This can make it harder for them to detect brain activity. Jan’s sensors were still picking up signals after two years, but the skin was pulling away around one of the caps. Microbes could have easily gotten in and caused a dangerous infection. So in 2014, doctors removed the sensors.

To really help paralyzed people, any brain implants must be able to stay in place for at least 10 years, notes Nicho Hatsopoulos. He is a neuroscientist at the University of Chicago in Illinois. He did not work with Jan or Schwartz, but he has worked on brain-computer systems with both monkeys and people. His team is now working on adding a sense of touch to a mind-controlled robotic arm.

People using an arm such as Hector can see what they’re doing, he says. “But ultimately we want them to feel it.” This means that signals would travel both ways. Signals would move from the brain to the robotic arm, then the arm would report textures and pressures back to the brain. And that’s coming. Other researchers have already developed electronic skin that can detect heat, texture and sound.

Reading thoughts

The work of Schwartz, Hatsopoulos and others might one day help restore movement and touch to paralyzed people. Mind-reading devices might also analyze patterns in brain activity to detect someone’s thoughts. Such a device could give voices to those who have lost the ability to talk. Or, it could allow people to communicate — brain to brain — in complete silence and secrecy.

First, neuroscientists must understand how the brain interprets words and sentences. Brian Pasley of the University of California, Berkeley, is working on this problem.

[Image: a diagram of sensor positions over the brain]
The red dots represent sensors placed in a grid over the top of someone’s brain. They pick up electrical signals as someone reads the same text silently and out loud. Researchers used patterns in these electrical signals to decode words. Credit: Stephanie Martin, EPFL/© 2014 Martin, Brunner, Holdgraf, Heinze, Crone, Rieger, Schalk, Knight and Pasley (CC BY)

When someone talks to you, he explains, the sound of their voice enters your ear. It then travels along nerves as electrical signals. When those signals reach a part of the brain known as the auditory cortex, the neurons there fire in very specific patterns. Those patterns depend on the sound of the voice and the words spoken. Other areas of the brain join in to retrieve the words and give them meaning.

To build a thought-reading device, neuroscientists would have to match those neuron-firing patterns to words. And such studies cannot rely on lab animals. Only humans speak and understand words. But few people would volunteer to go through brain surgery so that they might take part in such tests.

Luckily, some people with rare brain diseases already have a grid of electrical sensors implanted below their skulls, on the surfaces of their brains. Those sensors are already reading brain activity through a technology known as electrocorticography (Ee-LEK-tro-kor-tih-KOG-rah-fee).

Seven people who had these implanted sensors worked with Pasley’s team in a 2014 study. Each volunteer read aloud a passage of text. As they read, the sensor arrays recorded brain activity. The researchers used the brain patterns picked up from the out-loud reading sessions to build a computer model. It matched patterns of brain activity to the sounds of words.

Later, each person read the same text silently. The team applied that same computer model to the brain activity during these silent reading sessions. The computer translated the brain activity into sounds, making spoken “words.” They came out garbled, so they did not sound exactly like the original text. Clearly, however, this model was on the right track. It pointed to the possibility of one day using brain activity to learn unspoken thoughts. 
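Stripped to its core, the study had two phases that can be mimicked in a few lines of Python. In this sketch everything is synthetic: random numbers stand in for the sensor recordings and for the spectrogram (the pattern of sound energy over time) of the spoken words, and Ridge regression stands in for whatever model the team actually used.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_times, n_channels, n_freqs = 2000, 64, 32

# Phase 1 (reading aloud): brain activity paired with the sound it produced.
brain_aloud = rng.normal(size=(n_times, n_channels))
mixing = rng.normal(size=(n_channels, n_freqs))       # hidden brain-to-sound link
spectrogram = brain_aloud @ mixing + rng.normal(scale=0.1,
                                                size=(n_times, n_freqs))
model = Ridge(alpha=10.0).fit(brain_aloud, spectrogram)

# Phase 2 (reading silently): predict sound features from brain activity alone.
brain_silent = rng.normal(size=(200, n_channels))
predicted_sound = model.predict(brain_silent)  # would be turned back into audio
```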

Decoding dreams

If brain activity can reveal thoughts, what about imagined pictures?

Jack Gallant has taken important steps toward understanding how the mind represents images and movies. His team doesn’t use electrical sensors. Instead, they study brain activity with a scanner. The technology they use is called fMRI, which stands for functional magnetic resonance imaging. It does away with the need for brain surgery. However, the scanner doesn’t directly detect the waves of electrical signals coursing through the brain. Instead, it uses powerful magnets to track the blood flow that brings chemical food to neurons. Firing neurons suck up the most food. So mapping where blood flow is highest points to where the brain is most active.

Five years ago, Gallant and his colleagues took turns watching short movies for several hours as an fMRI machine scanned their brains. This provided lots of data linking patterns of brain activity to the types of action and imagery being viewed.

The researchers then used these data to build a computer model. Later, they got in the scanner again and watched a totally different set of movies. This time, the computer had to guess what had been happening in those movies based only on the brain activity of the viewers.
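One way to picture that guessing step: the computer predicts the brain response each candidate movie clip should produce, then checks which prediction best matches the activity it actually measured. The Python sketch below shows that matching idea with random stand-in data; it is not the Gallant lab’s code.

```python
import numpy as np

rng = np.random.default_rng(2)
n_clips, n_voxels = 500, 1000

# Predicted brain responses for each candidate clip (in the real study,
# these would come from a model fitted to hours of movie-watching data).
predicted_responses = rng.normal(size=(n_clips, n_voxels))

# Measured activity: here, a noisy copy of clip 42's predicted response.
true_clip = 42
measured = predicted_responses[true_clip] + rng.normal(scale=0.8, size=n_voxels)

# Correlate the measurement with every prediction and pick the best match.
scores = [np.corrcoef(measured, pred)[0, 1] for pred in predicted_responses]
print(int(np.argmax(scores)) == true_clip)  # True when the match succeeds
```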


[Image: five pairs of original and computer-reconstructed movie frames]
Using patterns of brain activity, a computer program attempted to reconstruct a movie that someone watched. The results are in the bottom row. The top row shows what scenes in the original movie had actually looked like. Credit: Modified from S. Nishimoto and J.L. Gallant, Current Biology 21 (2011)

The resulting movies that the computer generated came out blurry. They also lacked sound. But astonishingly, they did resemble the original films. It was the first tentative success for mind-reading of moving images.

Other neuroscientists have attempted to decode dreams. Yukiyasu Kamitani is a neuroscientist at Kyoto University in Japan. Three years ago, his team asked three volunteers to fall asleep inside an fMRI scanner. As soon as a volunteer was asleep, the researchers would wake him or her up. Then, they’d quickly ask what the sleeper had been dreaming. One person said, “I hid a key in a place between a chair and a bed and someone took it.”

The researchers collected 200 of these dream descriptions from each person. Next, they found pictures of items from the dream descriptions, such as a key, a chair and a bed. The same three volunteers then got into the fMRI scanner again and looked at these pictures while they were awake.

Now the researchers had data to match each image to specific brain patterns. Using these data, they built a computer model to recognize these patterns. Then, they ran the fMRI data from the dream recordings through the model. They wanted to see if the computer model could figure out which objects had shown up in which dreams.
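The logic of that final step resembles training any pattern classifier. Here is a toy Python version with invented data: it teaches a classifier to tell “key,” “chair” and “bed” apart from simulated awake-viewing scans, then asks it to label a simulated pre-waking scan. The real study used many more images and more elaborate methods.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
objects = ["key", "chair", "bed"]
n_per_object, n_voxels = 60, 500

# Awake-viewing scans: each object evokes its own typical activity pattern.
centers = rng.normal(size=(len(objects), n_voxels))
X = np.vstack([centers[i] + rng.normal(scale=1.0, size=(n_per_object, n_voxels))
               for i in range(len(objects))])
y = np.repeat(objects, n_per_object)

clf = LogisticRegression(max_iter=1000).fit(X, y)

# A "dream" scan: a noisier pattern resembling the one evoked by a key.
dream_scan = centers[0] + rng.normal(scale=1.2, size=n_voxels)
print(clf.predict(dream_scan.reshape(1, -1)))  # hopefully ['key']
```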

The model correctly identified objects in the dreams six out of every 10 tries. This indicates that neurons fire in very similar patterns whether someone sees a key while awake or just dreams of a key. So data collected as people’s brains process real images can help scientists begin to decode dreams.

Eventually, people may be able to record what runs through their imagination during sleep, Gallant says, then rerun movies of those dreams once they wake up.

Mind games

Dream movies are still science fiction. So is a device that can read all of a person’s inner thoughts and feelings. Do people really want computers digging around inside their minds? The answer will likely depend on how the technology ends up being used.

For Jan Scheuermann, it was a life-changer. “Most of the time other people have abilities I don’t have. I’m the one who can’t do anything,” she says. But she believes that one day, paralyzed people or those missing limbs will use mind-controlled robotic devices at home or on the go for daily tasks. “The work we did might not benefit people in my lifetime,” she says. “But down the road, it will.”

Kathryn Hulick is a freelance science writer and the author of Strange But True: 10 of the World's Greatest Mysteries Explained, a book about the science of ghosts, aliens and more. She loves hiking, gardening and robots.
