Jan Scheuermann reclines in a wheelchair. Her hands lie motionless, crossed in her lap. She can speak and move her head. But because she’s paralyzed from the neck down, she normally cannot control the rest of her body. Yet one winter day in 2012, she lifted a chocolate bar to her mouth and took a bite. How? She used her mind to move a robotic arm! And, she now recalls, “It was the best chocolate ever.”
Every brain sparks with electrical activity. Tiny nerve cells inside it, called neurons, fire off little zaps of current. It’s how they send messages to one another.
Groups of neurons firing in specific patterns control the body. They also form thoughts and feelings. Researchers are working to capture and translate the messages encoded in these patterns of electrical energy. But the brain doesn’t like to give up its secrets. A hard skull covered with skin and hair protects the soft brain. It also muffles those electrical signals.
To get the clearest signals from neurons, doctors have to open up someone’s skull. Then, they can place sensors directly on top of the brain, or even inside it.
Once they get onto or into the brain, those sensors let scientists and engineers begin reading people’s minds.
The team that worked with Jan matched her brain signals to specific arm and hand movements. Other research teams are watching brain signals while test subjects listen to particular words, gaze at pictures or watch short movies. Later, these teams try to reverse the process. They scan recordings of brainwaves to figure out what someone had been hearing or seeing.
But what about words or images that exist only in the imagination? Can scientists learn to read our inner thoughts? Almost certainly they can, says Jack Gallant. He’s a neuroscientist at the University of California, Berkeley, who studies how the brain processes images.
“Someday we will be able to decode your dreams,” he predicts. “And someday,” he adds, “we will be able to decode your silent, internal speech.” The biggest obstacle, he says, is measuring brain activity precisely.
Exploring with Lewis, Clark — and Hector
A little less than a year before feeding herself that chocolate bar, Jan woke up from surgery. Doctors had embedded two tiny electrical sensors deep inside her brain. Each flat, metal rectangle was only about the size of a grain of rice. One hundred spikes jutted like a bed of nails from the surface of each. These spikes listened to the chatter of between 30 and 160 neurons.
“I named the implants Lewis and Clark,” Jan says, after the famous American explorers of the early 19th century. Hers, she recalls, “were going to lead an expedition into the brain and chart new territory.”
Thin wires connected the sensors to metal caps that protruded from the top of Jan’s head. Researchers could connect the metal caps to a computer system. That let the computer read signals from her brain. The computer would then use this information to control a robotic arm.
Jan had volunteered to take part in a research project at the University of Pittsburgh in Pennsylvania. Before the surgery, she hadn’t moved anything below her neck for 12 years — not even a finger or a toe. The neurons in her brain could still produce electrical signals for movement. But due to a rare disease, the signals no longer reached her arms or legs. “Moving that robotic arm became my singular goal,” Jan remembers.
She nicknamed that arm Hector. With practice, Jan and Hector learned to pick up cubes, cones and other shapes. It was like learning to ride a bike, she says. Once Jan figured out how to control Hector, the process became automatic. She’d do what anyone does when they want to pick something up — reach for it. As she reached, the implants read her brain activity, and the computer program sent a message to Hector with instructions on which way to move.
The math of movement
A team of neuroscientists had spent decades perfecting that mind-reading computer system. One was Andrew Schwartz at the University of Pittsburgh. He studies how the brain produces the signals that direct muscle movements. Before the experiment with Jan, he worked mostly with monkeys that had sensors implanted in their brains.
Schwartz would record brain activity as a monkey performed the same movement over and over. It might be something as simple as pressing a lever down to get a treat. Then, he would look for a pattern of electrical activity in the recordings. That pattern would reveal which neurons had fired when the monkey’s hand pressed down.
He repeated this type of research for many different types of motions. He also built a computer program to track these patterns. This computer model learned to match a pattern of firing neurons to a particular movement. The model now can interpret brain signals that control movement in both monkeys and people.
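The general idea behind such a decoder can be sketched in a few lines. The toy model below is illustrative only, not Schwartz's actual system: it assumes each simulated neuron fires most strongly for one "preferred" movement direction, then recovers the intended direction by summing those preferences weighted by firing rate (a population-vector-style readout).

```python
import numpy as np

# Toy sketch of a population-vector-style decoder (illustrative only,
# not the Pittsburgh team's actual model).
rng = np.random.default_rng(0)

n_neurons = 100
# Each simulated neuron "prefers" one movement direction (a 2-D unit vector).
angles = rng.uniform(0, 2 * np.pi, n_neurons)
preferred = np.stack([np.cos(angles), np.sin(angles)], axis=1)

def firing_rates(true_direction):
    """Cosine tuning: a neuron fires most when movement matches its preference."""
    return np.clip(preferred @ true_direction, 0, None)

def decode(rates):
    """Sum each neuron's preferred direction, weighted by how hard it fires."""
    vec = (rates[:, None] * preferred).sum(axis=0)
    return vec / np.linalg.norm(vec)

true = np.array([1.0, 0.0])            # intended movement: straight right
estimate = decode(firing_rates(true))  # should point roughly along [1, 0]
print(np.round(estimate, 2))
```

With enough neurons, the individual quirks of each cell average out and the weighted sum points close to the intended direction, which is why recording from many neurons at once matters.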
However, the model still needs work, says Schwartz. It doesn’t always correctly predict the muscle movements from the brain signals it reads. But Jan’s brain was able to compensate for the mismatch. Her brain actually changed its firing patterns to fit the model better, he notes.
The research didn’t stop with Jan’s putting a piece of chocolate in her mouth. Two years after her brain surgery, Jan took control of two robotic prostheses at once — a right arm and a left arm. Then, she tried flying an airplane.
Well, it was really just a flight simulator, a computer program that mimicked the experience of flying. To control this virtual plane, Jan learned to imagine that she was working with Hector. The same brain activity patterns that had controlled her arm now directly controlled the simulated airplane. Again, the process quickly became automatic. She felt like she was really flying. “I was up there. I was out of my chair and out of my broken body. I still get chills when I think about it,” Jan says.
The woman loved working with Schwartz and his team. But the human body doesn’t appreciate unwelcome intruders, such as Lewis and Clark. Over time, scar tissue may wall off the sensors. This can make it harder for them to detect brain activity. Jan’s sensors were still picking up signals after two years, but the skin was pulling away around one of the caps. Microbes could have easily gotten in and caused a dangerous infection. So in 2014, doctors removed the sensors.
To really help paralyzed people, any brain implants must be able to stay in place for at least 10 years, notes Nicho Hatsopoulos. He is a neuroscientist at the University of Chicago in Illinois. He did not work with Jan or Schwartz, but he has worked on brain-computer systems with both monkeys and people. His team is now working on adding a sense of touch to a mind-controlled robotic arm.
People using an arm such as Hector can see what they’re doing, he says. “But ultimately we want them to feel it.” This means that signals would travel both ways. Signals would move from the brain to the robotic arm, then the arm would report textures and pressures back to the brain. And that’s coming. Other researchers have already developed electronic skin that can detect heat, texture and sound.
The work of Schwartz, Hatsopoulos and others might one day help restore movement and touch to paralyzed people. Mind-reading devices might also analyze patterns in brain activity to detect someone’s thoughts. Such a device could give voices to those who have lost the ability to talk. Or, it could allow people to communicate — brain to brain — in complete silence and secrecy.
First, neuroscientists must understand how the brain interprets words and sentences. Brian Pasley of the University of California, Berkeley is working on this problem.
When someone talks to you, he explains, the sound of their voice enters your ear. It then travels along nerves as electrical signals. When it reaches a part of the brain known as the auditory cortex, “the neurons there fire in very specific patterns,” he says. Those patterns depend on the sound of the voice and the words spoken. Other areas of the brain join in to retrieve and put meaning to the words.
To build a thought-reading device, neuroscientists would have to match those neuron-firing patterns to words. And such studies cannot rely on lab animals. Only humans speak and understand words. But few people would volunteer for brain surgery just to take part in such tests.
Luckily, some people with rare brain diseases already have a grid of electrical sensors implanted below their skulls, on the surfaces of their brains. Those sensors are already reading brain activity through a technology known as electrocorticography (Ee-LEK-tro-kor-tih-KOG-rah-fee).
Seven people with the implanted sensor worked with Pasley’s team in a 2014 study. Each volunteer read aloud a passage of text. As they read, the sensor arrays recorded brain activity. The researchers used the brain patterns picked up from the out-loud reading session to build a computer model. It matched patterns of brain activity to the sounds of words.
Later, each person read the same text silently. The team applied the same computer model to the brain activity recorded during these silent reading sessions. The computer translated that activity into sounds, making spoken “words.” They came out garbled and did not sound exactly like the original text. Clearly, however, the model was on the right track. It pointed to the possibility of one day using brain activity to learn unspoken thoughts.
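The train-on-overt, test-on-covert logic can be shown with a tiny template-matching toy. Everything here is hypothetical (the words, the 8 "sensor channels," the matching rule); it only illustrates the pipeline, not the team's real model: store the activity pattern each word evokes during out-loud reading, then compare new silent-reading activity against those stored patterns.

```python
import numpy as np

# Toy sketch of the train-aloud / test-silently idea (illustrative only).
rng = np.random.default_rng(1)

words = ["north", "river", "mountain"]  # hypothetical words, for illustration
dim = 8                                 # pretend we record 8 sensor channels

def unit(v):
    return v / np.linalg.norm(v)

# "Training": one characteristic activity pattern per word, read aloud.
templates = {w: unit(rng.normal(size=dim)) for w in words}

def decode(activity):
    """Return the word whose stored template best matches the new activity."""
    return max(words, key=lambda w: float(unit(activity) @ templates[w]))

# "Testing": silent reading evokes a weaker, noisier copy of the pattern.
silent = 0.5 * templates["river"] + 0.1 * rng.normal(size=dim)
print(decode(silent))
```

Because the silent-reading pattern is a degraded copy of the spoken-reading pattern, the match still works often enough to be useful, which mirrors why the real model's output came out garbled but recognizable.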
If brain activity can reveal thoughts, what about imagined pictures?
Jack Gallant has taken important steps toward understanding how the mind represents images and movies. His team doesn’t use electrical sensors. Instead, they study brain activity with a scanner. The technology they use is called fMRI, which stands for functional magnetic resonance imaging. It does away with the need for brain surgery. However, the scanner doesn’t actually detect waves of electrical signals coursing through the brain. Instead, magnets measure the blood flow that brings chemical food to neurons. Firing neurons suck up the most food. So mapping where blood flow is highest points to where the brain is most active.
Five years ago, Gallant and his colleagues took turns watching short movies for several hours as an fMRI machine scanned their brains. This provided lots of data linking patterns of brain activity to the types of action and imagery being viewed.
The researchers then used these data to build a computer model. Later, they got in the scanner again and watched a totally different set of movies. This time, the computer had to guess what had been happening in those movies based only on the brain activity of the viewers.
The resulting movies that the computer generated came out blurry. They also lacked sound. But astonishingly, they did resemble the original films. It was the first tentative success for mind-reading of moving images.
Other neuroscientists have attempted to decode dreams. Yukiyasu Kamitani is a neuroscientist at Kyoto University in Japan. Three years ago, his team asked three volunteers to fall asleep inside an fMRI scanner. As soon as a volunteer was asleep, the researchers would wake him or her up. Then, they’d quickly ask what the sleeper had been dreaming. One person said, “I hid a key in a place between a chair and a bed and someone took it.”
The researchers collected 200 of these dream descriptions from each person. Next, they found pictures of items from the dream descriptions, such as a key, a chair and a bed. The same three volunteers then got into the fMRI scanner again and looked at these pictures while they were awake.
Now the researchers had data to match each image to specific brain patterns. Using these data, they built a computer model to recognize these patterns. Then, they ran the fMRI data from the dream recordings through the model. They wanted to see if the computer model could figure out which objects had shown up in which dreams.
The model correctly identified objects in the dreams six out of every 10 tries. This indicates that neurons fire in very similar patterns whether someone sees a key while awake or just dreams of a key. So data collected as people’s brains process real images can help scientists begin to decode dreams.
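A quick back-of-the-envelope calculation shows why six out of 10 is meaningful. The numbers below are hypothetical (they assume the model picked between two candidate objects per dream, so blind guessing would score 50%, and they borrow the 200-reports figure as a trial count): even a modest edge over chance becomes very unlikely to be luck across that many tries.

```python
from math import comb

# Back-of-the-envelope check with hypothetical numbers: if the model
# merely guessed between two candidate objects per dream (50% chance),
# how likely is it to score 60% or better on 200 dream reports by luck?
n = 200            # dream reports
k = 120            # 60% of them identified correctly
p_value = sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n
print(f"probability of doing this well by luck: {p_value:.4f}")
```

The probability comes out well under one percent, which is why a 60% hit rate over many trials counts as real decoding rather than coin-flipping.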
Eventually, people may be able to record what runs through their imagination during sleep, Gallant says, then rerun movies of those dreams once they wake up.
Dream movies are still science fiction. So is a device that can read all of a person’s inner thoughts and feelings. Do people really want computers digging around inside their minds? The answer will likely depend on how the technology ends up being used.
For Jan Scheuermann, it was a life-changer. “Most of the time other people have abilities I don’t have. I’m the one who can’t do anything,” she says. But she believes that one day, paralyzed people or those missing limbs will use mind-controlled robotic devices at home or on the go for daily tasks. “The work we did might not benefit people in my lifetime,” she says. “But down the road, it will.”
Power Words
auditory cortex A part of the brain in humans and other animals that processes sound.
computer model A program that runs on a computer that creates a model, or simulation, of a real-world feature, phenomenon or event.
electrocorticography (or ECoG) A technique to monitor brain health that may be used in patients with brain tumors or epilepsy. Surgeons implant a grid of electrodes directly onto the surface of the brain to record its electrical activity.
electroencephalogram or electroencephalography (abbr. EEG) A technique to detect electrical activity in the brain using electrodes that press against the outside of the head. This technique charts a series of brainwaves.
fMRI (short for functional magnetic resonance imaging) A special type of machine used to study brain activity. It uses a strong magnetic field to monitor blood flow in the brain. Tracking the movement of blood can tell researchers which brain regions are active.
Lewis and Clark Two explorers — Meriwether Lewis and William Clark — who set off in 1803, at the behest of U.S. President Thomas Jefferson, to search for a route from America’s central interior (St. Louis) to the West Coast. Made possible with the help of local Indians, the trip would take the explorers and their party over the Rocky Mountains and last some 28 months.
implant A device manufactured to replace a missing biological structure, to support a damaged biological structure, or to enhance an existing biological structure. Examples include artificial hips and knees, pacemakers, and the insulin pumps used to treat diabetes.
neuron (or nerve cell) Any of the impulse-conducting cells that make up the brain, spinal column and nervous system. These specialized cells transmit information to other neurons in the form of electrical signals.
neuroscience The science that deals with the structure or function of the brain and other parts of the nervous system. Researchers in this field are known as neuroscientists.
paralysis An inability to willfully move muscles in one or more parts of the body. In some cases, nerves that carry the signal to move may have been severed or damaged. In other cases, the brain may be the source of the problem: It may fail to understand or act on a nerve’s signal to move.
prosthesis (pl. prostheses) An artificial device that replaces a missing body part. Such a prosthetic limb, for example, would replace parts of an arm or leg. These replacement parts usually substitute for tissues missing due to injury, disease or birth defects.
sensor A device that picks up information on physical or chemical conditions — such as temperature, barometric pressure, salinity, humidity, pH, light intensity or radiation — and stores or broadcasts that information. Scientists and engineers often rely on sensors to inform them of conditions that may change over time or that exist far from where a researcher can measure them directly.
Book: M. Gay. The Brain Electric: The Dramatic High-Tech Race to Merge Minds and Machines. New York: Farrar, Straus and Giroux, 2015.
Journal: S. Martin et al. “Decoding spectrotemporal features of overt and covert speech from the human cortex.” Frontiers in Neuroengineering. Volume 7, May 27, 2014. doi: 10.3389/fneng.2014.00014.
Journal: T. Horikawa et al. “Neural decoding of visual imagery during sleep.” Science. Volume 340, May 3, 2013, p. 639. doi: 10.1126/science.1234330.
Journal: J. L. Collinger et al. “High-performance neuroprosthetic control by an individual with tetraplegia.” The Lancet. Volume 381, February 16, 2013, p. 557. doi: 10.1016/S0140-6736(12)61816-9.
Journal: K. Gururangan et al. “Interview with Robert Knight and Brian Pasley: Brain-Machine Interfaces: Neural Prosthetics and Patient Care.” Berkeley Scientific Journal. Volume 16, Issue 2, 2012, p. 2.
Journal: S. Nishimoto et al. “Reconstructing visual experiences from brain activity evoked by natural movies.” Current Biology. Volume 21, October 11, 2011, p. 1641. doi: 10.1016/j.cub.2011.08.031.
S. Ornes. “Brain to brain.” Science News for Students. March 15, 2013.
S. Ornes. “Scientists help amputees by getting on their nerves.” Science News for Students. February 17, 2011.
S. Gaidos. “Contemplating thought.” Science News for Students. February 20, 2009.