This is your brain on music: Neurologists, composers and tech-geeks at the University of Washington’s DXARTS program study music and the mind — including the encephalophone, a new instrument you can play without moving a muscle.
In April of 2016, Seattle choir director and fifth-grade teacher Margaret Haney checked into the emergency room with an unusual problem — suddenly, she couldn’t sing.
Haney had been in the classroom, trying to lead her students through George Gershwin’s “Summertime” when, as she put it, “I failed miserably, like I never have … my students were giving me some funny looks.” She skipped on to “Oh, What a Beautiful Mornin’,” but said she “couldn’t find the notes to save my life — and it’s a song I’ve been singing since I was 4 years old.”
Haney thought she’d had a stroke, even though she showed no other symptoms: no physical weakness, no sagging in her face, no slurring. The physicians ordered some brain scans and discovered she was suffering from “amusia” — the inability to make music — due to a viral encephalitis infection in one section of her brain.
After the tests, she was referred to Dr. Thomas Deuel, a neurologist at Seattle’s Swedish Medical Center who plays trumpet and guitar, studied musical composition and molecular biology at Princeton University, and then jazz at the New England Conservatory in Boston.
One of Deuel’s fellow physicians knew her music-minded colleague would want to see Haney — and might be able to help her with an unusual invention.
Deuel had been working with DXARTS, a University of Washington program that incubates collaborations between scientists and artists. DXARTS was launched in 2001, with an emphasis on projects that boldly crisscross borders: video, performance, music, virtual reality, robotics and all-around tech-art hacking.
Lately, Deuel had advised DXARTS on building a lab, with state-of-the-art technology to study the relationship between neurology and art (particularly music), and explore deep connections between the body and the brain.
Deuel had also teamed up with UW-based physicist Felix Darvas on a neuro-musical invention: the encephalophone (pronounced “en-sef-ah-lo-fone”), an instrument you can play simply by thinking.
Haney, Deuel said, “was a one-in-a-million case” who had the precise kind of brain lesion that might be ameliorated — though his hopes for a total cure were slim — with the encephalophone.
Deuel had helped invent the encephalophone with a double-edged purpose: to explore new frontiers in music technology and as a possible therapeutic tool for people who’d suffered from strokes or neurological problems like ALS (also known as Lou Gehrig’s disease). If people couldn’t use their limbs to make music anymore, maybe learning to make music in a new way, with different parts of their brains, could serve as a method of neurological rehabilitation. One advantage, Deuel said, is that “it’s totally noninvasive: no surgery. And it’s portable.”
To play the encephalophone, a musician wears an electroencephalogram (EEG) cap fitted with electrodes that read brain waves and transmit them to a synthesizer. The EEG cap looks like a beanie without the propeller, but sprouts a cluster of wires hooked up to amplifiers and computers. The instrument is a kind of “brain-computer interface,” and sounds like an electric piano, electric strings, or whatever other kind of music the connected synthesizer can produce.
Scientists have been studying brain-computer interfaces since the ’70s, developing technologies that communicate between wired brains and external devices that, for example, allow people — or, in some studies, monkeys — to move screen cursors or robot arms with brain signals alone.
Other musicians and scientists have used EEG technology to make sound before, Deuel said, but only “passively generated sound” based on brain activity. The encephalophone, he explained, is “an EEG-controlled musical instrument.”
The instrument is still in its early experimentation phases. Deuel and his collaborators — including DXARTS co-founders Juan Pampin and Richard Karpen — are waiting for approval from the UW to test the encephalophone in clinical trials, both to see whether making music improves patients’ quality of life and whether it might even help them improve their motor skills.
When Deuel met Haney, he hoped the neuro-instrument could help her relearn how to sing.
Inside musicians’ heads
The encephalophone is just one part of the brain-music research underway at DXARTS, where Deuel is also an affiliate professor.
Before Deuel began working with Haney, Pampin (a composer from Argentina) had already been overseeing what he calls a “brain-art initiative,” which has cobbled together funding from the Mellon Foundation, the National Endowment for the Arts and the UW, including the Bergstrom Award — a grant from UW chemistry alumnus Donald Bergstrom to build bridges between art and science.
The DXARTS lab, Pampin said, has built up a small arsenal of high-precision machines to map a range of bio-neural activity: brain waves; the arm muscles of musicians while they play; how eyes move in response to stimuli; the bioelectrical activity on a person’s skin, in a technology also used for polygraph tests.
Members of the renowned, New York City-based JACK Quartet, for example, have been coming to DXARTS to wear EEG caps while playing together — or listening while colleagues play, or just thinking through a piece of music they know intimately.
“Great musicians,” said Karpen (who is also director of the UW School of Music), “can do things with their minds and bodies with a deep intelligence — that doesn’t mean they’re better people or anything like that, of course. But think of it this way: Most of us can throw a football, but most of us can’t throw a football for 80 yards to someone catching it on a dive. That’s a deep mind-body connection.”
Deft players, like the JACK Quartet, are trying to help DXARTS discover new frontiers in the relationship between the body and the brain.
“Little by little,” Pampin said, “we’ll start using those experiments to compose music.” Collecting data about the musicians’ brain activity is an attempt “to find patterns, correlations, things that happen when musicians play.”
In one previous experiment, a violinist wore an EEG cap while listening to a viola player. When the violist began to play faster, the violinist’s brain lit up with excitement. “If we tell the musicians to play certain notes together,” Pampin said, “all of them get a high level of alpha waves that are not correlated exactly, but they’re similar.” (Alpha brain waves, according to articles in publications like “Scientific American,” have been affiliated with easing symptoms of depression and epilepsy.)
While the musicians play, Pampin explained, they hear biofeedback tones: a synthetic sound, based on their own brain waves, that gets calmer and slower as they become more relaxed. And as they respond to the sound of themselves becoming more relaxed they, in turn, become even more relaxed.
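The feedback loop Pampin describes can be sketched in a few lines of code. In this illustrative Python sketch, a player’s relaxation is proxied by relative alpha-wave power on a 0-to-1 scale, and the feedback tone slows and softens as that value rises; the specific ranges and the linear mapping are assumptions for illustration, not details of DXARTS’s actual system.

```python
# Hypothetical sketch of a relaxation biofeedback tone, where higher
# alpha-wave power (a stand-in for relaxation) yields a slower, quieter
# tone. Ranges and the linear mapping are illustrative assumptions.

def feedback_tone(alpha_power: float) -> dict:
    """Return tone parameters that calm down as alpha power increases."""
    a = min(max(alpha_power, 0.0), 1.0)   # clamp noisy readings to [0, 1]
    return {
        "tempo_bpm": 120 - 60 * a,        # 120 bpm (tense) down to 60 bpm (calm)
        "volume": 0.9 - 0.5 * a,          # quieter as the player relaxes
    }

print(feedback_tone(0.0))   # a tense player hears a fast, loud tone
print(feedback_tone(1.0))   # a relaxed player hears a slow, soft tone
```

Because the player hears the tone of their own state, the loop is self-reinforcing: a calmer tone encourages further relaxation, which calms the tone further.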
“It’s a little bit like chanting ‘om,’” he said. “Can you play ‘om’ with an instrument and get into that mood of deep listening?”
‘It takes training’
DXARTS’ split focus between music and science creates “a healthy tension” between the collaborators, Deuel said.
The encephalophone’s EEG cap charts electrical activity in the brain’s motor cortex. “When you move your right arm, the motor cortex is engaged and I can see that with the EEG,” Deuel explained. “But when you just think about moving your arm, the same signal happens — even if you’ve had a stroke or Lou Gehrig’s disease and think about moving your arm, I can see it.”
With a little practice, a musician thinking about moving an arm or hand in one way — say, extending or contracting fingers, or thinking about moving an arm up or down — can move notes up or down an eight-tone scale.
Electrical signals caused by those thoughts (or by those actions) move from the brain through the EEG cap. Those signals are then transferred to one computer (which processes the signals), then to another computer, which turns the signals into sound for a synthesizer.
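The last step in that chain — turning a processed brain signal into a playable note — can be sketched as a simple quantizer. In this hypothetical Python sketch, a normalized motor-imagery level between 0 and 1 is mapped onto the eight-tone scale Deuel describes; the C-major scale, the signal values and the mapping itself are illustrative assumptions, not details of the encephalophone’s actual implementation.

```python
# Hypothetical sketch of the encephalophone's final mapping stage:
# a normalized motor-imagery level (0.0-1.0) is quantized to one of
# eight scale degrees, which a synthesizer would render as pitch.
# The C-major scale and the mapping are illustrative assumptions.

C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]  # C4 up to C5: eight tones

def signal_to_note(level: float) -> int:
    """Map a motor-cortex activity level in [0, 1] to a MIDI note number."""
    level = min(max(level, 0.0), 1.0)    # clamp out-of-range readings
    degree = min(int(level * 8), 7)      # quantize to one of 8 scale degrees
    return C_MAJOR_MIDI[degree]

# A steadily rising "imagined movement" signal walks up the scale.
print([signal_to_note(x / 10) for x in (0, 3, 6, 9)])  # → [60, 64, 67, 72]
```

Thinking harder about a movement pushes the signal, and therefore the pitch, higher; relaxing lets it fall — which is why Deuel stresses that playing it accurately takes training.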
“It takes training,” Deuel said. “We’re not at the place where we can show you a score by Beethoven and get it right all the time. But you have real control. It’s not random.”
With more research and fine-tuning, patients like Haney, who’ve lost the ability to make music with one part of the brain, might be able to use the encephalophone to train themselves to make music using other parts and processes of their brains.
“You have to learn to use the instrument,” Pampin emphasized, like training your brain to play a trumpet or guitar.
In some ways, Deuel explained, EEG technology is crude: “If you’re looking through a greased glass pane and someone’s walking around, you can see that someone is moving or not moving, but you probably can’t see their facial features and tell who they are. That’s where EEG technology is now.” But, unlike magnetic resonance imaging, which takes high-resolution snapshots of the brain at one moment in time, EEG technology can reveal what the brain is doing while it’s doing it. “The EEG,” Pampin said, “is a little bit of a time machine.”
A related DXARTS project, spearheaded by associate professor James Coupe — who makes artworks about surveillance — is experimenting with drones that fly and shoot video directed by the brain waves of people while they’re asleep.
The drone, Pampin hastened to add, is “very sophisticated, and has its own geographical boundary settings — it won’t go into a building or anything.”
Does DXARTS worry about the more sinister implications of honing intimate, brain-wave surveillance technology? Flying drones based on brain waves to shoot video is one thing — people flying drones with their minds for more violent or exploitative purposes is something else.
After all, some of the first research into brain-computer interfaces in the 1970s, at UCLA, was partially funded by the U.S. Department of Defense. “As you’re making music with your head, is your iPhone recording your patterns and sending them off to Apple, or Google, or whatever,” science-fiction author Greg Bear said in a GeekWire article, “so they have mental patterns as to what you’re doing when you’re doing something and thinking about buying something or going somewhere?”
Deuel said he isn’t too concerned about that possibility, mostly because EEG technology isn’t developed enough — yet. “There are very few ‘thoughts’ that we can measure at this time with EEG,” he said, “and the signal is not streaming to any form of communication by iPhone, et cetera. That’s not to say this couldn’t theoretically happen in the future.”
Deuel also has a pending patent for the BCI feedback technology that drives the encephalophone, which he said gives him some control over how it’s used.
‘Music inside us’
Pampin, Deuel and Karpen all stress that art has always been at the forefront of technological development: Pythagoras using math to define harmonics and scales, the inventions of Leonardo da Vinci, Beethoven spurring the development of the contemporary piano and its tauter strings.
The technology behind the first popular digital synthesizer (the DX7), Pampin said, was accidentally invented by composer John Chowning “while he was at Stanford University, trying to modulate sine tones to see if he could get something like a female vocal sound. He discovered that process in ’67. And every pop band in the 1980s was using it. Without cables or patching things together, all these new sounds were possible, which people liked because it created sonic novelty.”
Whenever artists start experimenting with new computer technologies, Pampin said, there’s always some anxiety that the machines will replace the humans: “But this is human-centered research. The technology helps us see what humans can do — it’s always driven by the human-centered desire to know more about us.
“ … The mystery continues to be the human mystery — what the brain can do to make art.”
Deuel has performed on the encephalophone with a jazz ensemble, including one concert during last year’s 9e2, a series of nine evenings at King Street Station with performances and talks about the intersection of technology and culture.
“We all have music inside us,” Deuel said. “What really drives this project is the ability to marry my interest in music cognition from a neuroscience standpoint and patient care, with having that intertwined with art and performance.” Someday, he said, patients who’ve learned to play the encephalophone with the motor cortex of their brains might be able to use that training to augment physical therapy and improve their ability to move.
Pampin hopes the encephalophone will be developed enough to host a public concert of “brain performers” by late 2018. “But how far can we go in terms of the sophistication of those pieces?” he asked. “It remains to be seen.”
And Margaret Haney? Doctors at Swedish, Deuel said, treated her with antiviral medication to halt the spread of the infection — and the instrument helped relieve her amusia.
Learning to play the encephalophone “helped her make pitch. We weren’t able to completely cure her, but she was able to get back to singing again. We can’t prove that we’ve done a lot with just one patient, but it was a promising start.”