With 32 wires sprouting from a cap on his head, University of Washington research assistant C.J. Bell stared at a computer screen and thought: “Red.”
Across the room, a 2-foot-tall robot called Morpheus shuffled up to a table holding a green block and a red block. Tilting his head, the machine scanned the choices with camera “eyes.”
Morpheus paused, then picked up the red block.
“He got it right,” Bell said proudly.
In fact, Morpheus has a 94 percent success rate at reading simple mental commands.
But he’s only a first step toward developing a practical household robot controlled solely by brain waves, said Rajesh Rao, leader of the UW robot team and associate professor of computer science and engineering.
“It’s a proof-of-concept demonstration,” Rao said. “The primary goal is a robot that can help disabled or paralyzed people.”
Other researchers have wired humans to machines that allow them to move a cursor on a computer screen or operate a robotic arm with their thoughts. But those connections require electrodes inside the person’s skull. With the system Rao and his colleagues have developed, the operator only suffers a bad hair day.
“The novelty here is that we’re connecting non-invasively to a humanoid robot,” Rao explained.
To prepare for the demonstration, Bell pulled on the tight-fitting cap while fellow graduate student Pradeep Shenoy filled a 4-inch syringe with conductive gel. Shenoy injected the gel into the openings in the cap, and fitted an electrode to each.
“The electrodes don’t actually touch the skull,” he explained. “They float in the goo, and the goo touches the skull.” It also gets in the hair, requiring a shampoo after every test.
The technique is called electroencephalography, or EEG. It’s the same set-up used in hospitals to monitor epileptic seizures and diagnose sleep disorders.
Nerve impulses in the brain create electrical signals that can be detected on the scalp.
But it’s kind of like viewing the world through fogged-up glasses: The nuances of brain activity don’t come through.
“We can’t really read your thoughts,” Rao said.
What the system can detect is the strong “ah ha” response generated by the brain when it experiences a flash of recognition.
“It’s like when you’re driving around in your car, looking for your favorite coffee place,” Rao explained. When you spot it, your brain registers a jolt.
On the computer screen, Bell saw a robot’s-eye view of the two blocks. He chose the red block and focused his attention on it. Both images began flashing randomly. Every time Bell saw the red block light up, his brain went: “Ah ha.” After several flashes, the computer recognized his choice and sent a signal to Morpheus via wires and a second computer.
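The selection scheme described above works by accumulating the brain's recognition responses across many random flashes until one choice stands out. A minimal sketch of that evidence-accumulation idea, in Python, might look like the following. (Everything here is hypothetical illustration: the function names `select_target` and `toy_responder` are invented, and `toy_responder` stands in for the actual EEG classifier, which the article does not detail.)

```python
import random

def select_target(stimuli, responder, flashes_per_item=10, threshold=5.0):
    """Accumulate a recognition score for each stimulus across random flashes.

    `responder(stimulus)` stands in for the EEG classifier: it returns a
    score that is, on average, higher when the flashed stimulus is the one
    the operator is attending to.
    """
    scores = {s: 0.0 for s in stimuli}
    flash_order = stimuli * flashes_per_item
    random.shuffle(flash_order)              # stimuli flash in random order
    for stimulus in flash_order:
        scores[stimulus] += responder(stimulus)
    # Pick the stimulus whose accumulated response is strongest,
    # but only commit if the evidence clears a threshold.
    best = max(scores, key=scores.get)
    return best if scores[best] >= threshold else None

# Toy stand-in for the brain's response: the operator is attending to
# "red", so flashes of "red" evoke a stronger (noisy) signal than "green".
def toy_responder(stimulus, target="red"):
    base = 1.0 if stimulus == target else 0.2
    return base + random.gauss(0, 0.3)

random.seed(0)
print(select_target(["red", "green"], toy_responder))  # the attended block wins
```

Averaging over several flashes is what makes the approach workable despite the "fogged-up glasses" quality of scalp EEG: any single flash is too noisy to trust, but the attended stimulus pulls ahead over repeated trials.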
The non-invasive approach limits the complexity of instructions that can be conveyed to the robot, Rao said. But it should allow an operator to choose from a list of pre-programmed orders, like: bring the medicine; pour a glass of water; or roll the wheelchair to the bed.
The system also can operate wirelessly, holding open the possibility of a robot lackey that can be dispatched into the world to do its owner’s mental bidding.
“That’s the science-fiction-type stuff,” Rao said.
Right now, Morpheus more resembles an Energizer bunny running low on juice than Rosie, the Jetsons’ domestic robot — or the menacing automatons of film and fiction. His gait is tortoise slow. Picking up a block requires five jerky motions of his segmented arms.
Though highly customized, he’s at heart an off-the-shelf research robot purchased for $65,000 from Fujitsu Automation in Japan. Rao and his team protect their investment by locking the robot in a gun safe each evening.
The recent bad weather delayed until Monday the robot’s return from Japan, where he had been sent for repairs.
“He just got back and he’s a little cranky,” Shenoy said.
The Japanese technicians tightened belts in the robot’s appendages, which threw off the elaborate computer instructions that allow him to navigate the world. Rawichote Chalodhorn, a visiting scientist from the University of Osaka, spent all night Monday reprogramming the robot.
Robotics is already an $11 billion-a-year industry, but applications are largely limited to industries such as automobile manufacturing. In an article in January’s Scientific American magazine, Bill Gates likens it to the computer business in 1970, when he and Paul Allen founded Microsoft.
“When I look at trends that are now starting to converge, I can envision a future in which robotic devices will become a nearly ubiquitous part of our day-to-day lives,” Gates wrote.
But Morpheus and his mind-control system have a long way to go.
After picking up the red block, the robot stood by for a second mental command telling him which of two tables to set it down on. But the brain-computer link failed, and Bell had to key in the instruction.
Morpheus dutifully turned left and headed for the selected table. He stopped, extended his arms and let go. The block hit the table’s edge and tumbled to the floor.
Sandi Doughton: 206-464-2491 or firstname.lastname@example.org