Ramses Alcaide has spent over a decade thinking about thinking.
As a Ph.D. student at the University of Michigan in 2015, he developed a brain-computer interface that would allow people to control software and physical objects with their thoughts. Today, that interface is behind plans by a Boston-based startup, Neurable, to begin shipping a set of brain-sensing headphones to let you know when you’re poised for peak productivity.
Using your thoughts to make things happen in the real world was once the stuff of science fiction. Now it’s moving into reality, and Neurable’s interface is just one of the products companies are developing in hopes of ushering in a consumer electronics revolution.
Already, brain tech allows players to manipulate avatars in video games by concentrating on parts of the screen. And Facebook last month revealed plans to interpret your intent to move a finger to trigger digital commands.
Researchers think these advances might lead to the next big tech revolution — giving human beings essentially a sixth sense: If you think it, a computer can capture it, display it and even say it aloud. Think of it as tech-boosted telekinesis, enabling you to type with your mind, share your thoughts without speaking, or navigate the internet solely by focusing on where you want to go.
The potential ramifications of that reality concern some ethicists. They note there are no laws regulating how brain tech can be applied to consumer products and that no one controls what tech companies do with the information they scoop up from your brain.
Some say that as things advance, it might even be possible for firms to alter the organ that essentially makes you you.
“The brain is what makes you human. You are your brain. If technology enters your brain, it’s entering you,” said Rafael Yuste, director of the Neurotechnology Center at Columbia University. He heads a research team that in 2019 discovered it might be possible not only to decode your thoughts but to inject memories into them as well.
For now, Neurable’s goal is to help people know when they’re the most focused and best positioned to make decisions at work, as well as when they’re least likely to be productive. Tucked inside a pair of ordinary headphones, the interface is expected to unlock new categories of metrics, such as how often you fidget, drink water and smile. It’s designed for people who want to use their time more effectively and those set on improving their mental health, the company says.
“Think of it as a Fitbit for your brain,” Alcaide said.
But that’s not where Alcaide hopes the technology will end up. He foresees the day when you can control your smartphone without touching it with your hands or commanding it with your voice. “Want to take a call or skip a track? No need to use your hands,” a fundraising page for Neurable says, promoting the company’s gesture control system.
That technology — promising users that if you think it, it will happen — would be a monumental step in consumer electronics.
Neurotech is a catchall term broadly encompassing an industry set on connecting human brains to computers. Some applications require surgery; others don’t. It’s a compelling biotech field that’s rapidly evolving, enabling machines to interpret or alter your consciousness.
It has already shown clinical value for Parkinson’s disease patients seeking to reduce balance problems, tremors and difficulty walking, according to the Food and Drug Administration. The field also shows promise for people with other neurological conditions such as ADHD, Alzheimer’s disease and epilepsy.
Scientists are using it to study depression, anxiety and fear.
But as time goes on, the research inches further away from clinical settings.
Since 2014, startups such as Muse, Dreem and BrainCo have unveiled headbands and other wearables for consumers to use at home. There’s NextMind’s mind control add-on for VR glasses and Urgotech’s headpiece meant to help your brain sleep better. Versus took a stab at brain-reading headphones with a $1,500 device meant to train you to clear your mind and reduce stress.
Neurable plans to be next. It began taking preorders Tuesday for headphones cheaper than Versus’s device that will one day let you select music tracks using facial gestures.
Even more uses are on the way in the years ahead as companies like Facebook and Neuralink invest in a world where brain pulses might replace clicks of a computer mouse.
The theory is that brain-computer interfaces open up a new area of economic enterprise, letting you control hardware and software more quickly than you can with your fingers.
Mass adoption would mean that people could use their minds to toggle responses on the internet rather than tapping a smartphone screen or typing on a keyboard. It could remove the need for a smartphone entirely, replacing it with smart glasses or contact lenses that display images based on your thoughts.
“Imagine that you can type 100 words a minute by thinking. That’s the end of the keyboards,” said Yuste, the Columbia neurotech expert.
Now, the big concern is what happens when and if that science gets commercialized and deployed by tech firms with few regulations and profit-turning intentions. After all, the United States hasn’t created meaningful privacy laws in decades.
“There are already regulations in a medical context, but for noninvasive neurotech, it’s much more blurry,” said Dario Gil, senior vice president and director of research at IBM. “We believe there needs to be a much stronger dialogue to define how that data is collected [and] how it can be used. This is too important to be left as a Wild West.”
Advances in neuroscience have shown that thoughts are the result of a vast network of neurons firing across your nervous system. They are detectable, but deciphering those intricate patterns is tricky.
Every time you think or focus, neurons in your brain generate electrical signals, which travel like waves, pulsing through your skull and beyond. Sensors placed on your head can pick up these subtle signals, while algorithms work to turn the information into something meaningful, such as your intent to press “play” on a computer screen.
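The pipeline described above — sensors pick up electrical rhythms, algorithms translate them into intent — can be sketched in miniature. Everything in this toy example is an illustrative assumption (the 10 Hz “focus” rhythm, the 8–12 Hz band, the power threshold), not any company’s actual method:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

def infer_intent(signal, fs, threshold=100.0):
    """Toy rule: strong alpha-band (8-12 Hz) activity -> press 'play'."""
    return "play" if band_power(signal, fs, 8, 12) > threshold else "idle"

# Synthetic one-second "EEG" trace: a 10 Hz rhythm buried in noise.
fs = 256
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
focused = np.sin(2 * np.pi * 10 * t) + 0.2 * rng.standard_normal(fs)
resting = 0.2 * rng.standard_normal(fs)
```

Real systems use far richer features and machine-learned classifiers, but the shape is the same: raw voltage in, a discrete command out.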
There’s only so much that can be achieved today, though, partly because decoding brain waves remains such a complex science.
“We currently do not have anywhere near to a full understanding of how the brain works,” said Polina Anikeeva, a neuroscientist at the Massachusetts Institute of Technology who studies bioelectronics. “We have good ideas about some fundamental functions, but even those are under investigation.”
Gadget makers have even less to go on, because their devices aren’t as accurate at picking up signals.
In the lab, scientists can study brain functions by tracking individual firing neurons and using functional magnetic resonance imaging (fMRI) machines to measure changes in the brain as they are happening. Electronics companies can’t because less invasive sensors aren’t as reliable.
Reading the brain is a delicate process. Tiny neuronal pulses have to go through your skull, scalp and hair before being picked up by a sensor. The signal is then amplified and interpreted. Each step increases the odds of outside interference. Muscle movements such as blinking are far louder and can throw off the data.
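Screening out those loud muscle artifacts often starts with something as blunt as an amplitude check: epochs that swing far beyond the normal microvolt range of brain rhythms get discarded. This is a crude sketch with made-up numbers, not a real product’s filter:

```python
import numpy as np

def reject_artifacts(epochs, max_amplitude=200.0):
    """Keep only epochs whose peak-to-peak swing (in microvolts) stays
    under the threshold -- a stand-in for blink/muscle rejection."""
    return [e for e in epochs if np.ptp(e) <= max_amplitude]

# Scalp EEG rhythms are tens of microvolts; a blink can swing hundreds.
rng = np.random.default_rng(1)
clean = 20 * rng.standard_normal(256)   # plausible noise-floor epoch
blink = clean.copy()
blink[100:120] += 300.0                 # simulated eyelid deflection
kept = reject_artifacts([clean, blink])
```

In practice, blink-contaminated segments are more often repaired than discarded (for example with independent component analysis), but the principle — muscles shout, neurons whisper — is the same.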
Even without all the answers, neurotech firms are hacking into parts of the organ to gain some insight into how thoughts manifest themselves, in a quest to figure out what brain-scanning services people are willing to pay for.
The most common method uses electroencephalography (EEG), which is basically electrodes placed on your head.
One way this is showing up is with NextMind’s brain-sensing wearable for developers, which can be retrofitted onto the back of caps, close to where your image-processing visual cortex is located. The machine is designed to learn what you’re focusing on and translate that into “direct brain commands” such as moving objects in video games.
“What the world wants is the easiest possible interaction with our everyday devices,” said Sid Kouider, founder and CEO of NextMind, a Paris-based company. “Having this direct connection is creating a symbiosis between you and your devices.”
But it takes time for a computer to read your thoughts and infer what you intend to do. It’s a latency issue weighing down much of the brain-sensing tech today. You can tap on a smartphone screen faster than a computer can guess that’s what you want to do and respond. You can tap a button on a gaming controller fairly quickly, too.
The lag time on NextMind’s device is under a second, which was “mind-blowing” to experience, according to Tyriel Wood, a 33-year-old YouTube vlogger who demos virtual reality games and hardware for a living. He tested the device in January and noticed a lagging response time in some cases but was impressed nonetheless.
“I could focus on an area and teleport there. But sometimes those movements may have been faster if I used a mouse or something like that,” Wood said.
Nevertheless, there’s only so much you can do with your hands. You’re limited by your finger movements, while brainpower can add another layer of internet engagement. For instance, imagine typing on a screen using your fingers while highlighting portions of the text using your mind.
Another neurotech approach uses electromyography (EMG) to record nerve signals elsewhere on the body, tracking your intention to move a muscle. BrainCo, a robotics research startup, uses this to power prosthetics for amputees. Microsoft is researching the technology’s stress-management applications. Facebook wants to use it for augmented reality glasses within the next decade.
The social networking giant published its research on the topic in March so people could express their fears and concerns surrounding the technology.
“The reality is that we can’t anticipate or solve all the ethical issues associated with this technology on our own,” noted Sean Keller, research science director for Facebook Reality Labs.
A third, less consumer-friendly proposition is to one day drill holes into your head, an idea heralded by Elon Musk. In 2016, the Tesla CEO launched Neuralink to enhance humans with artificial intelligence. He pledged $100 million toward the vision, which neuroscientists say may have value for people with illnesses and disorders affecting the brain.
Until then, the company is focused on animals. On April 8, Neuralink released a video appearing to show a monkey using the tech to play video games with its mind. The animal had a computer chip attached to its brain about six weeks before the video was shot, according to the startup. Musk says the interface will one day allow people with certain neurological conditions to control smartphones and computers with their minds, “faster than someone using thumbs.”
The company is still far from demonstrating results in humans.
While early-stage brain-sensing products from startups have been available online for years, it remains unclear whether a wearable neural interface will take the world by storm, how much it might cost and when such an offering might arrive.
Still, the technology’s applications are seemingly endless.
What if you could truly know what your pet was thinking? What if you could seamlessly dump the images in your head onto a computer screen? What if you could transcend language, and share your ideas with someone else without saying or writing a word? Or what if a computer could organize your thoughts for you, and come up with a new movie script, term paper or workplace presentation that’s more likely to have an impact?
It might be hard not to invest.
“Wouldn’t it be great if you could have a logic AI tool sort through all the stuff in your head? It could be the ultimate personal productivity tool,” said Mary Lou Jepsen, a computer scientist and founder of Openwater, a startup focused on noninvasive internal body imaging. “By understanding how our brain works, we are going to be able to solve a lot of problems. We are going to be able to turn ourselves into something better.”