“I understand that you’re experiencing a relationship problem, is that right?”

Yes.

“OK, I’m delighted that you want to work on this, K. I’ve good and bad news for you. Which do you want to hear first?”

Good news.

“The good news is, I have very powerful skills that I can teach you! These skills have really helped a lot of people build better relationships.”

And the bad news?

“The bad news is that you would have to be willing to accept 100% of the responsibility for changing the relationship.”

Why?

“Because you’re only ever going to be able to have control over your part of it.”

This conversation may sound like a fairly typical therapist’s session, but in this case, it was directed by Woebot, a therapeutic chatbot, with the psychiatrist’s couch swapped for a smartphone screen.

The app presents itself as an automated therapist for moments when finding a real one can feel like a logistical and financial impossibility. At the same time, the need for therapists is only growing.

During the pandemic, about 4 in 10 adults in the United States reported that they had symptoms of anxiety or depression, according to the Kaiser Family Foundation. Meanwhile, the federal government warns of a critical shortage of therapists and psychiatrists. According to advocacy group Mental Health America, almost 60% of those with mental illness last year did not get treatment.

Mental health resources

If you or someone you know needs support for mental health, here’s where to find help.

Crisis Connections: Covers King County and surrounding areas with five programs focused on serving the emotional and physical needs of people across Washington state. Call 206-461-3222.

Washington 211: Free referral and informational help line that connects people to health and human services, available 24/7. Call 211.

Washington Recovery Helpline: 24-hour crisis-intervention and referral assistance for substance abuse, mental health and gambling. Call 866-789-1511.

National Suicide Prevention Lifeline: National network of local crisis centers that provides free and confidential emotional support to people in suicidal crisis or emotional distress 24 hours a day, seven days a week. Call 800-273-8255.

National Alliance on Mental Illness: The nation’s largest grassroots mental-health organization dedicated to building better lives for the millions of Americans affected by mental illness.

Mental Health America: Nonprofit dedicated to addressing the needs of those living with mental illness and to promoting the overall mental health of all Americans.

Here’s where to find diverse mental health resources in Seattle.

Woebot Health says the pandemic has driven up demand for its services. The number of its daily users doubled and is now in the tens of thousands, said company founder and president Alison Darcy, who is a psychologist.

Digital mental health has become a multibillion-dollar industry and includes more than 10,000 apps, according to an estimate by the American Psychiatric Association. The apps range from guided meditation (Headspace) and mood tracking (MoodKit) to text therapy by licensed counselors (Talkspace and BetterHelp).

But Woebot, introduced in 2017, is one of only a handful of apps that use artificial intelligence to deploy the principles of cognitive behavioral therapy, a common technique used to treat anxiety and depression. Woebot aims to use natural language processing and learned responses to mimic conversation, remember past sessions and deliver advice around sleep, worry and stress.

“If we can deliver some of the things that the human can deliver,” Darcy said, “then we actually can create something that’s truly scalable, that has the capability to reduce the incidence of suffering in the population.”

Almost all psychologists and academics agree with Darcy on the problem: There is not enough affordable mental health care for everyone who needs it. But they are divided on her solution: Some say bot therapy can work under the right conditions, while others consider the very concept paradoxical and ineffective.

At issue is the nature of therapy itself. Can therapy by bot make people understand themselves better? Can it change long-held patterns of behavior through a series of probing questions and reflective exercises? Or is human connection essential to that endeavor?

Hannah Zeavin, author of the forthcoming book “The Distance Cure: A History of Teletherapy,” says the health care system is so broken that “it makes sense that there’s space for disruption.”

But, she added, not all disruption is equal. She calls automated therapy a “fantasy” that is more focused on accessibility and fun than actually helping people get better over the long term.

“We are an extraordinarily confessing animal; we will confess to a bot,” she said. “But is confession the equivalent of mental health care?”

Eli turns to Woebot

Eli Spector seemed like the perfect client for therapy via artificial intelligence, or AI.

In 2019, Spector was a 24-year-old college graduate working in a neuroscience lab in Philadelphia. Having grown up with an academic father who specialized in artificial intelligence, Spector considered himself something of a technologist.

But Spector’s job was isolating and tedious, and after four stimulating years in academia, he felt bored and lonely. He couldn’t sleep well and found that his moods were consistently dark.

“I was just having a really hard time adjusting and I didn’t have any co-workers I liked,” he said. “It was just a tough period for me.”

But he wasn’t sure he wanted to bare his soul to a real person; he didn’t want to worry about anyone’s judgment or try to fit around someone else’s schedule.

Besides, he didn’t think he could find a therapist he could afford on his parents’ insurance plan; sessions could run from $100 to $200. And Woebot was free and on his phone.

“Woebot seemed like this very low-friction way to see, you know, if this could help.”

Therapy by algorithm

Woebot’s use of cognitive behavioral therapy, or CBT, has a philosophical and practical logic to it. Unlike forms of psychotherapy that probe the root causes of psychological problems, often going back to childhood, CBT seeks to help people identify their distorted ways of thinking and understand how that affects their behavior in negative ways. By changing these self-defeating patterns, therapists hope to improve symptoms of depression and anxiety.

Because cognitive behavioral therapy is structured and skill-oriented, many mental health experts think it can be employed, at least in part, by algorithm.

“You can deliver it pretty readily in a digital framework, help people grasp these concepts and practice the exercises that help them think in a more rational manner,” said Jesse Wright, a psychiatrist who studies digital forms of CBT and is director of the University of Louisville Depression Center. “Whereas trying to put something like psychoanalysis into a digital format would seem pretty formidable.”

Wright said several dozen studies had shown that computer algorithms could take someone through a standard CBT process, step by step, and get results similar to in-person therapy. Those programs generally follow a set number of sessions and require some guidance from a human clinician.

But most smartphone apps don’t work that way, he said. People tend to use therapy apps in short, fragmented spurts, without clinician oversight. Outside of company-sponsored research, Wright said he knew of no rigorous studies of that model.

And some automated conversations can be clunky and frustrating when the bot fails to pick up on the user’s exact meaning. Wright said AI is not advanced enough to reliably duplicate a natural conversation.

“The chances of a bot being as wise, sympathetic, empathic, knowing, creative and being able to say the right thing at the right time as a human therapist is pretty slim,” he said. “There’s a limit to what they can do, a real limit.”

John Torous, director of digital psychiatry for Beth Israel Deaconess Medical Center in Boston, said therapeutic bots might be promising, but he is worried they are being rolled out too soon, before the technology has caught up to the psychiatry.

“If you deliver CBT in these bite-size parts, how much exposure to bite-size parts equals the original?” he said. “We don’t have a good way to predict who’s going to respond to them or not — or who it’s good or bad for.” These new apps, Torous said, risk setting back other advances in digital mental health: “Do we in part end up losing trust and credibility because we’re promising what is not yet possible by any machine or any program today?”

Other mental health professionals say therapy should simply not be delivered by machine. Effective treatment involves more than just cognitive skill-building, they say. It needs a human-to-human connection. Therapists need to hear nuances, see gestures, recognize the gap between what is said and unsaid.

“These apps really shortchange the essential ingredient that — mounds of evidence show — is what helps in therapy, which is the therapeutic relationship,” said Linda Michaels, a Chicago-based therapist who is co-chair of the Psychotherapy Action Network, a professional group.

Darcy says a well-designed bot can form an empathetic, therapeutic bond with its users, and her company recently published a study making that claim: 36,000 Woebot users responded to statements such as “I believe Woebot likes me,” “Woebot and I respect each other” and “I feel that Woebot appreciates me.”

The study’s authors — all with financial ties to the company — concluded that a significant percentage of participants perceived a “working alliance” with Woebot, a term that means the therapist and patient have formed a cooperative rapport. The study did not measure whether there actually was a working alliance.

Sherry Turkle, a clinical psychologist at the Massachusetts Institute of Technology who writes about technology and relationships, is not swayed by such evidence. For therapy to heal, she said, the therapist must have a lived experience and the ability to empathize with a patient’s pain. An app cannot do that.

“We will humanize whatever seems capable of communicating with us,” Turkle said. “You’re creating the illusion of intimacy, without the demands of a relationship. You have created a bond with something that doesn’t know it is bonding with you. It doesn’t understand a thing.”

Eli pours out his problems

Eli Spector started with Woebot in the summer of 2019.

He liked that he could open the app whenever he felt like it and pour out his thoughts of distress on his own schedule, for even a few minutes at a time. Most of the words coming out had to do with how unhappy he felt at his job.

He also took advantage of Woebot’s other features, including tracking his mood and writing in an online journal. It helped him realize how depressed he really was.

But he had doubts about the algorithm. The bot’s advice often felt generic, like a collection of “mindfulness buzzwords,” he said. “Like, ‘Can you think more about that feeling, and what you could do differently?’”

Worse, the advice could be nonsensical.

“I would type in like, ‘My boss doesn’t appreciate the work I do’ and ‘I can’t seem to get her approval,’” Spector said. “And Woebot would be like, ‘That sounds difficult. Does this happen more in the morning or at night?’”

“It felt sort of silly,” Spector said.

Is it really therapy?

Much of the debate over therapeutic bots comes down to expectations. Do patients and clinicians understand the limitations of chatbots? Or are they expecting more than even the companies say they deliver?

On its website, Woebot promises to “automate both the process and content of therapy,” but Darcy is careful not to call Woebot medical treatment or even formal therapy.

Instead, she says, the bot delivers “digital therapeutics.” And Woebot’s terms of service call it a “pure self-help” program that is not meant for emergencies. In fact, in the event of a severe crisis, Woebot says it is programmed to recognize suicidal language and urge users to seek out a human alternative.

In that way, Woebot does not approach true therapy. Like many mental health apps, the current, free version of Woebot is not subject to strict oversight from the Food and Drug Administration because it falls under the category of “general wellness” product, receiving only FDA guidance.

But Woebot is striving for something more. With $22 million of venture capital in hand, Woebot is seeking clearance from the FDA to develop its algorithm to help treat two psychiatric diagnoses, postpartum depression and adolescent depression, and then sell the program to health systems.

And it is here that Woebot hopes to make money, using its practical advantage over any human therapist: scale.

While other virtual-therapy companies, such as BetterHelp or Talkspace, must keep recruiting therapists to join their platforms, AI apps can take on new users without paying for extra labor. And while therapists can vary in skills and approach, a bot is consistent and doesn’t get stressed out by back-to-back sessions.

“The assumption is always that, because it’s digital, it’ll always be limited,” Darcy said. “There’s actually some opportunities that are created by the technology itself that are really challenging for us to do in traditional treatment.”

One advantage of an artificial therapist — or, as Darcy calls it, a “relational agent” — is round-the-clock access. Very few human therapists answer their phone during a 2 a.m. panic attack, as Darcy pointed out. “I think people have probably underestimated the power of being able to engage in a therapeutic technique in the moment that you need to,” she said.

But whether Woebot can be involved in medical diagnosis or treatment is up to the FDA, which is supposed to make sure the app can back up its claims and not cause harm, an agency spokesperson said.

One possible harm, the spokesperson said, is a “missed opportunity” where someone with mental illness fails to get more effective treatment or delays treatment. “And what the consequences of those delays would look like — that’s something we’d worry about,” the spokesperson said.

AI can be problematic in other ways. For instance, Zeavin worries that racial and gender bias or privacy breaches could simply get translated into bots.

“Therapy has enough problems on its own,” Zeavin said. “And now they’ve brought all of the problems of algorithmic technology to bear.”

But even some skeptics of chatbot therapy believe it has the potential to complement the human-guided mental health system, as long as it comes with serious research.

“As the market gets saturated, the bar for evidence will get higher and higher, and that’s how people will compete,” Torous said. “So maybe we’re just in such early stages and we don’t want to punish people for being innovative and kind of trying something.”

The idea, Darcy says, is not to replace human therapists with bots; she thinks it’s important to have both. “It’s like saying if every time you’re hungry, you must go to a Michelin star restaurant, when actually a sandwich is going to be OK,” she said. “Woebot is a sandwich. A very good sandwich.”

Eli breaks up with Woebot

After about a month, Eli Spector deleted Woebot from his phone.

He was unimpressed by the bot’s advice for beating back loneliness and despair, but he is not entirely sorry that he tried it out.

The mere act of typing out his problems was helpful. And through the process, he pinpointed what he actually needed to feel better.

“So maybe this was just evidence that I needed to, like, actually address this,” he said. “It was enough to inspire me to just take the plunge and find a flesh-and-blood therapist.”

Now, Spector pays a human psychotherapist in Philadelphia $110 a session.

They’ve been meeting on Zoom since the pandemic began, so the flesh-and-blood part is somewhat theoretical. But it’s close enough.

This article originally appeared in The New York Times.