In the nineteen-sixties, Joseph Weizenbaum, a computer scientist at M.I.T., created a computer program called Eliza. It was designed to simulate Rogerian therapy, in which the patient directs the conversation and the therapist often repeats her language back to her.
Weizenbaum made Eliza as satire. He doubted that computers could simulate meaningful human interaction. He was alarmed, therefore, when many people who tried the program found it both useful and captivating. His own secretary asked him to leave the room so that she could spend time alone with Eliza. Worse, doctors saw it as a potentially transformative tool. “Several hundred patients an hour could be handled by a computer system designed for this purpose,” three psychiatrists wrote in The Journal of Nervous and Mental Disease, in 1966. “The human therapist, involved in the design and operation of this system, would not be replaced, but would become a much more efficient man since his efforts would no longer be limited to the one-to-one patient-therapist ratio as now exists.”
Weizenbaum became an outspoken critic of artificial intelligence. “But the genie was out of the bottle,” Brian Christian, who chronicled the episode in his book “The Most Human Human,” told me. A few years later, a Stanford psychiatrist named Kenneth Colby created Parry, a program that attempted to simulate the language of a person with paranoid schizophrenia, to train students before they cared for real patients. Psychiatrists given transcripts of therapy sessions often couldn’t tell the difference between Parry and humans; in this narrow sense, the chatbot passed the Turing test. In 1972, Parry and Eliza met up for a therapy session.
Over time, programmers developed Jabberwacky, Dr. Sbaitso, and Alice (the Artificial Linguistic Internet Computer Entity). Exchanges with these chatbots were often engaging, sometimes comical, and occasionally nonsensical. But the idea that computers could serve as human confidants, expanding therapy’s reach beyond the limits of its overworked practitioners, persisted through the decades.
In 2017, Alison Darcy, a clinical research psychologist at Stanford, founded Woebot, a company that provides automated mental-health support through a smartphone app. Its approach is based on cognitive behavioral therapy, or C.B.T.—a treatment that aims to change patterns in people’s thinking. The app uses a form of artificial intelligence called natural language processing to interpret what users say, guiding them through sequences of pre-written responses that spur them to consider how their minds could work differently. When Darcy was in graduate school, she treated dozens of hospitalized patients using C.B.T.; many experienced striking improvements but relapsed after they left the hospital. C.B.T. is “best done in small quantities over and over and over again,” she told me. In the analog world, that sort of consistent, ongoing care is hard to find: more than half of U.S. counties don’t have a single psychiatrist, and, last year, a survey conducted by the American Psychological Association found that sixty per cent of mental-health practitioners don’t have openings for new patients. “No therapist can be there with you all day, every day,” Darcy said. Although the company employs only about a hundred people, it has counselled nearly a million and a half people, the majority of whom live in areas with a shortage of mental-health providers.
Maria, a hospice nurse who lives near Milwaukee with her husband and two teen-age children, might be a typical Woebot user. She has long struggled with anxiety and depression, but had not sought help before. “I had a lot of denial,” she told me. This changed during the pandemic, when her daughter started showing signs of depression, too. Maria took her to see a psychologist, and committed to prioritizing her own mental health. At first, she was skeptical about the idea of conversing with an app—as a caregiver, she felt strongly that human connection was essential for healing. Still, after a challenging visit with a patient, when she couldn’t stop thinking about what she might have done differently, she texted Woebot. “It sounds like you might be ruminating,” Woebot told her. It defined the concept: rumination means circling back to the same negative thoughts over and over. “Does that sound right?” it asked. “Would you like to try a breathing technique?”