
The National Eating Disorder Helpline Replaced Its Staff With a Chatbot

(image: Samara Weaving as Laura Crane on a phone in Scream VI)

After helpline associates at the National Eating Disorders Association (NEDA) moved to unionize, the organization responded by firing its entire helpline staff and replacing them with a chatbot. NEDA’s helpline has been in service for 20 years and saw a boom in calls during the COVID-19 pandemic.


Even after the pandemic subsided, the helpline continued to see an elevated volume of calls, with an estimated 70,000 callers reaching out in the past year alone. The confidential helpline provides peer-to-peer support for those struggling with eating disorders. During the isolation of the pandemic, helplines like this were the only human support some callers had.

In contrast to that high demand, NEDA’s helpline has a very small staff. Aside from volunteers, it employs just six full-time staffers and a handful of supervisors. As demand surged, staffers began to realize that the current system wasn’t sustainable: they were often tasked with training and supervising as many as 200 volunteers at a time.

Meanwhile, not only has the number of calls increased, but so has their severity. Staffers reported a rise in “crisis-type calls” and cases of “child abuse or child neglect.” Considering that the individuals taking these calls are often volunteers rather than professionals, ongoing training and supervision are vital.

So, four of the helpline’s staffers decided to unionize to ensure that NEDA provided them with a safer and more effective work environment. NEDA’s reaction to that? A chatbot.

NEDA’s response to the helpline’s unionization

One of the staffers who initiated the unionization effort was Abbie Harper. In a blog post on Labor Notes, she laid out the helpline workers’ demands, noting, “We didn’t even ask for more money.” The union simply asked for better training programs, appropriate staffing, and opportunities for staffers to advance in their careers at NEDA. Four days after the union won an election for official recognition with the National Labor Relations Board, NEDA revealed during a virtual staff meeting that it was ending the helpline. By June 1, all of its staffers will be fired. Many volunteers will be let go, too, while others may be moved to other areas of NEDA.

In place of the helpline, NEDA is introducing a chatbot named Tessa, which it claims is not a “replacement” for the helpline but an entirely new program. Tessa isn’t even comparable to more sophisticated AI like the recently ascendant ChatGPT, which uses context to generate responses and sustain a human-like conversation.

However, a more dated, scripted chatbot like Tessa can’t generate spontaneous responses; instead, it draws from a limited set of pre-determined ones. It identifies itself as a chatbot immediately, and then might “walk a user through a specific series of therapeutic techniques about something like body image.” It is not a “listening ear” or an “open-ended tool,” and it may not have a response to every question users ask.
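To make that distinction concrete, here is a minimal, purely hypothetical sketch of how a scripted chatbot of this kind generally works (this is illustrative only, not Tessa’s actual code): it matches a message against fixed keywords, replies only from its canned script, and falls back to a stock apology when nothing matches.

```python
# A minimal, hypothetical sketch of a scripted (rule-based) chatbot --
# illustrative only, not Tessa's actual implementation.

CANNED_RESPONSES = {
    "body image": "Let's walk through a short exercise about body image.",
    "resources": "Here are some eating disorder resources that may help.",
}

FALLBACK = "I'm sorry, I don't have a response for that."


def respond(user_message: str) -> str:
    """Match the message against fixed keywords; nothing is generated."""
    text = user_message.lower()
    for keyword, reply in CANNED_RESPONSES.items():
        if keyword in text:
            return reply
    # Unlike a generative model, a scripted bot has nothing to say
    # outside its pre-determined script.
    return FALLBACK


if __name__ == "__main__":
    print("Bot: Hi, I'm a chatbot, not a human.")  # identifies itself up front
    print("Bot:", respond("I've been struggling with body image lately."))
    print("Bot:", respond("Can you just listen to me?"))  # hits the fallback
```

The fallback line is the crux of the concern raised below: when a message falls outside the script, a bot like this has nothing left to offer.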

Can AI replace the value of human empathy?

NEDA has reportedly already begun testing Tessa. Of 700 women who tested the chatbot, 375 gave the program a “100% helpful” rating; the feedback of the other 325 is not mentioned. Meanwhile, Harper has doubts about a chatbot’s ability to perform the same work she and her colleagues did. One thing they have that a chatbot doesn’t is experience: many of NEDA’s helpline staffers and volunteers have recovered from eating disorders themselves and can offer invaluable knowledge, support, and empathy to those going through the same things they did.

NEDA VP Lauren Smolar defended the decision to replace the helpline with a chatbot by citing legal liability. She explained the risks of having non-professional volunteers handle crisis calls but didn’t touch on the increased risks of having a machine potentially take them. A scripted chatbot may simply have no response for someone in crisis, while a generative AI like ChatGPT could go off the rails and spew harmful information. A chatbot might be a minor resource for people who are waitlisted for the helpline, but it can’t serve callers who are looking for a peer-to-peer conversation and are in desperate need of human support.

The rise in helpline demand clearly shows the value of human support, so it seems very strange for NEDA to respond by getting rid of its helpline entirely. This is why Harper says the move was merely “union busting” and not at all about helping individuals. Chatbots and AI can’t feel emotion, and that is where their potential to cause harm comes in. Even when callers are told they’re speaking to a machine, it may feel like being confronted with yet another voice that can’t empathize with their struggles.

(featured image: Paramount Pictures)


Author
Rachel Ulatowski
Rachel Ulatowski is an SEO writer for The Mary Sue, who frequently covers DC, Marvel, Star Wars, YA literature, celebrity news, and coming-of-age films. She has over two years of experience in the digital media and entertainment industry, and her works can also be found on Screen Rant and Tell-Tale TV. She enjoys running, reading, snarking on YouTube personalities, and working on her future novel when she's not writing professionally. You can find more of her writing on Twitter at @RachelUlatowski.

