Eating disorder helpline takes down chatbot after it gave weight loss advice

AILSA CHANG, HOST:

How did a chatbot designed to help people with eating disorders end up offering advice on weight loss and dieting? Well, that is the question now that the National Eating Disorders Association has taken down this controversial chatbot just days after NPR reported on it. Michigan Radio's Kate Wells has been covering this and joins us now. Hey, Kate.

KATE WELLS, BYLINE: Hey.

CHANG: OK, so why was the National Eating Disorders Association trying to use a chatbot in the first place here?

WELLS: Yeah, the context is really important. The association is known as NEDA, and obviously it works to support patients with eating disorders. And for more than 20 years now, they have had this help line that's been really popular. It's staffed by humans, but when COVID hit, the calls and messages to the help line went way up. They got, like, 70,000 contacts just last year alone. They said the volume of these calls, the severity of these calls wasn't sustainable. And last month, they shut the help line down, and that was very controversial in itself. But this chatbot, which is called Tessa, was one of the resources NEDA was going to offer and invest in and really promote even after this help line was gone.

CHANG: OK, so what exactly went wrong with Tessa?

WELLS: Yeah, there's this consultant in the eating disorder field. Her name is Sharon Maxwell, and she hears about this a couple weeks ago. She decides she wants to go try Tessa out. She asks the chatbot, hey, Tessa. How do you support people with eating disorders? And Tessa gives her a response that's like, oh, coping mechanisms, healthy eating habits. And Maxwell starts asking it more about these healthy eating habits, and soon Tessa is telling her things that sound a lot like what she heard when she was put on Weight Watchers at age 10.

CHANG: Wow.

SHARON MAXWELL: The recommendations that Tessa gave me were that I could lose one to two pounds per week, that I should eat no more than 2,000 calories in a day, that I should have a calorie deficit of 500 to 1,000 calories per day, all of which might sound benign to the general listener. However, to an individual with an eating disorder, the focus of weight loss really fuels the eating disorder.

CHANG: Exactly. OK, so, Kate, this obviously was not what they intended for the chatbot...

WELLS: Yeah.

CHANG: ...To do. So what was the response from NEDA?

WELLS: Well, so Maxwell posts about this on Instagram, and she provides screenshots of the conversations with Tessa to NEDA. And she says within hours of that, the chatbot was taken down. NEDA told us that it's grateful to Maxwell and others for bringing this to their attention, and they're blaming the company that was operating the chatbot.

CHANG: And what did the company do to the chatbot specifically?

WELLS: So what you need to know about Tessa is that it was originally created by eating disorder experts. It was not like ChatGPT, which we hear a lot about. It couldn't just create new content on its own. One of those creators is Ellen Fitzsimmons-Craft. She's a professor at Washington University's medical school in St. Louis, and she says they intentionally kept Tessa pretty narrow because they knew that this was going to be a high-risk situation.

ELLEN FITZSIMMONS-CRAFT: By design, it couldn't go off the rails. We were very cognizant of the fact that AI isn't ready for this population, and so all of the responses were preprogrammed.

WELLS: But then at some point in the last year, the company that's operating Tessa - it's called Cass - added generative artificial intelligence, meaning it gave Tessa the ability to learn from new data and generate new responses. And the CEO of Cass told me that this is part of a systems upgrade, and he says that this change was part of its contract with NEDA. We should note that both the company and NEDA have apologized.

CHANG: OK. And we are seeing, you know, more and more of these chatbots in the mental health area. Like, there are apps you can download, companies...

WELLS: Yeah.

CHANG: ...That are promoting AI therapy. Is the takeaway here that this is just a bad idea?

WELLS: Well, you can see why AI is so tempting, right? I mean, it's convenient. It's cheaper than hiring more and more humans. But what we are seeing repeatedly is that chatbots make mistakes, and in high-risk situations, that can be harmful.

CHANG: That is Kate Wells with Michigan Radio. Thank you so much, Kate.

WELLS: Thank you.

Transcript provided by NPR, Copyright NPR.

NPR transcripts are created on a rush deadline by an NPR contractor. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR’s programming is the audio record.
