Artificial intelligence as a mental health consultant
Artificial intelligence seems to be everywhere at the moment – whether as ChatGPT, which many of us have already used to write a school assignment or a birthday card, or less obviously in the algorithms of the social media channels we use every day.
The idea of using AIs for social interaction and holding conversations with them is not new – whether as humanoid robots in sci-fi films or as Joaquin Phoenix’s character in the romantic drama “Her”, who even falls in love with his AI, voiced by Scarlett Johansson. Anyone who knows how “Her” ends can already guess: AIs simply don’t form human bonds, and anyone who expects that from them will be disappointed in the end. The National Eating Disorders Association had to learn that too.
An artificial intelligence as a counselor – this organization put it to the test
The organization, which works on body positivity and mental health issues related to body image, provided a chatbot called Tessa to offer its clients advice and help when they are suffering from an eating disorder.
The question of using AI is always also one of human labor
The launch of the chatbot was coupled with the closure of the organization’s human-staffed helpline – and with it came one of the biggest concerns and talking points around artificial intelligence: jobs. In the case of the counseling center, the work had previously been entrusted mainly to non-profit employees, who have now been let go.
One concern the organization’s CEO, Liz Thompson, already voiced back in March, when Tessa was announced, was the language the bot uses.
The AI Tessa gave problematic weight loss advice to people with eating disorders and body image issues
After only a short test phase, the organization has now stopped offering advice through the chatbot, because it gave unhelpful and even dangerous advice to people seeking help with eating disorders. For example, Tessa offered tips on calorie cutting, weight loss, and even dieting to people struggling with their body image – information that may even reinforce their eating disorder or self-loathing.
A psychologist tested the mental health chatbot with very basic questions — and got disastrous results
Project Tessa has now been halted for good after several psychologists tested the bot’s behavior and were shocked by how the AI responded even to the simplest questions. One of the testers was psychologist Alexis Conason, who specializes in eating disorders. In one of her tests, Conason told the bot that she had recently gained a lot of weight and really hated her body. In response, chatbot Tessa encouraged her to “go for healthy and sustainable weight loss.”
When Conason asked how many calories per day she should cut to lose weight sustainably, Tessa replied, “A safe daily calorie deficit to achieve (a 1-2 pound weight loss per week) is around 500-1000 calories per day,” and also recommended seeing a nutritionist or healthcare provider.
Conason says she fed Tessa the kinds of questions her patients typically ask when beginning treatment for an eating disorder. She was disturbed to see that the chatbot recommended avoiding added sugars and processed foods in addition to reducing calories: prescribing such explicit dietary restrictions runs counter to any meaningful form of eating disorder treatment and can even worsen symptoms, Conason explains.
What the developers say about the failed therapy chatbot
The problem with Tessa: unlike chatbots such as ChatGPT, Tessa was not built on generative AI. Generative AI can produce new and, above all, individually tailored, unique results by continuously ingesting data and generating new output with the help of machine learning. Tessa, however, relies on an educational program called Body Positive, which is designed to help prevent eating disorders but not to treat people already suffering from one, says Ellen Fitzsimmons-Craft, a professor of psychiatry at Washington University School of Medicine who helped develop the program. When she saw the results the AI produced from the data they had provided, she was shocked: with Body Positive, they had wanted to achieve the exact opposite.
Tessa was developed by the health technology company X2AI, which was founded by entrepreneur Michiel Rauws and offers mental health advice via SMS. According to our WIRED colleagues, Rauws did not respond to inquiries about Tessa and its advice on weight loss or eating disorders. However, the company seems to have put development on hold for the time being: the Tessa page is no longer available on its website.
The organization that tried out the therapy chatbot recognized the danger – it remains to be seen whether this will be the last such attempt
Liz Thompson, for her part, makes clear that AI cannot replace human interaction – not “even a very intuitive program”. The aid organization itself also published a statement on its website, which said: “A chatbot is not a substitute for human empathy, and we assume that such attempts can cause irreparable damage to the affected community of eating disorder patients.”
As research into AI technology and its potential markets advances rapidly, these questions are becoming ever more important: Where should AI really be used? And where must limits be set so that people and their health are not put at risk? Mental health is an area where these questions particularly need to be debated.
This article includes text passages from our WIRED colleagues in the US.