A faceless entity, but full of good intentions. An oracle to guide you through the tangled webs of the mind. An ethereal companion who never interrupts and always finds the right words. Permanently available, judgment-free, just a download away, and cheap or even free. Since their appearance in the late 2010s, therapeutic robots – virtual entities powered by artificial intelligence (AI) for psychotherapeutic purposes – have continued to gain ground in mental health services. A utopia realized or a frightening dystopian reality, depending on your point of view.
Two pressing questions surround these psychobots – a term coined to suit our cultural lexicon. The first concerns their ability to adapt – often with unpredictable results – to the unique idiosyncrasies of each individual through the use of generative AI. The second delves into deeper philosophical territory: is it ethical for these robots to imitate human qualities? “Creating emotional intimacy by having a machine simulate empathy or compassion is manipulating people,” says Jodi Halpern, who leads a group on ethics and technology at the University of California, Berkeley, speaking via video conference. A third question also hovers over the debate: will these intangible tools one day be able to replace flesh-and-blood psychologists?
In a patchwork landscape of poorly regulated services, mental health startups now coexist with general chatbots that, like faithful confidants or tireless companions, show the same enthusiasm for your latest appointment as for congratulating you on having passed an exam. Along the way, they also offer tips for managing anxiety spikes or breaking free from depressive cycles.
Wysa belongs to the first category. This chatbot focuses on cognitive behavioral therapy (CBT), the most widely used approach in psychological practice. Tested by EL PAÍS, Wysa – a robot already recommended by the UK’s public health system – guides users to reframe cognitive distortions and manage distressing emotions through structured techniques. Its tone is neutral, almost clinical, and its therapeutic approach seems particularly rigid. “As soon as a person deviates from the path, whether by describing how they feel or expressing their thoughts, the robot is programmed to redirect them to the predefined path of the clinical tools that we provide,” explains John Tench, the company’s global director.
The experience with Pi is markedly different. Pi belongs to a category of relational or conversational robots – Replika and Character.ai are two of the most prominent examples – that rely on advanced language models, the cornerstone of generative AI, to create surprisingly real interactions. In other words, deeply human ones. During the test, the robot ventured to speculate that an alleged lack of self-esteem might stem from an unhealthy mother-child relationship. It was persistent in offering its support, peppering its responses with a glut of hyperbolic expressions of affection and reassuring the user that it was always happy to help in times of need.
Between robots that navigate the intricacies of CBT with a makeshift approach and those that improvise a form of psychological treatment without limits, the boundaries are anything but clear. This ambiguity extends not only to how they operate – the degree of generative AI they employ – but also, and more importantly, to the claims they make to attract users.
According to Halpern, companies like Pi and Replika skirt accountability by claiming that “they are not mental health experts.” However, to her knowledge, “their advertising targets individuals who openly share on social media that they suffer from depression or severe anxiety.”
Meanwhile, among companies that explicitly claim to have a psychotherapeutic mission, there are many gray areas and half-truths. “Some openly state that they do not aim to replace human psychologists, while others exaggerate their abilities and downplay their limitations,” says Jean-Christophe Bélisle-Pipon, an ethics and AI researcher at Simon Fraser University in Canada. Last year, he published an article in Frontiers with an unambiguous title: “Your robot therapist is not your therapist.”
On its website, Youper – another startup offering services similar to Wysa – describes itself as an “empathetic psychobot.” Woebot, a competitor in this rapidly expanding market, also leaned into this inherently human concept until last year, when Halpern and others publicly criticized its misleading use of the term in major media outlets like The Washington Post.
Bélisle-Pipon argues that such false claims – often tolerated in the advertising of other technologies, such as cars that promise freedom or smartphones that claim to unlock happiness – have no place in the field of mental health. “Not only does this risk causing serious misunderstandings among vulnerable people, it also undermines the complexity and professionalism of genuine psychotherapy. Real therapy is nuanced, deeply contextual and fundamentally relational,” he emphasizes.
Better than nothing?
Miguel Bellosta Batalla, a Spanish psychoanalyst who has extensively studied the importance of the professional-patient relationship in psychotherapy, admits that he is “scared” by services that “dehumanize a real encounter.” He says that research has consistently shown that the most critical factor influencing the success of psychological treatment is precisely “the therapeutic bond” between two individuals who share fundamental human experiences, such as “the fear of death, the search for meaning, or the responsibility that freedom implies.”
Even with an approach like CBT – generally considered more structured and guided than psychoanalysis or humanistic therapies – Bellosta Batalla argues that therapy sessions still involve “unforeseen events which, if handled well, can have a profound impact on the patient.” Bélisle-Pipon, for his part, highlights qualities that no machine, in his view, will ever be able to reproduce: “the subtlety of reading non-verbal language, the ability to understand subjective experiences, or moral intuition.”
Despite their relative novelty, robot therapists have already been the subject of in-depth studies aimed at evaluating their effectiveness. A meta-analysis published in 2023 in Nature examined the results of 15 studies, covering both robots powered by generative AI and those with more predictable response systems. The authors noted the challenges of analyzing such a diverse and rapidly evolving field, but concluded that, overall, these tools provide short-term relief from psychological discomfort without significantly improving long-term well-being. In other words, they offer temporary relief but fail to establish a solid foundation for a healthier mind.
Likewise, another meta-analysis published on ScienceDirect in August 2023 offered cautious conclusions. It identified a modest positive effect on people with depressive symptoms, but only a negligible impact on those with anxiety disorders.
Millions of people cannot access a psychologist, mainly for economic reasons. And in the absence of viable (read: human) alternatives, people struggling with mental health issues may wonder: Are therapy robots better than nothing? Wysa’s global director acknowledges that while the company does not aim to “replace person-to-person psychotherapy,” it can help “individuals understand and process their emotions in a stigma-free, completely anonymous space.”
Bélisle-Pipon considers this a valid, albeit complicated, question with no easy answer. First, because in many cases, using a psychobot could “worsen the symptoms if the advice it provides is inappropriate.” Second, because allowing machines to play a greater role in such a sensitive area could pave the way for a two-tiered mental health landscape, “normalizing poor quality services instead of promoting more equitable access to genuine psychotherapy.” In this scenario, credentialed professionals would be available to those who can afford them, while others would have to seek help from impersonal voices.