This is a cache of https://www.theguardian.com/technology/2024/dec/03/the-chatgpt-secret-is-that-text-message-from-your-friend-your-lover-or-a-robot. It is a snapshot of the page at 2024-12-04T01:20:17.715+0000.
The ChatGPT secret: is that text message from your friend, your lover – or a robot? | ChatGPT | The Guardian
Illustration of a man talking to a robot via text prompt
Composite: Guardian Design; Yaroslav Kushta; shapecharge/Getty Images

The ChatGPT secret: is that text message from your friend, your lover – or a robot?

People are turning to chatbots to solve all their life problems – and they like the answers they get. But are they on a very slippery slope?

When Tim first tried ChatGPT, he wasn’t very impressed. He had a play around, but ended up cancelling his subscription. Then he started having marriage troubles. Seeking to alleviate his soul-searching and sleepless nights, he took up journalling and found it beneficial. From there, it was a small step to unburdening himself to the chatbot, he says: “ChatGPT is the perfect journal – because it will talk back.”

Tim started telling the platform about himself, his wife, Jill, and their recurring conflicts. They have been married for nearly 20 years, but still struggle to communicate; during arguments, Tim wants to talk things through, while Jill seeks space. ChatGPT has helped him to understand their differences and manage his own emotional responses, Tim says. He likens it to a friend “who can help translate from ‘husband’ to ‘wife’ and back, and tell me if I’m being reasonable”.

He uses the platform to draft loving texts to send to Jill, calm down after an argument and even role-play difficult conversations, prompting it to stand in for himself or Jill, so that he might respond better in the moment. Jill is aware that he uses ChatGPT for personal development, he says – if perhaps not to what extent. “But she’s noticed a big change in how I show up in the relationship.”

When the free-to-use chatbot was launched in November 2022, it became the fastest-growing platform in history, amassing one million users in five days. Two years later, ChatGPT is not only more powerful but increasingly commonplace. According to its developer, OpenAI, more than 200 million people are using it weekly – and not just for work.

ChatGPT is gaining popularity as a personal cheerleader, life coach and even pocket therapist. The singer Lily Allen recently said on her podcast that she uses ChatGPT to mediate text arguments with her husband, using prompts such as “add in a bit about how I think this is all actually to do with his mum”. The novelist Andrew O’Hagan said he uses another chatbot to turn people down, calling it his “new best friend”.

It shows how – steadily, but subtly – generative AI is making inroads into our personal and professional lives. “It’s everywhere, and it’s happened so quickly. We really don’t have any way of addressing or understanding it yet,” says Ella Hafermalz, an associate professor of work and technology at Vrije Universiteit Amsterdam.

In a recent study, Hafermalz and her colleagues found that in the workplace, people are increasingly turning to ChatGPT with their questions rather than asking their colleagues or manager, which can cause problems with organisational effectiveness and personal relations. “The technology is bad enough at the moment that people are getting burnt … but it is seductive,” she says.

After interviewing 50 early adopters of ChatGPT, the researchers found that people were driven to explore “out of curiosity, anxiety or both”. From tinkering around with the platform with “dumb stuff”, it was often a rapid progression to integrated daily use, says Hafermalz. “We’re seeing it in all sorts of different contexts.” She uses ChatGPT herself to proofread her writing, express herself in Dutch (her second language) and even generate bedtime stories for her children. The technology isn’t inherently negative, she says – but it does pose challenges that will be more destabilising if we don’t engage with them. Right now, she says, “people are at vastly different levels with Gen AI”, driven by private use. You don’t have to be using ChatGPT yourself to be interacting with its output.

‘It was reassuring to me that my gut response was the chatbot’s gut response.’ Photograph: Ivan Pantic/Getty Images (posed by model)

Yvette works in the charity sector, and started using ChatGPT to refine funding applications. “I don’t use it to write the whole thing, because it comes off as completely disingenuous,” she says. But she has also used it in a personal context: “My ex is not a nice person, not easy to deal with.” She does her best to keep the peace for the sake of their child, but when she received a letter informing her that he would no longer be paying child maintenance, she was furious. “I thought, ‘I’m going to have to stand up for this – it’s not right.’”

In the past, she might have spent hours crafting a text that was assertive but not emotional; this time, Yvette let loose to ChatGPT. “I ranted away, and said all of the horrible things I wanted to say … and then ChatGPT spat out this much more balanced viewpoint.”

The exercise was “quite therapeutic”, Yvette says – and nowhere near as emotionally taxing as writing the message herself. She sent ChatGPT’s suggestion to her ex-partner unchanged. “It was a bit Americanised, but I didn’t really care.” He responded with a “nasty message”, but Yvette found that she was able to resist engaging. Her ex eventually agreed to continue paying support.

The chatbot-middleman took the heat out of the interaction, enabling her to present a “better version” of herself, Yvette says. She has since used ChatGPT for support with troubles her child is having at school. “I know that it’s not going to be perfect, but even then it came back with practical tips.” Some she had already thought of, but she appreciated the validation. “It was reassuring to me that my gut response was its gut response.”

For Tim, ChatGPT has played a more active role – as “a teacher of emotional intelligence”. Since the platform introduced its “memory” function in February, it can now draw on everything that Tim has inputted about Jill and their relationship to give more personalised responses. When he asked ChatGPT to describe their individual “psychological blindspots”, it produced a lengthy list of what he saw as “99% of the drivers of conflict” in their marriage. “It nailed me perfectly,” he says.

While Tim is aware that the chatbot only gets “one side of the story”, he says it has made him a better partner, shielding Jill from his spirals. “If I get really anxious, like ‘What’s she thinking?’, I can go to ChatGPT and it says: ‘She’s doing this because of this’ … That’s the perfect thing: ChatGPT can do emotional labour all day long.”

The interactive, responsive element has enlarged his understanding of empathy, Tim adds. “Before, my version was just to imagine me, in her position … Now I’ve got a much bigger respect for emotionality.”

Previously, when Tim sought advice online, he was directed to hyper-masculine, even toxic resources. “It does sound a little bit bad to say, ‘As a man, ChatGPT helps me understand women.’ But when you think that it’s trained on everything, and so many books written by women … It has no gender; it’s all of humanity.”

Indeed, ChatGPT now knows enough about Jill to anticipate her response, Tim says. “Sometimes, if I’m going to send her a message, I’ll ask ChatGPT: ‘Given what you know about my wife, how will she interpret this?’” The chatbot might suggest a different text, which Tim always revises, but he acknowledges that on occasion, ChatGPT’s feedback has “really saved my skin”.

Tim is not in therapy; Jill doesn’t want to go together, and he’s put off by the cost. Barriers to professional help are one reason for ChatGPT’s mounting popularity as an emotional support tool. It is being used for reflective journalling, dream analysis and exercises in different therapeutic schools; there are even dedicated (and unauthorised) relationship chatbots advising in the manner of the celebrity therapist Esther Perel.

But it’s not just an accessible alternative – ChatGPT is starting to encroach on actual therapy, says the therapist Susie Masterson. “At first I felt quite affronted – like, ‘Oh no, are we going to be replaced?’” Masterson says. But having a background in tech, she has been able to accommodate clients’ enthusiasm for ChatGPT in her practice. Sometimes they bring their transcripts for discussion, or she suggests areas for research.

ChatGPT can help with reframing thoughts and situations, similar to cognitive behavioural therapy – but “some clients can start to use it as a substitute for therapy”, Masterson says. “I’ve had clients telling me they’ve already processed on their own, because of what they’ve read – it’s incredibly dangerous.” She has had to ask some clients to cease their self-experiments while in treatment with her. “It’s about you and me in the room,” she says. “You just cannot have that with text – let alone a conglomeration of lots of other people’s texts.”

Self-directed chatbot therapy also risks being counterproductive, shrinking the area of inquiry. “It’s quite affirmative; I challenge clients,” says Masterson. ChatGPT could actually cement patterns as it draws, over and again, from the same database: “The more you try to refine it, the more refined the message becomes.”

Tim found this himself. At his peak, he was spending two to three hours on ChatGPT daily, causing the chatbot to repeat itself. “I did get a little too obsessed with it,” he says. “You can start overanalysing yourself – and it’s really easy to overanalyse your wife.”

“Sometimes I find the validation is almost too much.” Photograph: filadendron/Getty Images (posed by model)

For others, however, ChatGPT’s insights are transformative and lasting. Liam found that even six years after his father died, he still felt stuck in grief. “My dad’s birthday would come around, and Father’s Day, and I’d have all these emotions swell,” he says. Having used ChatGPT as a research tool through his master’s degree, Liam began exploring it as a means of therapeutic support, telling it “like I was talking to a person” about his painful mixed feelings of resentment and loss.

Liam has been in therapy for five years, and says ChatGPT is in no way a replacement – but he was still “shocked and amazed” by the chatbot’s nuanced replies. “It validated and reflected an emotional response that was appropriate for the context, so it made me feel very safe.” Afterwards, he felt as though some internal block had dissolved: “I didn’t feel that same emotional volatility.” The experience was “deeply moving” – but, Liam adds, ChatGPT was just one strand of his processing. “Sometimes I find the validation is almost too much.” Some experimental interactions left him feeling a “bit wigged out”.

Young or isolated people may be at particular risk, however. Earlier this year, an American teenager killed himself after becoming emotionally attached to his Character.AI chatbot; his mother is now suing the company, alleging that the chatbot encouraged her son’s suicidal ideation.

As much as AI presents a way to augment our knowledge and understanding, there is a danger of dependency, says Masterson. “Everything that we do in terms of outsourcing our emotions means we’re missing an opportunity to connect with ourselves – and if we cannot connect with ourselves, how the heck do we expect to connect with someone else?”

Using ChatGPT to role-play or mediate challenging conversations may reflect fear of emotional exposure, or pressure to always be word-perfect. “To err is human. Every relationship will involve a rupture, but it’s the repair that’s important,” Masterson says. If we seek to dodge both, by “using somebody else’s platitudes, then we’re missing out on the beauty of life”.

The increasing use of AI is also causing people to second-guess their interactions, “creating a climate of suspicion”, says Rua M Williams, an assistant professor at Purdue University in Indiana, US. Last year a colleague accused Williams of having used AI in an email, pointing to its lack of “warmth”. Williams replied: “It’s not an AI. I’m just autistic.” They felt bewildered, not offended, Williams says – but it illustrates the vigilance accompanying the rise of AI. Williams’ professional writing has also been flagged. “People are looking for signs … but what they are noticing is the kinds of awkwardness or idiosyncrasies common in neurodivergent expression and English as a second language.”

These looming “side-effects” of ChatGPT are worsened by its siloed use, says Prof Hafermalz. “As this becomes very intertwined with the way people work, there’s less and less need for them to look towards other people.” For organisations, it presents existential challenges, reducing colleagues’ opportunities to collaborate and learn from one another – and managers’ ability to improve organisational functioning. Many of her interviewees were reluctant to be upfront about their ChatGPT use, concerned it would be seen as “cheating” or unprofessional – while also noting the undeniable benefit of being able “to do their work faster”.

What’s needed is open discussion about workplace use of AI and how to harness it, before it becomes too difficult to control, Hafermalz says. “The ripple effects are just getting started, and I think that keeping it covert is a surefire way for those to be more unpredictable and problematic.”

The increasing personal use of ChatGPT is harder to detect, let alone put parameters around. Having spent hours a day on ChatGPT, Tim is now down to 15 minutes, treating it as a sounding board rather than an authority on his relationship. Many questions he took to ChatGPT “probably could have been solved with a good friend group”, he says – but he links his previous compulsive prompting to social isolation. His own ties had been weakened by an international move and midlife drift. “It’s kind of sad, with this loneliness epidemic – we’re all having to get therapy from a robot.”

It could even be creating unrealistic expectations, Tim suggests: modelling “the perfect partner” and affirming people’s least charitable views of their real-life spouse. “It is a little bit dangerous, because it’s sort of half-baked – it seems like it could be so much more beneficial than it maybe is.” He recalls, at the peak of his anxious use, asking ChatGPT if he was using it too much. “It said: ‘Yeah – maybe get a therapist.’”

All case studies’ names have been changed
