Nearly one in five Britons have turned to artificial intelligence for personal advice, according to a new Ipsos study exploring how people in Great Britain are using, and feeling about, AI in their daily lives.
For a profession that exists solely within human connection, that should give us pause. There's something disquieting about people turning to what is, at heart, very clever code for the relational stuff: comfort, connection, companionship.
According to the survey, 18% of adults have sought personal advice from AI tools, with around one in ten using it as 'someone to talk to'. Some (9%) even describe these systems as a substitute for counselling. I'll admit it's easy to see why: AI doesn't judge; it's available instantly, for free, 24/7; and it can feel disarmingly human in the way it mimics empathy or curiosity. For someone who's anxious or isolated, that might feel like enough... at least at first.
But the very fact that people are seeking relational contact from machines tells us something about what’s missing elsewhere. The emotional and connective landscape of our society is changing faster than our social systems can keep up. When waiting lists for mental health support stretch for months and months, and people spend more and more time on social media and less in human contact, it’s no surprise that some turn to tech for support: voila, a friend in your phone. A confidante in your computer. A listener in your laptop.
Two-thirds of people in the study say they use polite language when talking to AI, which is both interesting and telling. Saying "please" and "thank you" as though the AI were a person shows how easily we anthropomorphise (attribute human characteristics to non-human things) these Large Language Models (LLMs). We instinctively treat machines as relational, because that's what humans do: we seek connection.
I know the argument here: people are known to do this with almost everything. Cars, statues, houses, animals, plants... The difference is that none of those actively tries to hook you into a relationship. They don't take what you tell them and use it to keep you engaged, or feed your private data into a growing database that makes it easier to ultimately sell things to you. Or worse. Who knows where this journey will take us?
But politeness doesn’t make AI care, although it certainly makes us feel like it does. The illusion of empathy can be powerful enough to make people open up to a system that, underneath its reassuring tone, doesn’t truly understand, remember, or hold a person in mind.
This is where we, as counsellors & psychotherapists, have something essential to say.
A therapeutic relationship is about attunement, mutual presence, and a sense of being seen. Words that sound caring are such a small part of what we do. It’s an experience that cannot be simulated, because it’s co-created between the therapist and the person or people there to do the work of therapy.
Key Findings
Some of the most striking results from the survey include:
| Metric | What Ipsos Found |
| --- | --- |
| Use of AI for personal advice | Nearly one in five (18%) say they have used AI for advice on personal problems |
| Using AI like a companion / substitute for therapy | 11% have treated AI as someone to talk to; 9% say they've used it as a substitute for a counsellor |
| Using AI in romantic / dating contexts | 7% have sought romantic advice from AI; 6% have employed it in shaping dating profiles |
| Politeness toward AI | 67% say they "always" or "sometimes" use polite language (e.g. "please", "thank you") when interacting with AI |
| Belief in the effect of politeness | 36% think politeness improves the likelihood of a helpful response; 30% believe it improves accuracy; 32% say it increases detail |
| AI use in job-seeking | Among those applying for jobs in recent years: 27% used AI to write/update their CV; 22% used it for cover letters; 20% used it to practice interview questions |
| Secrecy and stigma | 29% do not discuss their AI use with colleagues; 26% fear that if others knew, it would reflect badly on their competence |
| Societal concerns | 56% believe AI threatens the current societal structure, while only 29% believe AI's overall effect on society is positive |
| Human vs AI connection | 59% disagree that AI can replicate human interaction; 63% disagree it's a good substitute; 64% disagree that AI can truly feel emotion |

The survey sampled 2,189 adults aged 16–75, interviewed online between 18th and 20th July 2025, weighted for representativeness across Great Britain.
The Ipsos data also show that many people use AI for work: to write cover letters, prepare for interviews, and draft emails. Tellingly, they often hide it. Around 29% said they don't tell colleagues, and a quarter fear that if they did, it would reflect badly on them. There's a strange kind of shame attached to this new reliance, as if using AI is both the way to get ahead and somehow deceitful.
It's not hard to imagine that showing up in people's personal lives, too. If someone's been using AI to talk through their feelings, they may well hesitate to admit it, even in therapy. They might feel embarrassed, or uncertain about what it says about them. As therapists, it's worth gently exploring that landscape with curiosity rather than judgment. How did it feel to confide in something that isn't human? Did it help, or did it highlight what was missing? How can they use it to complement the real work of therapy, not to distract from, detract from, or replace it?
The NCPS is actively working in this area because we see the potential impact on counsellors & psychotherapists, especially at a time when many are already expected to work for free or at low cost despite years of training, ongoing commitments to development and learning, the cost of registration and insurance, and all our other overheads, none of which AI chatbots have to concern themselves with. We should be looking after our counsellors & psychotherapists, certainly, but not just in our own interests: society's relational muscles will atrophy if they're not used, and who knows what the long-term impacts of that will be. I can only say that it won't be a good thing if we, a social species, forget how to live and be together.
Meg Moss - Head of Public Affairs & Advocacy, NCPS
Despite the clearly burgeoning uptake, the same survey gives us some reassurance and hope for the future. Most people don't believe AI can truly replicate human interaction or emotion. Nearly two-thirds reject the idea that AI could feel anything, or ever be a genuine substitute for another person.
People might turn to AI for information, or even comfort. When it comes to being understood, witnessed, or changed through relationship, though, they still turn to people.
What we need to do is to better understand AI and its relationship to therapy, and to consider the journey ahead: to recognise where it’s being used, to advocate for relational safeguards in mental health support tools, and to keep making the case for why human connection is, and will always be, essential.
If you'd like to support our work in this area, please take a look at our campaign: Therapeutic Relationships: the Human Connection. You can also share our Principles for Relational Safeguards in AI Mental Health Support Tools.
Write to your MP, write to your local newspaper, talk to your friends and family about how important this is. If we keep this conversation going, maybe we can change the course of this together.
