FILE PHOTO: A message reading "AI artificial intelligence", a keyboard, and robot hands are seen in this illustration taken January 27, 2025. REUTERS/Dado Ruvic/Illustration/File Photo

Young Europeans turn to AI chatbots for emotional support, survey shows

May 5: Nearly one in two young people in Europe have used AI chatbots to discuss intimate or personal matters, as the technology increasingly serves as a source of emotional support, an Ipsos BVA survey showed on Tuesday.

Of the 3,800 people surveyed, 51 per cent said it was "easy" to discuss mental health and personal issues with a chatbot. Only 49 per cent said the same about healthcare professionals and 37 per cent about psychologists.

Those closest to respondents topped the list, with 68 per cent saying it was easy to discuss such issues with friends and 61 per cent with parents.

The survey, commissioned by France's privacy watchdog CNIL and insurer Groupe VYV, was carried out among people aged 11 to 25 across France, Germany, Sweden and Ireland in early 2026.


The findings underscore growing concerns over young people's mental health: about 28 per cent of respondents met the threshold for suspected generalized anxiety disorder, the survey found.

Around 90 per cent of those surveyed had used artificial intelligence tools before, with many citing their constant availability and non-judgmental nature. More than three in five users described AI as a "life adviser" or a "confidant".

However, concerns over the psychological impact of AI tools have also grown over the past year, and experts have warned about the limitations of AI in detecting human emotions and safely providing emotional support.

Earlier this year, the family of a Florida man sued Google, alleging its Gemini AI chatbot contributed to his paranoia and eventual suicide.

The results of the survey were not a surprise, said Ludwig Franke Föyen, a psychologist and digital health researcher at Stockholm's Karolinska Institutet.

Current large language models can produce high-quality responses, Franke Föyen told Reuters, adding that his research suggested even licensed professionals may struggle to distinguish AI-generated advice from that of human experts.

But he warned against relying on chatbots alone for mental health support, saying general-purpose AI systems were designed for engagement and companies' goals may not align with mental healthcare needs.

"AI can offer information and support, but it should not replace human relationships or professional care," Franke Föyen said.

"If someone turns to a chatbot instead of speaking to a parent, a friend, or a mental health professional, that is a concern. We do not want technology to make people feel more alone."

Source: Reuters
