UK regulator probes Telegram over child sexual abuse concerns
CNA
LONDON, April 21: Britain's communications regulator, Ofcom, launched an investigation on Tuesday into the Telegram messaging app after evidence suggested child sexual abuse material was being shared on the platform.
The probe is part of UK efforts to crack down on children being exposed to harm online without clear accountability. While the country's 2023 Online Safety Act has set tougher standards for social media platforms such as Facebook, YouTube and TikTok, Prime Minister Keir Starmer wants them to go further.
The government has been consulting on a potential social media ban for children under 16, and Starmer met social media company executives last week to ask them to take more responsibility.
Ofcom said it had received evidence from the Canadian Centre for Child Protection regarding the alleged sharing of child sexual abuse material on Telegram, and had carried out its own assessment of the platform.
"In light of this, we have decided to open an investigation to examine whether Telegram has failed, or is failing, to comply with its duties in relation to illegal content," Ofcom said in a statement.
Telegram said it "categorically" denied Ofcom's accusations, adding that since 2018 it had "virtually eliminated" the public spread of child sexual abuse material on its platform through detection algorithms.
"We are surprised by this investigation and concerned that it may be part of a broader attack on online platforms that defend freedom of speech and the right to privacy," the Dubai-based company said in a statement.
Telegram was fined in February by Australia's online safety regulator for delaying answering questions about measures taken to prevent the spread of child abuse and violent extremist material.
'NOT ENOUGH' ACTION TO PREVENT HARM
The Internet Watch Foundation, a British nonprofit that has been working with Telegram to help the company identify and remove harmful material, said there was more to do.
"We share concerns that bad actor networks are operating across Telegram’s ecosystem, and that not enough is being done to prevent known, detected, child sexual abuse imagery from being distributed," it said in a statement.
Ofcom said on Tuesday it had also opened investigations into Teen Chat and Chat Avenue to examine whether they were meeting their duties to protect children from the risk of being groomed by predators.
Teen Chat and Chat Avenue did not respond to Reuters requests for comment.
Ofcom said that after engagement with the companies, it remained unsatisfied as to whether they were providing adequate protection to British children from the risk of grooming.
"These firms must do more to protect children, or face serious consequences under the Online Safety Act," Suzanne Cater, director of enforcement at Ofcom, said. Last October, Ofcom fined U.S. internet forum site 4chan 20,000 pounds ($27,020) for failures under the new rules.
($1 = 0.7402 pounds)