Telegram to work with internet watchdog on child sexual abuse material crackdown
by Press Association · LBC

The messaging platform will use tools from the Internet Watch Foundation to help detect and remove abuse material.
Messaging platform Telegram is set to use industry-leading tools to detect child sexual abuse imagery on public parts of the platform as part of a new agreement with an online safety watchdog.
Under the agreement, Telegram will work with the Internet Watch Foundation (IWF), a leading organisation for finding and removing child sexual abuse imagery online, and will deploy the charity's detection tools.
Critics have often characterised Telegram as a lawless corner of the online world because of the encryption used in parts of the app, which means only those in a conversation can see or access private messages.
The IWF has confirmed thousands of reports of child sexual abuse imagery on Telegram since 2022 – including category A material, the most severe kind – which the platform removed after being notified.
The organisation said Telegram will now use a range of IWF services, including taking IWF hashes – unique digital fingerprints of known abuse images – to instantly spot when such material is shared in public parts of the site.
Tools to help block AI-generated abuse content will also be deployed, the IWF said, as well as those to block links to webpages known to host child sexual abuse material.
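The IWF does not publish the code behind these services, but the basic idea of hash-list and URL-list screening can be shown in a minimal, purely illustrative sketch. The Python snippet below checks an uploaded file's cryptographic hash against a set of known-image hashes and a link against a blocklist of known-bad hosts; the list contents and function names are invented for illustration, and real deployments also rely on perceptual hashing (such as PhotoDNA) so that re-encoded or slightly altered copies still match.

```python
import hashlib
from urllib.parse import urlsplit

# Hypothetical, simplified blocklists. The real IWF hash and URL lists are
# distributed to members under licence; these entries are placeholders only.
KNOWN_BAD_HASHES = {
    # sha256 of the placeholder bytes b"test", used so the demo below matches
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}
KNOWN_BAD_HOSTS = {
    "example-blocked-host.invalid",
}


def file_matches_hash_list(data: bytes) -> bool:
    """Return True if the file's hash appears on the known-image list.

    A cryptographic hash only matches byte-identical files; production
    systems also use perceptual hashes to catch re-encoded copies.
    """
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_BAD_HASHES


def url_is_blocked(url: str) -> bool:
    """Return True if the link points at a host on the URL blocklist."""
    host = urlsplit(url).hostname or ""
    return host.lower() in KNOWN_BAD_HOSTS


if __name__ == "__main__":
    print(file_matches_hash_list(b"test"))  # True: placeholder hash matches
    print(url_is_blocked("https://example-blocked-host.invalid/page"))  # True
```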
Derek Ray-Hill, interim chief executive of the IWF, said: “This is a transformational first step on a much longer journey.
“We look forward to seeing what further steps we can take together to create a world in which the spread of online sexual abuse material is virtually impossible and, when it does happen, we are able to remove it very quickly and permanently.
“Child sexual abuse imagery is a horror that blights our world wherever it exists. The children in these images and videos matter.
“I want to be able to say to every single victim that we will stop at nothing to prevent the images and videos of their suffering being spread online.
“Now, by joining the IWF, Telegram can begin deploying our world-leading tools to help make sure this material cannot be shared on the service.
“It is an important moment, and we will be working hard with Telegram to make sure this commitment continues and expands to the whole sector.”
Remi Vaughan, head of press and media relations at Telegram, said: “Telegram removes hundreds of thousands of child abuse materials each month, relying on reports and proactive moderation which includes AI, machine learning and hash-matching.
“The IWF’s datasets and tools will strengthen the mechanisms Telegram has in place to protect its public platform – and further ensure that Telegram can continue to effectively delete child abuse materials before they can reach any users.”
Responding to the announcement, Richard Collard, associate head of policy for child online safety at the NSPCC, said: “We must remember that child sexual abuse material is a result of children being groomed, coerced, and exploited by their abusers.
“It can be incredibly traumatising for victims to know that their images are being shared on social media by networks of offenders, without consequence.
“It is welcome to see Telegram embarking on this partnership with the IWF to prevent the spread of child sexual abuse material on their public platform.
“However, there should be no part of the service where perpetrators can act without detection.
“Telegram cannot continue to blind themselves to harm taking place on the encrypted parts of their app. If they want to show they are committed to protecting children, they must be proactive in identifying and removing all illegal content from their platform.”