British Regulator Ofcom Launches Investigation into Telegram
The British communications regulator Ofcom has launched an investigation into Telegram over its compliance with obligations to combat illegal content under the UK's Online Safety Act 2023.
According to Reuters, the investigation was prompted by data from the Canadian Centre for Child Protection regarding the spread of child sexual abuse material (CSAM) on the platform, as well as the regulator's own assessment.
If violations are found, platforms could face fines of up to £18 million or 10% of their global turnover.
Telegram's Position
• The messenger "categorically" denies Ofcom's accusations.
• The company claims that since 2018, public distribution of CSAM on the platform has been "virtually eliminated" thanks to automated detection algorithms.
• Telegram stated they are surprised by the investigation and concerned that it might be "part of a broader attack on online platforms that defend freedom of speech and the right to privacy".
• A day earlier, Pavel Durov noted that "protecting children" has become a standard PR move for lawmakers to gain control over media narratives.
Context
This is not Telegram's first clash with regulators over CSAM. In February 2025, the Australian regulator eSafety fined the messenger $600,000 for delaying a response to a similar request. In response, the company abandoned its usual passive stance, moved to active legal defense, and sued Australia, challenging the legality of the fine.
At the end of 2024, Telegram officially agreed to cooperate with the Internet Watch Foundation (IWF) — one of the key organizations maintaining CSAM signature databases. Our team has also extensively analyzed Telegram's internal moderation processes based on reports for eSafety: the disclosed documents reveal that photos and videos in all private groups and channels on the platform are automatically scanned against known illegal materials.
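Signature-based scanning of this kind generally works by computing a hash of each uploaded file and looking it up in a database of known-material signatures, such as those maintained by the IWF. A minimal sketch of that lookup flow (illustrative only: the signature set and function name here are hypothetical, and production systems use perceptual hashes like PhotoDNA that survive re-encoding, rather than the exact SHA-256 digests used below):

```python
import hashlib

# Hypothetical signature set of known-bad files (hex digests).
# Real deployments use perceptual hashes robust to resizing and
# re-compression; exact SHA-256 is used here only to show the flow.
KNOWN_SIGNATURES = {
    # SHA-256 of the bytes b"test", standing in for a real signature
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if the file's digest appears in the signature set."""
    return hashlib.sha256(data).hexdigest() in KNOWN_SIGNATURES

# A match triggers moderation; everything else passes through untouched.
print(matches_known_material(b"test"))            # matches the entry above
print(matches_known_material(b"benign picture"))  # no match
```

The key property of this design is that the platform never needs to store or inspect the original illegal files themselves, only their signatures.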
The UK has been steadily ramping up pressure on social media platforms. Earlier this year, to comply with local legislation, Telegram had already launched an experimental facial age verification system for British users.
It is worth noting that in recent years, under the banner of child protection, European countries have been increasingly pushing requirements that could violate the privacy of communications and private life. For example, following a demand from the UK authorities, Apple was forced to disable E2E encryption for iCloud backups for British users, and the European Union is developing the Chat Control law, comparable to the British OSA in its invasiveness.
#UK #moderation #laws