UK Regulator Investigates Social Media Firms Over Child Abuse Content
Ofcom launches enforcement under the Online Safety Act as critics call for stronger protections against harmful material.
- Ofcom has begun enforcing the Online Safety Act, which requires social media platforms to detect and remove illegal content such as child sexual abuse material, terrorism-related content, and fraudulent material.
- The regulator is investigating whether companies, including Elon Musk’s X and Meta’s platforms, are complying with new legal duties to prevent and remove harmful material online.
- Companies that fail to meet these obligations face fines of up to 10% of their global revenue or having their services blocked in the UK, and executives risk jail for persistent violations.
- Critics argue that Ofcom’s approach is too lenient, failing to protect children adequately from harmful content and prioritizing the concerns of tech companies over public safety.
- The UK government has pledged to strengthen the Online Safety Act further to address emerging threats, including AI-generated abuse images and other harmful online practices.