Introduction
New British Child Protection Regulations will come into effect in July 2025 [1] [3], requiring social media and internet platforms to prevent children’s access to harmful content or face substantial fines [1]. These regulations [2], developed under the Online Safety Act 2023 [3], are designed to improve child safety online by imposing specific duties on online services.
Description
New British Child Protection Regulations will come into effect in July 2025 [1], requiring social media and internet platforms to block children’s access to harmful content or face significant fines [1]. Ofcom [1] [2] [3] [4], the UK’s media regulator [1], has finalized new codes under the Online Safety Act 2023 [3], which imposes child safety duties on online services [2]. These regulations require services to assess risks by age group and to address emerging types of harm, including body stigma and depression-related content [3]. Key deadlines include completion of children’s risk assessments by 24 July 2025 [3], with risk mitigation measures required to be operational from 25 July 2025 [3], pending parliamentary approval of the Codes [3].
As part of the implementation of the Online Safety Act, Ofcom has published its Children’s Codes [1], which require online services likely to be accessed by children to make significant changes. These include modifications to the algorithms that recommend content to younger users [1], improvements to age verification processes [1], and a requirement for companies to conduct thorough children’s risk assessments to keep young users safe. Ofcom is also conducting research into children’s perspectives on proposals aimed at enhancing online safety [4], particularly regarding age assurance measures [4].
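To make the age assurance requirement concrete, the following is a minimal Python sketch of an age gate deciding whether age-restricted material may be served. The `AgeAssuranceResult` structure, the content ratings, and the age-18 threshold are illustrative assumptions for this sketch, not terms or requirements taken from the Codes.

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class ContentRating(Enum):
    """Illustrative content categories; the Codes define harm types in their own terms."""
    GENERAL = "general"
    AGE_RESTRICTED = "age_restricted"  # e.g. pornography or other adult-only material


@dataclass
class AgeAssuranceResult:
    """Hypothetical outcome of an age assurance check."""
    method: str                   # e.g. "photo_id", "facial_age_estimation", "none"
    estimated_age: Optional[int]  # None when no age signal is available
    high_confidence: bool         # whether the check is considered highly effective


def may_serve(rating: ContentRating, assurance: AgeAssuranceResult) -> bool:
    """Serve age-restricted items only to users with a reliable adult age signal."""
    if rating is ContentRating.GENERAL:
        return True
    if not assurance.high_confidence or assurance.estimated_age is None:
        return False  # no reliable age signal: default to the child-safe path
    return assurance.estimated_age >= 18


# A user who passed a high-confidence ID check can still access legal adult content.
adult = AgeAssuranceResult(method="photo_id", estimated_age=34, high_confidence=True)
print(may_serve(ContentRating.AGE_RESTRICTED, adult))    # True

# A user with no age signal is treated as a child and blocked.
unknown = AgeAssuranceResult(method="none", estimated_age=None, high_confidence=False)
print(may_serve(ContentRating.AGE_RESTRICTED, unknown))  # False
```

A real provider would rely on accredited age assurance methods rather than a single confidence flag, but the control flow of defaulting to the child-safe path would be similar.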
Key measures include:
- Safer feeds: Providers with recommender systems must filter out harmful content from children’s feeds [1] (a minimal sketch of such filtering follows this list).
- Effective age checks: High-risk services must implement strong age assurance to protect children from harmful material while allowing adults access to legal content. Ofcom’s related research examines how UK internet users aged 16 and older access pornographic content and navigate age verification checks [4].
- More choice and support for children: Platforms must enable children to control their online experience, including blocking unwanted interactions and managing comments on their posts [1].
- Strong governance: Services must appoint an individual responsible for children’s safety and conduct annual reviews of their risk management.
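As a rough illustration of the “safer feeds” measure above, the sketch below removes items carrying harmful labels from a recommended feed before it is shown to a child account. The harm labels, the `ContentItem` structure, and the `is_child_account` flag are assumptions made for illustration rather than terms defined in the Codes.

```python
from dataclasses import dataclass, field
from typing import List, Set

# Illustrative harm labels; the Children's Codes enumerate harms in their own terms.
HARMFUL_TO_CHILDREN = {"pornography", "self_harm", "eating_disorder", "body_stigma", "violence"}


@dataclass
class ContentItem:
    """A candidate feed item with moderation labels attached upstream."""
    item_id: str
    score: float                           # ranking score from the recommender system
    labels: Set[str] = field(default_factory=set)


def safer_feed(ranked_items: List[ContentItem], is_child_account: bool) -> List[ContentItem]:
    """Return the feed to display, dropping harmful items for child accounts."""
    if not is_child_account:
        return ranked_items
    return [item for item in ranked_items if not (item.labels & HARMFUL_TO_CHILDREN)]


feed = [
    ContentItem("a1", 0.92, {"sports"}),
    ContentItem("a2", 0.88, {"body_stigma"}),
    ContentItem("a3", 0.75, {"music"}),
]
print([item.item_id for item in safer_feed(feed, is_child_account=True)])  # ['a1', 'a3']
```

In practice the filtering would sit inside the recommender pipeline and draw on the provider’s own moderation and age assurance signals, but the principle of excluding flagged material from children’s feeds is the same.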
Ofcom has the authority to impose fines and to seek court orders restricting access to non-compliant sites or apps in the UK [1]. The NSPCC has welcomed the Codes but urges further action [1], particularly regarding private messaging apps that use encryption [1], which limits the visibility of communications [1]. Compliance with these regulations is mandatory for all regulated services [2], as companies are held legally accountable for safeguarding people in the UK [1] [2], particularly children [2].
Conclusion
The introduction of these regulations marks a significant step towards a safer online environment for children. By mandating risk assessments and protective measures, the regulations aim to mitigate the harms associated with online content. Looking ahead, they establish a more robust framework for online safety, with ongoing research and adjustments to address emerging threats. Compliance will be crucial for online services both to avoid penalties and to contribute to a safer digital space for young users.
References
[1] https://www.cybersecurityintelligence.com/blog/new-british-child-protection-regulations–8399.html
[2] https://www.ofcom.org.uk/information-for-industry
[3] https://www.lexology.com/library/detail.aspx?g=69bab7f9-8415-4fe6-81e8-64fd29896773
[4] https://www.ofcom.org.uk/research-statistics-and-data