The Online Safety Act 2023 (the Act) is intended to protect children and adults online. It creates a range of new duties for social media platforms and technology companies, making them legally responsible both for the content they host and for user safety on their platforms. Implementation of the Act is being rolled out in three phases, with the most recent duties coming into force on 25 July 2025. Lauranne Nolan, Associate Solicitor and Safeguarding Lead in the Keoghs Specialist Abuse Team, considers the newest duties further.
On 25 July 2025, the second phase of the Act's implementation came into force. This was a pivotal date, as the key child safety obligations took practical effect. Companies and platforms are now required to use highly effective age assurance to prevent children from accessing pornography and material that promotes or encourages suicide, self-harm, and eating disorders. The Act requires that this content be kept off children's feeds entirely.
Platforms must also suppress the spread of other forms of material that are potentially harmful to children, including content that promotes dangerous stunts, encourages the use of harmful substances, or enables bullying. They must also provide parents and children with clear and accessible ways to report problems online when they arise.
To implement the new duties and standards, the Act requires companies and platforms to adhere to guidance and Codes of Practice drafted and developed by Ofcom, the appointed online safety regulator. Safety measures under the Codes of Practice for this phase include safer algorithmic feeds that filter out harmful content, highly effective age checks, swift action to review and remove harmful content, easier reporting and complaints processes, and clearer accountability for children's safety.
Age assurance methods supported by Ofcom as capable of being highly effective include photo-ID matching, facial age estimation, mobile network operator age checks, credit card checks, email-based age estimation, digital identity services, and open banking checks.
Ofcom has a range of enforcement powers under the Act where companies do not comply. Companies can be fined up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and sites or apps can also receive formal warnings and enforcement notices from Ofcom. For the most serious breaches, Ofcom can ask a court to prevent the site or app from being available in the UK.
Criminal action can be taken against senior managers who fail to ensure that their companies comply with information requests from Ofcom. Ofcom will also be able to hold companies, and senior managers where they are at fault, criminally liable where a provider fails to comply with enforcement notices relating to specific child safety duties or to child sexual abuse and exploitation on its service.
The technology secretary, Peter Kyle, said that the new Ofcom codes should be a “watershed moment” that will turn the tide on “toxic experiences on these platforms”.
However, the Molly Rose Foundation, a charity established by the family of British teenager Molly Russell, who took her own life in 2017 aged 14 after viewing harmful content online, said the measures were overly cautious and did not go far enough. It has called for additional changes, such as blocking dangerous online challenges and requiring platforms to proactively search for, and take down, depressive and body image-related content.
The third phase of the Act relates to transparency, user empowerment, and other duties on categorised services. An exact date for the rollout of this phase has not yet been set, but it is expected in early 2026. As with the first two phases, the third will be driven by the requirements of the Act and relevant secondary legislation.
The fact remains that the Act is a significant piece of legislation. The UK is one of the first democratic countries to impose such strict content controls on technology companies and social media platforms, making it a test case that will be scrutinised around the world. With technology continuing to develop at a fast pace, it is imperative that the Act remains effective.