
UK Online Safety Act 2023

10/11/2023

After years of campaigning, the UK Online Safety Act 2023 (the Act) has received Royal Assent. The Act is a significant piece of legislation with the potential to transform children’s experiences online. It takes a zero-tolerance approach to protecting them by making social media platforms and technology companies legally responsible for the content they host.

They will be expected to:

  • Remove illegal content quickly or prevent it from appearing in the first place, including content promoting self-harm
  • Prevent children from accessing harmful and age-inappropriate content
  • Enforce age limits and age-checking measures
  • Ensure the risks and dangers posed to children on the largest social media platforms are more transparent, including by publishing risk assessments
  • Provide parents and children with clear and accessible ways to report problems online when they do arise

The Government has also strengthened provisions to address violence against women and girls. Under the Act it will be easier to convict someone who shares intimate images without consent, and new offences will further criminalise the non-consensual sharing of intimate deepfakes. Those found guilty of sharing intimate images will face up to six months in prison, while those who threaten to share such images, or share them with the intent of causing distress, alarm or humiliation, or to obtain sexual gratification, could face up to two years behind bars. Many will recall the recent case of Stephen Bear, who was convicted of voyeurism and two counts of disclosing private photographs and films of his ex-girlfriend, Georgia Harrison, with intent to cause distress. He was sentenced to 21 months’ imprisonment and has also been ordered to pay Miss Harrison £207,900 in damages, the largest sum awarded in an image abuse case to date.

Alongside its firm protections for children, the Act empowers adults to take control of what they see online. It provides three layers of protection for internet users, which will:

  • Make sure illegal content will have to be removed
  • Place a legal responsibility on social media platforms to enforce the promises they make to users when they sign up, through terms and conditions
  • Offer users the option to filter out harmful content, such as bullying, that they do not want to see online

Potential consequences

If platforms do not act quickly to prevent and remove illegal content and comply with the requirements set out in the Act, they could face significant fines from Ofcom of up to £18 million or 10% of their global annual revenue, whichever is larger. In addition, technology executives face the threat of a two-year jail sentence if they persistently ignore Ofcom enforcement notices informing them that they have breached their duty of care to children, unless they can show that they acted in good faith to comply proportionately with their duties. Senior employees could also be jailed if they hinder an Ofcom investigation or a request for information.

The Implementation of the Act

Ofcom has been given the role of safety regulator. It will not have the power to remove content, but it intends to tackle the root causes of online harm and set new standards. While the bill was progressing, the Government began working closely with Ofcom to ensure the changes would be implemented as quickly as possible once it became an Act of Parliament, given the extensive work required to set up the new regulatory regime.

The new rules will be rolled out in three phases, with the timing driven by the requirements of the Act and the relevant secondary legislation.

Phase one: Illegal harms duties

The draft codes and guidance in relation to these duties were published on 9 November 2023, which included:

  • Analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments;
  • Draft guidance on a recommended process for assessing risk;
  • Draft codes of practice, setting out what services can do to mitigate the risk of harm; and
  • Draft guidelines on Ofcom’s approach to enforcement.

There will now be a consultation on these documents, with a statement on the final decisions planned for autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology and, subject to their approval, laid before Parliament. It is expected that the duties will become enforceable around the end of 2024, assuming Parliament approves the codes.

Phase two: Child safety, pornography and the protection of women and girls

Child protection duties will be set out in two parts. The first deals with online pornography services: stakeholders are anticipated to be able to read and respond to draft guidance on age assurance from December 2023. The second deals with regulated services more broadly: regulated services and other interested stakeholders will be able to read and respond to draft codes of practice on the protection of children in spring 2024.

Alongside this, Ofcom also expects to consult on:

  • Analysis of the causes and impacts of online harm to children; and
  • Draft risk assessment guidance focusing on children’s harms.

Ofcom expects to publish draft guidance on protecting women and girls by spring 2025, once it has finalised the codes of practice on the protection of children.

Phase three: Transparency, user empowerment, and other duties on categorised services

A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by Government. The final stage of implementation focuses on additional requirements that fall only on these categorised services.

Those requirements include duties to:

  • Produce transparency reports;
  • Provide user empowerment tools;
  • Operate in line with terms of service;
  • Protect certain types of journalistic content; and
  • Prevent fraudulent advertising.

In addition to the above, charities, technology firms, academics, and international government representatives gathered at a recent event hosted by the Home Secretary to focus on how to tackle the threat of child sexual abuse material generated by Artificial Intelligence (AI). The event followed data from the Internet Watch Foundation (IWF) showing that thousands of images depicting the worst kinds of abuse could be found on the dark web, and that these images are realistic enough to be treated as real imagery under UK law.

At the event, hosted in partnership with the IWF, statistics were released that showed that in a single month, the IWF investigated more than 11,000 AI images which had been shared on a dark web child abuse forum. Almost 3,000 of these images were confirmed to breach UK law, meaning they depicted child sexual abuse. Following this, a number of ‘tech giants’ including TikTok, Snapchat, and Stability AI all signed a pledge vowing to tackle the rise of AI-generated child sexual abuse images.

Conclusion

Timelines may shift as the consultation and drafting progress and the Government takes action to put in place the necessary secondary legislation. Furthermore, a general election is expected at some point before the end of 2024, and this has not been factored into the current planning. Given the need for secondary legislation to establish some aspects of the new rules, and the roles of the Secretary of State for Science, Innovation and Technology and of Parliament in approving codes of practice, delays in the implementation process are possible. Therefore, while the Act is a significant piece of legislation, it remains to be seen how soon it can be put into action.

 

For more information, please contact:

Lauranne Nolan - Associate

Email: LNolan@keoghs.co.uk
