
Online safety: practical and legal developments to protect children

13/12/2022

The discussion surrounding online safety has gained increasing prominence in recent years. With the constant development of technology and social media platforms, children today are born into an online world. Following the pandemic, many young people’s digital habits have become entrenched, with socialising and entertainment online now the norm and screen time hard to control. While the world of technology brings many new and exciting opportunities, it also has a darker side: it has created opportunities for perpetrators to commit child sexual abuse through the production and distribution of indecent images of children, grooming and the live streaming of abuse. Many consider it a national and global crisis.

Steps taken so far

In June 2021, the Internet Watch Foundation (IWF), an independent, non-profit charity, partnered with Childline to launch an online tool called ‘Report Remove’, which can be used by anyone under the age of 18 to report a nude photo or video of themselves that has appeared online. The IWF then reviews the report and works to have the content removed if it breaks the law. It is anticipated that this tool will become increasingly useful as a result of the growth in self-generated imagery – for instance, when an image has been shared with someone the originator thought they could trust, but that person has subsequently shared the image online and/or with others.

In addition to the above, in September 2021 the Information Commissioner’s Office (ICO) rolled out ‘The Children’s Code’, requiring online services, including websites, applications and games, to provide better privacy protections for children. The code has prompted changes by social media platforms, gaming websites and video streaming services, including targeted and personalised ads being blocked for children, children’s accounts being set to private by default, adults being blocked from directly messaging children, and notifications being turned off at bedtime.

However, there are calls for more to be done.

IICSA

As part of its final report, the Independent Inquiry into Child Sexual Abuse (IICSA) made two recommendations specifically in relation to online safety. They are:

  1. Pre-screening
  2. Age verification

Pre-screening

The Inquiry recommended that regulated providers of internet search services and user-to-user services should pre-screen for known child sexual abuse material before it is uploaded. Pre-screening enables internet companies to prevent child sexual abuse images from ever being uploaded to platforms and social media profiles. The images cannot, therefore, be viewed or shared, preventing access to the material and further dissemination.
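
In practice, pre-screening of this kind is usually implemented by comparing a fingerprint of each upload against a database of hashes of known illegal images, such as those compiled by bodies like the IWF. The sketch below is a minimal illustration of that idea only: it uses a plain cryptographic hash and an invented hash list, whereas real deployments rely on perceptual hashing (for example PhotoDNA) so that resized or re-encoded copies of an image still match.

```python
import hashlib

# Illustrative only: a set of fingerprints of known illegal images, as might be
# supplied to a platform by a body such as the IWF. The value below is a
# placeholder; real systems use perceptual hashes rather than SHA-256.
KNOWN_ABUSE_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(file_bytes: bytes) -> str:
    """Return a hex digest used as the upload's fingerprint."""
    return hashlib.sha256(file_bytes).hexdigest()

def pre_screen_upload(file_bytes: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches known material."""
    if fingerprint(file_bytes) in KNOWN_ABUSE_HASHES:
        # Block before the content is ever stored or published; a real system
        # would also escalate the match to the relevant reporting body.
        return False
    return True

# Example: an unknown file passes the screen and can be uploaded.
assert pre_screen_upload(b"holiday photo") is True
```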

Age verification

The Inquiry also recommended that the law be changed to require internet companies that provide online services and social media platforms to introduce better ways to check children’s ages. It currently appears to be too easy to bypass age requirements by simply entering a different date of birth, with no further verification required.

In addition to concerns about age verification, there are also concerns about how a user’s identity is verified, as it is too easy to set up a fake online profile (known as catfishing), which enables perpetrators to present themselves as a child or as someone else.

Notwithstanding these recommendations, it would be unrealistic to assume that age and identity verification alone will prevent underage access to internet services and platforms or that pre-screening without proper measures in place will be enough to protect children from online harm. The WeProtect Global Alliance’s ‘Global Threat Assessment 2021’ indicated that the “sustained growth” in the scale of child sexual exploitation and abuse online is “outstripping our global capacity to respond”. There needs to be increased emphasis and focus on making children’s use of the internet safer by design and, once established, all platforms and services need to have the capacity and capability to respond to emerging patterns of child sexual abuse.

Next Steps

Online Safety Bill

The Bill had its first reading in the House of Commons in March 2022. It is now to be recommitted to a Public Bill Committee in respect of particular clauses and schedules, with responses expected by 15 December 2022.

There is significant interest in getting this Bill passed as an Act of Parliament before the current session ends. However, we are now on the fourth prime minister since legislating in this area was first considered. If the Bill is not passed by April 2023, it will need to be started all over again in a new session of Parliament. If enacted, it would mean that companies that host user-generated content and search engines will be regulated for the first time and will have a duty of care to their users. Those duties include “to mitigate and effectively manage” the risk of harm caused by illegal content and to protect children from harm in those parts of a service they can access. Its main aims are to:

  • Prevent the spread of illegal content and activity such as images of child abuse, terrorist material and hate crimes
  • Protect children from harmful material
  • Protect adults from legal but harmful content

The definition of illegal content covers child sexual exploitation and abuse content.

There are still potential further amendments that could be made to the Bill, but it is anticipated that the legislation will put the onus on technology companies to take steps to better safeguard users and will empower a regulator, likely to be Ofcom, to monitor this. Firms that fail to comply with the new rules could face fines of up to £18m, or 10% of their annual global turnover, whichever is higher.
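
On the figures quoted, the cap on a penalty is simply the greater of the two amounts, so for large firms the turnover-based limit dominates. A short illustration of that calculation, using a hypothetical turnover figure:

```python
def max_osb_fine(annual_global_turnover_gbp: float) -> float:
    """Greater of £18m or 10% of annual global turnover, per the figures quoted above."""
    return max(18_000_000, 0.10 * annual_global_turnover_gbp)

# Hypothetical example: a firm with £5bn global turnover faces a cap of £500m,
# well above the £18m floor that applies to smaller companies.
print(f"£{max_osb_fine(5_000_000_000):,.0f}")  # £500,000,000
```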

Key things that need to be addressed are:

  • Algorithms: Unregulated recommendation algorithms can push harmful content into users’ feeds: when a user searches for or engages with particular content, more content of that nature is surfaced for them. Following the sad death of teenager Molly Russell, who took her own life after viewing suicide and self-harm content on Instagram and Pinterest, the coroner concluded that she had died while suffering from the negative effects of online content.
  • Age verification: Measures to ensure that children under a minimum age are prevented from accessing social media platforms are recommended by IICSA, but technology companies have been slow to develop these.
  • Adult content sites: There is currently no legal duty on platforms hosting such content to verify the age of users, with the burden instead placed on parents and carers to monitor and apply filtering tools, leaving many children without protection.
  • Virtual Reality/Augmented Reality: As technology continues to develop, there are now platforms that simulate the sensation of physical touch, and many users report having been subjected to sexual assaults and rapes while using them. In addition, some platforms allow users to explore their local area if location tracking is permitted, which increases the risk of children disclosing their whereabouts.

The UK is not the only country moving towards regulation of online service providers. In Australia, the Online Safety Act 2021 (which came into force in January 2022) requires service providers to take reasonable steps to minimise access to child sexual exploitation material. The Act also requires the development of new mandatory industry codes to regulate illegal and restricted content. The codes require online platforms and service providers to detect and remove illegal content such as child sexual abuse material.

Criticisms of the Bill

One of the main criticisms of the Bill as currently drafted is that it would give Ofcom powers to impose notices on the operators of private messaging apps and other online services. The notices would give Ofcom the power to require specific technologies that would intercept and scan the private communications of UK citizens on a massive scale. It would mean the UK would be one of the first democracies to place a de facto ban on end-to-end encryption for private messaging apps; no communications in the UK would be secure or private. In the current climate, this poses a critical threat to UK national security.
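
The significance of end-to-end encryption here is that the service provider only ever relays ciphertext: it cannot scan message content without either weakening the encryption or inspecting messages on the user’s device before they are encrypted. The sketch below is a deliberately simplified illustration of that property, using a single shared symmetric key and the third-party cryptography package; real messaging apps negotiate keys with protocols such as the Signal protocol, but the point is the same.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Simplified illustration: the two devices share a key, the server does not.
shared_key = Fernet.generate_key()          # known only to sender and recipient
sender = Fernet(shared_key)
recipient = Fernet(shared_key)

ciphertext = sender.encrypt(b"meet at 7?")  # this is all the server ever relays

# The provider (and therefore any server-side scanning tool) sees only this:
print(ciphertext)

# Only the recipient's device can recover the content:
print(recipient.decrypt(ciphertext))        # b'meet at 7?'
```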

In addition, this proposal would give Ofcom a wider remit for mass surveillance of UK citizens than the UK’s spy agencies, such as Government Communications Headquarters (GCHQ), the UK’s intelligence, security and cyber agency. In theory, Ofcom could impose surveillance on all private messaging users with a notice, underpinned by significant financial penalties, with less legal process and fewer protections than apply to GCHQ.

The risk is that technology companies will be forced to monitor and analyse private communications en masse to avoid significant fines. This may mean such companies having to consider whether to give UK users alone less protection for their private messages, or to pull out of the UK altogether if the requirements are incompatible with their own encryption technology and the services they provide.

There could also be significant harm from undermining the privacy of direct messaging, which would be particularly dangerous for LGBTQ+ individuals and other minorities who are targeted by autocratic governments, who rely on online communication for support and self-expression, and for whom encryption is essential.

This disproportionate interference with people’s privacy paints an altogether different picture of the Online Safety Bill – while it is presented as a law to establish accountability for online harm, it appears to open the door to surveillance powers.

Comment

It is clear that keeping children safe in the online world is as important as keeping them safe in the physical one. However, it remains to be seen whether the Online Safety Bill will be the groundbreaking legislation that many families are hoping for.

For more information, please contact Lauranne Nolan
