Children and young people may be exposed to many types of harmful content and activity online. As the NSPCC has noted, some of this content is illegal, such as child sexual abuse images or content promoting terrorism. Other material might not be illegal but can still harm children (eg content promoting eating disorders), and some content is harmful simply because it is not age appropriate.

The Online Safety Act 2023

The Online Safety Act received Royal Assent in October 2023. The protection of children is one of the key aims of the act. To protect children, online platforms will have to:

  • remove illegal content quickly or prevent it from appearing in the first place – for example, child sexual abuse material and grooming.
  • prevent children from accessing harmful content – for example, content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders; bullying and violent content.
  • use age-checking measures so that children cannot access pornographic material and other age-inappropriate content.

The Department for Science, Innovation and Technology (DSIT), the government department with responsibility for online safety, has published an “explainer” giving an overview of what the act will do (May 2024), including how it will give children “age-appropriate experiences”.

Ofcom’s role

Ofcom, the UK’s communications regulator, is also the online safety regulator under the 2023 act. Ofcom intends to implement the act in three phases, with the act due to be fully in force in 2026. For further detail, see Ofcom’s progress update on implementing the act (PDF) (October 2024).

DSIT published a draft statement of strategic priorities for Ofcom on 20 November 2024. This sets out the following priorities that Ofcom must have regard to when exercising its functions as the online safety regulator:

  • safety by design
  • transparency and accountability
  • agile regulation
  • inclusivity and resilience
  • technology and innovation

In May 2024, Ofcom published forty proposed measures that social media and other online services would have to take to improve children’s online safety. These included:

  • “robust age checks” – all services which don’t ban harmful content will have to introduce age checks to prevent children from accessing the entire site or app, or to age-restrict parts of it for adults-only access
  • safer algorithms – to filter out the most harmful content from children’s feeds, and downrank other harmful content. Children must also be able to provide negative feedback so the algorithm can learn what content they don’t want to see
  • effective moderation – all services must have content moderation systems and processes in place to take quick action on harmful content. Large search services should use a ‘safe search’ setting for children, which can’t be turned off and must filter out the most harmful content

According to Ofcom, the proposals would mean that: 

  • children will not normally be able to access pornography
  • children will be protected from seeing, and being recommended, potentially harmful content
  • children will not be added to group chats without their consent
  • it will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on

A consultation on the proposals closed on 17 July 2024. Ofcom expects to finalise its proposals and publish a final statement in spring 2025. 

The ICO’s role

The Information Commissioner’s Office (ICO) oversees and enforces UK data protection law. An ICO statutory code of practice (the “Children’s code”) sets out how online services should protect children’s information rights online (October 2022). The code includes details of how the ICO expects online services to apply age assurance measures that are appropriate to their use of children’s data. In January 2024, the ICO published an updated opinion on age assurance for the children’s code. The updated opinion explains how organisations can meet their data protection obligations whilst also complying with the Online Safety Act 2023.

Hansard material

Hansard material on children and online safety is available from the parliamentary Hansard database.

Further reading

The following websites provide further information on the harms that children and young people can encounter online and what can be done to keep them safe.  

  • Childnet, a charity working to keep children and young people safe online
  • SWGfL, a charity “dedicated to empowering the safe and secure use of technology”
  • Internet Matters, an organisation working with parents and professionals to keep children safe online
  • Report Harmful Content, a website giving details of the types of online harm and how they can be reported. It is provided by the UK Safer Internet Centre and operated by SWGfL
  • 5Rights Foundation, an international organisation “working with and for children for a rights-respecting digital world”
