The “digital age of consent” 

Under Article 8 of the UK General Data Protection Regulation, the age at which children can consent to the processing of their personal data by information society services (ISS) is 13. Most online services are ISS, including social media platforms, apps, content streaming services (such as video, music or gaming services), online games, and news or educational websites.

The Information Commissioner’s Office (ICO), the body responsible for enforcing data protection law, has published a code of practice setting out how online services should protect children’s information rights online (October 2022). This includes details of how the ICO expects online services to apply age assurance measures that are appropriate for their use of children’s data. In January 2024, the ICO published an updated opinion on age assurance for the children’s code. This explains how organisations can meet their data protection obligations whilst also complying with the Online Safety Act 2023.

The Online Safety Act 2023

The protection of children is one of the key aims of the 2023 act. To protect children, social media companies must:

  • remove illegal content quickly or prevent it from appearing in the first place – for example, child sexual abuse material and grooming.
  • prevent children from accessing harmful content – for example, content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders; bullying and violent content.
  • use age-checking measures so that children cannot access pornographic material and other age-inappropriate content.

The Department for Science, Innovation and Technology, the government department with responsibility for online safety, has published an “explainer” on what the act does (May 2024), including how it will protect children.

Ofcom is the online safety regulator. It is implementing the act in phases, as summarised on its website.

E-petition 700086

E-petition 700086 calls for social media companies to be banned from letting children under 16 create social media accounts. According to the petition, this would:

  • Stop online bullying
  • Stop children being influenced by false posts
  • Stop children seeing content that encourages violence/could be harmful for their future.

The petition states:

We believe social media is having more of a negative impact to children than a positive one. We think people should be of an age where they can make decisions about their life before accessing social media applications. We believe that we really need to introduce a minimum age of 16 to access social media for the sake of our children’s future along with their mental and physical health.

The petition has received over 127,000 signatures.

Government response

In its December 2024 response, the government said that tech companies should “take responsibility to ensure their products are safe for UK children” and that the 2023 act was a “crucial tool in holding them to account for this”. The government said it was aware of the debate as to what age children should have smartphones and access to social media. However, it was “not currently minded to support a ban for children under 16”:

…Children face a significant risk of harm online and we understand that families are concerned about their children experiencing online bullying, encountering content that encourages violence, or other content which may be harmful. We will continue to do what is needed to keep children safe online. However, this is a complicated issue. We live in a digital age and must strike the right balance so that children can access the benefits of being online and using smartphones while we continue to put their safety first. Furthermore, we must also protect the right of parents to make decisions about their child’s upbringing.

The current evidence on screentime is mixed and a systematic review by the UK Chief Medical Officers in 2019 does not show a causal link between screen-based activities and mental health problems, though some studies have found associations with increased anxiety or depression. Therefore, the government is focused on building the evidence base to inform any future action. Last month, the government commissioned a feasibility study into future research to understand the ongoing impact of smartphones and social media on children, to grow the evidence base in this area.

The government said its priority was to work with Ofcom to “effectively implement” the 2023 act so that social media users, especially children, could benefit from its protections “as soon as possible”.
