Social media and smartphone use by children

In February 2024, Ofcom, the online safety regulator, reported that:

  • 99% of children spend time online.
  • nine in 10 children own a mobile phone by the time they reach the age of 11.
  • three-quarters of social media users aged 8 to 17 have their own account or profile on at least one of the large platforms.
  • despite most platforms having a minimum age of 13, six in 10 children aged 8 to 12 who use them are signed up with their own profile.
  • almost three-quarters of teenagers aged 13 to 17 have encountered one or more potential harms online.
  • three in five secondary school-aged children have been contacted online in a way that potentially made them feel uncomfortable.
  • there is a “blurred boundary between the lives children lead online and the ‘real world’”.

What is the Government doing?

To protect children, the Online Safety Act 2023 requires social media companies to (among other things):

  • remove illegal content quickly or prevent it from appearing in the first place – for example, child sexual abuse material and grooming.
  • prevent children from accessing harmful content – for example, content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders; bullying and violent content.
  • use age-checking measures so that children cannot access pornographic material and other age-inappropriate content.

A Government press release of 26 October 2023 summarised the main protections the Act will give users.

What is Ofcom doing?

In May 2024, Ofcom published forty proposed measures that social media (and other) online services would have to take to improve children’s online safety. The proposals include:

  • “Robust age checks – our draft Codes expect services to know which of their users are children in order to protect them from harmful content. In practice, this means that all services which don’t ban harmful content should introduce highly effective age-checks to prevent children from accessing the entire site or app, or age-restricting parts of it for adults-only access.
  • Safer algorithms – under our proposals, any service that has systems that recommend personalised content to users and is at a high risk of harmful content must design their algorithms to filter out the most harmful content from children’s feeds, and downrank other harmful content. Children must also be able to provide negative feedback so the algorithm can learn what content they don’t want to see.
  • Effective moderation – all services, like social media apps and search services, must have content moderation systems and processes to take quick action on harmful content and large search services should use a ‘safe search’ setting for children, which can’t be turned off and must filter out the most harmful content. Other broader measures require clear policies from services on what kind of content is allowed, how content is prioritised for review, and for content moderation teams to be well-resourced and trained.”

According to Ofcom, the proposals would mean that:

  • “Children will not normally be able to access pornography.
  • Children will be protected from seeing, and being recommended, potentially harmful content.
  • Children will not be added to group chats without their consent.
  • It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.”

The consultation runs until 17 July 2024.

Further information on Ofcom’s work is available in the following:

What is the Information Commissioner’s Office (ICO) doing?

The ICO oversees and enforces data protection law in the UK. An ICO statutory code of practice (the “Children’s code”, October 2022) sets out how online services should protect children’s information rights online. The code includes details of how the ICO expects online services to apply age assurance measures that are appropriate for their use of children’s data. In January 2024, the ICO published an updated opinion on age assurance for the Children’s code. The updated opinion explains how organisations can meet their data protection obligations whilst also complying with the Online Safety Act 2023.

Press and stakeholder discussion

Hansard material

Parliamentary questions

Select committee publications