Social media and smartphone use by children
In February 2024, Ofcom, the online safety regulator, reported that:
- 99% of children spend time online.
- nine in 10 children own a mobile phone by the time they reach the age of 11.
- three-quarters of social media users aged between eight and 17 have their own account or profile on at least one of the large platforms.
- despite most platforms having a minimum age of 13, six in 10 children aged 8 to 12 who use them are signed up with their own profile.
- almost three-quarters of teenagers aged 13 to 17 have encountered one or more potential harms online.
- three in five secondary school-aged children have been contacted online in a way that potentially made them feel uncomfortable.
- there is a “blurred boundary between the lives children lead online and the ‘real world’”.
What is the Government doing?
To protect children, the Online Safety Act 2023 requires social media companies to (among other things):
- remove illegal content quickly or prevent it from appearing in the first place – for example, child sexual abuse material and grooming.
- prevent children from accessing harmful content – for example, content that encourages, promotes or provides instructions for suicide, self-harm and eating disorders; bullying and violent content.
- use age-checking measures so that children cannot access pornographic material and other age-inappropriate content.
A Government press release of 26 October 2023 summarised the main protections the Act will give users.
What is Ofcom doing?
In May 2024, Ofcom published forty proposed measures that social media (and other) online services would have to take to improve children’s online safety. The proposals include:
- “Robust age checks – our draft Codes expect services to know which of their users are children in order to protect them from harmful content. In practice, this means that all services which don’t ban harmful content should introduce highly effective age-checks to prevent children from accessing the entire site or app, or age-restricting parts of it for adults-only access.
- Safer algorithms – under our proposals, any service that has systems that recommend personalised content to users and is at a high risk of harmful content must design their algorithms to filter out the most harmful content from children’s feeds, and downrank other harmful content. Children must also be able to provide negative feedback so the algorithm can learn what content they don’t want to see.
- Effective moderation – all services, like social media apps and search services, must have content moderation systems and processes to take quick action on harmful content and large search services should use a ‘safe search’ setting for children, which can’t be turned off and must filter out the most harmful content. Other broader measures require clear policies from services on what kind of content is allowed, how content is prioritised for review, and for content moderation teams to be well-resourced and trained.”
According to Ofcom, the proposals would mean that:
- “Children will not normally be able to access pornography.
- Children will be protected from seeing, and being recommended, potentially harmful content.
- Children will not be added to group chats without their consent.
- It will be easier for children to complain when they see harmful content, and they can be more confident that their complaints will be acted on.”
The consultation runs until 17 July 2024.
Further information on Ofcom’s work is available in the following:
- Online safety webpages
- Tech firms must tame toxic algorithms to protect children online, 8 May 2024
- Protection of children online, research, 8 May 2024
- New rules for online services: what you need to know, 7 May 2024
- How the Online Safety Act will help to protect children, 6 February 2024
- Search engines can act as one-click gateways to self-harm and suicide content, Ofcom news [online], 31 January 2024
- Understanding online communications among children, Ofcom [online], 9 November 2023
- Quick guide to illegal content risk assessments, 9 November 2023
- Ofcom’s approach to implementing the Online Safety Act, 26 October 2023
What is the Information Commissioner’s Office (ICO) doing?
The ICO oversees and enforces data protection law in the UK. An ICO statutory code of practice (the “Children’s code”) sets out how online services should protect children’s information rights online (October 2022). The code includes details of how the ICO expects online services to apply age assurance measures that are appropriate for their use of children’s data. In January 2024, the ICO published an updated opinion on age assurance for the Children’s code. This now explains how organisations can meet their data protection obligations whilst also complying with the Online Safety Act 2023.
Press and stakeholder discussion
- Statement from the Children’s Commissioner on Ofcom’s draft Children’s Code, Children’s Commissioner Statement [online], 8 May 2024
- NSPCC website, Keeping children safe online
- Brianna mum hails ‘pivotal point’ in online campaign, BBC news [online], 8 May 2024
- The Big Ambition for Online Safety, Children’s Commissioner blog, 29 April 2024
- Inside the fight for smartphone-free childhoods, New Statesman [online], 27 April 2024
- Infant School Children Spending More Time Online As ‘Platforms Turn A Blind Eye’, Molly Rose Foundation news [online], April 2024
- Brianna Ghey’s mother: my five-point social media plan to protect children, Times [online], 31 March 2024
- Screens and teens: How phones broke children’s brains, Independent [online], 25 March 2024
- Smartphones For Under-16s – An MRF Perspective, Molly Rose Foundation news [online], March 2024
- How digital media impacts child development, University of Cambridge Judge Business School [online], 20 February 2024
- Mobile phone bans in schools: Impact on achievement, British Educational Research Association, 15 February 2024
- ‘Our kids are suffering’: calls for ban on social media to protect under-16s, Guardian, 11 February 2024
- ‘Fundamentally against their safety’: the social media insiders fearing for their kids, Guardian [online], 18 January 2024
- The Online Safety Act and AI Summit: Impacts on children’s digital lives, Internet Matters [online], 8 November 2023
- The Online Safety Bill has been passed in “a momentous day for children”, NSPCC news [online], 19 September 2023
Hansard material
Parliamentary questions
Select committee publications