Documents to download

Warning: This briefing discusses suicide and self-harm, which some readers may find distressing.

The government’s 2023 National suicide prevention strategy noted that advances in technology, the internet and the availability of media resources have been “invaluable in raising awareness and improving access to support for suicide and self-harm”. However, it also said that the online world posed “new harms that national government, online platforms and media companies must work together to address”. The strategy set out five ambitions for improving online safety over the next five years:

  • Making social media and online platforms “safer places” for adults and children. This includes decreasing the likelihood that an individual is exposed to harmful suicide and self-harm content.
  • Public education “for healthy and safe usage of online platforms”.
  • Ensuring that signposting and support are prevalent across a range of platforms.
  • Exploring the benefits of technologies that can support effective suicide prevention activity, for instance the use of artificial intelligence in suicide prevention.
  • Ensuring that the media “consistently portrays suicide and self-harm content responsibly”.

The Library’s briefings Suicide statistics and Suicide prevention: Policy and strategy provide further information about suicide prevention policies and strategies throughout the UK.

Online platforms

The Online Safety Act 2023 received Royal Assent on 26 October 2023. It aims to increase user safety as well as to improve users’ ability to keep themselves safe online.

All regulated services must protect users from illegal content, such as suicide and self-harm content that meets the criminal threshold. There are additional duties for services likely to be accessed by children. For adults, the largest services must introduce optional tools that allow users to limit their exposure to legal content that encourages, promotes or provides instructions for suicide or self-harm. The Act also empowers Ofcom, the regulator, to disclose information from regulated services to coroners, if requested, following the death of a child.

Ofcom intends to implement the Act in three phases (PDF), with the entirety of the Act coming into force by the end of 2026.

The Act applies across the UK.

Social media companies

Content on user-to-user online services (for example, X, formerly Twitter, and Facebook) is also governed by the individual platform’s terms of service, for instance X’s suicide and self-harm policy or Meta’s suicide and self-injury community standards. Stakeholders, such as the children’s charity NSPCC, have suggested that these systems do not do enough to protect users from harmful online content. In response, some services have introduced changes to their policies. For example, on 9 January 2024, Meta (the parent company of Instagram and Facebook) announced new protections for users of its services. These included beginning to remove self-harm related content (and other age-inappropriate content) from the “Feed and Stories” of teenage users, “even if it’s shared by someone they follow”.

Press

There are two press regulators in the UK. Many titles have signed up to the Independent Press Standards Organisation (IPSO). The IPSO Editors’ Code of Practice states:

When reporting suicide, to prevent simulative acts care should be taken to avoid excessive detail of the method used, while taking into account the media’s right to report legal proceedings.

There may be exceptions to this clause (and others in the code) where they can be demonstrated to be in the public interest.

A smaller number of publications have joined IMPRESS. The IMPRESS Standards Code states:

When reporting on suicide or self-harm, publishers must not provide excessive details of the method used, the specific location or speculate on the motives.

Membership of both regulators is voluntary, and some publications, for example the Guardian, have joined neither regulator but have instead appointed their own internal readers’ ombudsmen.

Organisations have also published specific media guidelines for reporting suicide. For example, the Samaritans’ media guidelines for reporting suicide, and the National Union of Journalists’ guidelines for reporting mental health and death by suicide.

Broadcasting

Ofcom, the UK’s communications regulator, has published a Broadcasting Code that sets the rules for programmes broadcast on television, radio and BBC on-demand services. Section 2 of the Code covers “harm and offence”, which includes the following on violence, dangerous behaviour and suicide:

2.4: Programmes must not include material (whether in individual programmes or in programmes taken together) which, taking into account the context, condones or glamorises violent, dangerous or seriously antisocial behaviour and is likely to encourage others to copy such behaviour.

[…]

2.5: Methods of suicide and self-harm must not be included in programmes except where they are editorially justified and are also justified by the context.

Compliance with the code is the responsibility of individual broadcasters.
