Ofcom enforces measures in the Online Safety Act to protect web users from suicide or self-harm content. It also regulates broadcast media, but not the press.
Suicide prevention: online platforms, print media and broadcasting (201 KB , PDF)
Warning: This briefing discusses suicide and self-harm, which some readers may find distressing.
The government’s 2023 National suicide prevention strategy noted that advances in technology, the internet and the availability of media resources have been “invaluable in raising awareness and improving access to support for suicide and self-harm”. However, it also said that the online world posed “new harms that national government, online platforms and media companies must work together to address”. The strategy set out five ambitions for improving online safety over the following five years.
The Library’s briefings Suicide statistics and Suicide prevention: Policy and strategy provide further information about suicide prevention policies and strategies throughout the UK.
The Online Safety Act 2023 received Royal Assent on 26 October 2023. It aims to increase user safety and to improve users’ ability to keep themselves safe online.
All regulated services must protect users from illegal content, such as suicide and self-harm content that reaches the criminal threshold. There are additional duties for services likely to be accessed by children. For adults, the largest services must introduce optional tools to limit users’ exposure to legal content that encourages, promotes or provides instructions for suicide or self-harm. The Act also empowers Ofcom, the regulator, to disclose information from regulated services to coroners, if requested, following the death of a child.
Ofcom intends to implement the Act in three phases (PDF), with the entirety of the Act coming into force by the end of 2026.
The Act applies across the UK.
Content on user-to-user online services (for example, X, formerly Twitter, and Facebook) is also governed by the individual platform’s terms of service, for instance X’s suicide and self-harm policy or Meta’s suicide and self-injury community standards. Stakeholders, including the children’s charity NSPCC, have suggested that these systems do not do enough to protect users from harmful online content. In response, some services have changed their policies. For example, on 9 January 2024, Meta (the parent company of Instagram and Facebook) announced new protections for users of its services. This included beginning to remove self-harm related content (and other age-inappropriate content) from the “Feed and Stories” of teenage users, “even if it’s shared by someone they follow”.
There are two press regulators in the UK. Many titles have signed up to the Independent Press Standards Organisation (IPSO). The IPSO Editors’ Code of Practice states:
When reporting suicide, to prevent simulative acts care should be taken to avoid excessive detail of the method used, while taking into account the media’s right to report legal proceedings.
There may be exceptions to this clause (and others in the code) where they can be demonstrated to be in the public interest.
A smaller number of publications have joined IMPRESS. The IMPRESS Standards Code states:
When reporting on suicide or self-harm, publishers must not provide excessive details of the method used, the specific location or speculate on the motives.
Membership of both regulators is voluntary. Some publications, for example the Guardian, have joined neither regulator and have instead appointed their own internal readers’ ombudsmen.
Organisations have also published specific media guidelines for reporting suicide. For example, the Samaritans’ media guidelines for reporting suicide, and the National Union of Journalists’ guidelines for reporting mental health and death by suicide.
Ofcom, the UK’s communications regulator, has published a Broadcasting Code that sets the rules for programmes broadcast on television, radio and BBC on-demand services. Section 2 of the Code covers “harm and offence” and includes the following rules on violence, dangerous behaviour and suicide:
2.4: Programmes must not include material (whether in individual programmes or in programmes taken together) which, taking into account the context, condones or glamorises violent, dangerous or seriously antisocial behaviour and is likely to encourage others to copy such behaviour.
[…]
2.5: Methods of suicide and self-harm must not be included in programmes except where they are editorially justified and are also justified by the context.
Compliance with the code is the responsibility of individual broadcasters.