Harmful online content and activity includes cyberbullying, racism, misogynistic abuse, pornography, and material promoting violence and self-harm. The Covid-19 pandemic has seen social media platforms used to spread anti-vaccine disinformation.

Critics, including parliamentary committees, academics, and children’s charities, have argued that self-regulation by internet companies is not enough to keep users safe and that statutory regulation should be introduced.

The Online Harms White Paper (April 2019)

The Online Harms White Paper argued that existing regulatory and voluntary initiatives had “not gone far or fast enough” to keep users safe. It proposed a single regulatory framework to tackle a range of harms. At its core would be a duty of care for internet companies, including social media platforms. An independent regulator would oversee and enforce compliance with the duty. A consultation on the proposals closed in July 2019.

The White Paper received a mixed reaction. Children’s charities were positive. However, some commentators raised concerns that harms were insufficiently defined. The Open Rights Group and the Index on Censorship warned that the proposals could threaten freedom of expression.

Government response to the White Paper consultation (December 2020)

A full Government response to the White Paper consultation was published in December 2020. This confirmed that an Online Safety Bill would be introduced to impose duties on content-sharing platforms and search services to keep users safe. Ofcom would be the regulator.

Draft Online Safety Bill (May 2021)

A draft Online Safety Bill was included in the Queen’s Speech of 11 May 2021. The draft Bill was published the following day, along with Explanatory Notes, an Impact Assessment and a Delegated Powers Memorandum.

Pre-legislative scrutiny

A Joint Committee of both Houses was established in July 2021 to scrutinise the draft Bill. The Committee’s report, published on 14 December 2021, described the Bill as “a key step forward” in bringing “accountability and responsibility to the internet”. However, the Committee argued that the Bill should be restructured so that its objectives were clear from the beginning.

The Committee put forward what it referred to as a “cohesive set of recommendations” to strengthen the forthcoming legislation. These included agreeing with Law Commission recommendations on new communications offences. The Committee also recommended that:

  • all pornography sites should have duties to stop children from accessing them, regardless of whether the sites hosted user-to-user content.
  • individual users should be able to complain to an Ombudsman when platforms failed to comply with their obligations.
  • a senior manager should be designated as the “safety controller” with liability for a new offence – failing to comply with their obligations when there was clear evidence of repeated and systemic failings that resulted in a significant risk of serious harm to users.

Other select committee reports

The draft Bill has also been examined in reports from the Digital, Culture, Media and Sport (DCMS) Committee and the Petitions Committee.

February/March 2022: Government announces changes to the forthcoming Bill

The Government has said that the Online Safety Bill will be introduced “as soon as possible”. In February and March 2022, the Government announced changes to the forthcoming Bill. 

Communications offences

On 4 February, the DCMS announced that it was accepting the Law Commission’s recommendations for a harm-based communications offence, a false communications offence, and a threatening communications offence. The offences would be brought into law through the Bill. The Government was considering the Commission’s other recommendations for offences relating to cyberflashing, hoax calls, encouraging or assisting self-harm, and epilepsy trolling.

Priority offences

On 7 February, the DCMS announced that it would set out further priority offences on the face of the Bill (offences relating to terrorism and child sexual abuse and exploitation were already listed). This was in response to recommendations from the Joint Committee on the draft Bill, the DCMS Committee and the Petitions Committee. The offences would include incitement to and threats of violence, hate crime, and financial crime. Listing these offences in the Bill would mean that companies would not have to wait for secondary legislation before taking proactive steps to tackle priority illegal content.

Protecting children from pornography

On 8 February, the DCMS announced that the Bill would be strengthened so that all providers who published or placed pornographic content on their services would need to prevent children from accessing that content. This was in response to concerns that non-user-generated pornography was not within the scope of the draft Bill.

Online abuse

On 25 February, the DCMS announced that, to tackle online abuse, including anonymous abuse, the Bill would impose two additional duties on category 1 service providers (i.e. the largest platforms):

  • a “user verification duty” would require category 1 providers to give adult users an option to verify their identity. Ofcom would publish guidance setting out how companies could fulfil the duty and the verification options that companies could use.
  • a “user empowerment tools duty” would require category 1 providers to give adults tools to control who they interacted with and the legal content they could see.

This was in response to concerns raised by the Joint Committee on the draft Bill, the DCMS Committee, and the Petitions Committee about the impact of abuse and the need to give users more control over who they interacted with.

Paid-for adverts

On 9 March, the DCMS announced that category 1 service providers and search services would have a duty to prevent the publication of paid-for fraudulent adverts. This was in response to recommendations from the Joint Committee on the draft Bill, the DCMS Committee and others.

The Government also announced a consultation on the Online Advertising Programme. This would complement the Bill and seek views on improving transparency and accountability across the online advertising supply chain.

Cyberflashing

On 14 March, the DCMS announced that the Bill would create a new criminal offence of cyberflashing, framed as the Law Commission had recommended. A new section 66A would be inserted into the Sexual Offences Act 2003 to criminalise:

intentionally sending or giving a photograph or film of any person’s genitals to another person with the intention that that person will see the genitals and be caused alarm, distress or humiliation, or for the purpose of obtaining sexual gratification and reckless as to whether the recipient will be caused alarm, distress or humiliation.

Related Library Briefing

A selection of comment on the draft Bill is available in the Library Paper, Reaction to the draft Online Safety Bill: a reading list.

