This Library Paper looks at Government plans to regulate content on social media.

What’s the problem?

There is increasing concern about harmful content and activity on social media. This includes cyberbullying, the intimidation of public figures, disinformation, material promoting violence and self-harm, and age-inappropriate content.

Critics, including parliamentary committees, academics, and children’s charities, have argued that self-regulation by social media companies is not enough to keep users safe and that statutory regulation should be introduced.

The Online Harms White Paper (April 2019) – a new regulatory framework?

An Online Harms White Paper was published in April 2019. This set out the then Government’s approach to tackling “content or activity that harms individual users, particularly children, or threatens our way of life in the UK.”

According to the White Paper, existing regulatory and voluntary initiatives had “not gone far or fast enough” to keep users safe. The Paper proposed a single regulatory framework to tackle a range of harms. At its core would be a statutory duty of care for internet companies, including social media platforms. An independent regulator would oversee and enforce compliance with the duty. 

A consultation on the proposals closed on 1 July 2019.

Reaction to the White Paper

The White Paper received a mixed reaction. Children’s charities were positive. The NSPCC said that the Paper was a “hugely significant commitment” that could make the UK a “world pioneer in protecting children online”. However, some commentators raised concerns that harms were insufficiently defined and that the Paper blurred the boundary between illegal and harmful content.

The Open Rights Group and Index on Censorship warned that the proposed framework could threaten freedom of expression.

Initial response to the White Paper consultation (February 2020)

On 12 February 2020, the Government published its initial response to the consultation on the White Paper. The response states, among other things, that the Government is minded to make Ofcom the regulator for online harms.

To protect freedom of expression, the new regulatory framework would not require companies in scope to remove specific pieces of legal content. Instead, they would be required to state explicitly what content and behaviour would be considered acceptable on their sites and to enforce this consistently and transparently. Illegal content would have to be removed “expeditiously”. Robust action would be required in relation to terrorist content and child sexual exploitation and abuse. The regulator would not investigate or adjudicate on individual complaints.

The proposed duty of care would only apply to companies that “provide services or use functionality on their websites which facilitate the sharing of user generated content or user interactions, for example through comments, forums or video sharing”. According to the Government, fewer than 5% of UK businesses would be in scope.

The Government is still considering issues such as the enforcement powers of the regulator and senior management liability.

A final response is due in spring 2020.

In a Commons debate on 13 February 2020, the Department for Digital, Culture, Media and Sport (DCMS) said that legislation would be introduced in the current parliamentary session.