Can governments regulate the internet? Concerns about the misuse of social media, online abuse and fake news have grown louder over the past year. The previous Government’s policy was to work internationally to ensure the continuation of a “free, open and secure internet that supports our economic prosperity and social well-being”.

Self-governance is the norm online: social media websites have ‘community’ rules, while consumer feedback and reviews build consumer confidence. There is no specific law governing internet content in the UK, but laws and offences that apply offline apply equally online. For those seeking more rules, regulating an evolving area is difficult: there is a risk that technology develops faster than the law.

Fake news and what to do about it

Defining fake news can be challenging. The BBC News definition is:

False information deliberately circulated by those who have scant regard for the truth but hope to advance particular (often extreme) political causes and make money out of online traffic.

The BBC adds that social media often presents real and fictional stories in a similar way, making them difficult to tell apart.

Some argue the best way to tackle the issue is through education – helping users to assess online information critically. Some social media sites have started to tackle fake news themselves: Facebook, for example, shows users warnings when they are about to read potentially false information.

Others say the influence of fake news has been overstated and efforts to tackle it could amount to censorship.

Online abuse

Several existing offences, scattered across different legislation, are used to deal with online abuse (e.g. “trolling” and threatening behaviour). The Crown Prosecution Service (CPS) has published guidelines on prosecuting such offences in relation to social media. A recent addition, aimed more specifically at online behaviour, is the new “revenge pornography” offence under section 33 of the Criminal Justice and Courts Act 2015.

The previous Parliament saw calls for specific legislation to deal with online abuse in a parliamentary debate on the issue. The Home Affairs Select Committee’s April 2017 report on hate crime pointed out that “most legal provisions in this field predate the era of mass social media use and some predate the internet itself.” It called for the Government to review the entire legislative framework governing online hate speech, harassment and extremism, and for social media companies to do more to safeguard users.

The Government announced in February 2017 that a Green Paper on online safety for young people would be published this summer.

Protecting children – age verification for online pornography

The Digital Economy Act 2017 will force commercial providers of online pornography to have age controls in place if they want to make adult material available in the UK. The intention is to protect under-18s from “distressing or unrealistic images of sex”, which may harm their ability to “develop healthy personal relationships based on respect and consent”.

The British Board of Film Classification (BBFC) will, subject to parliamentary approval, act as the age verification regulator. It will direct Internet Service Providers (ISPs) to block access to websites which do not have age verification controls in place or which make “extreme pornography” available (the possession of which is already illegal).

When the Digital Economy Bill was introduced, the Government planned to prevent underage access to material that would be rated 18 or R18 by the BBFC – i.e. the same standards would apply to adult material in the offline and online worlds.

Critics claimed the Government’s original proposals would result in censorship of “non-conventional” sex acts. The Government acknowledged these concerns and introduced amendments in the Lords, saying the Bill’s provisions were concerned with child protection, not censorship.

Following those amendments, where a site does have age verification controls in place, the BBFC will only be able to direct ISPs to block it if it makes “extreme pornography” available. This was controversial, with some Lords concerned that it would still allow access to violent and abusive material.

The Government said that its internet safety strategy would look at these issues and pointed out that content behind age verification controls could still be prosecuted under legislation such as the Obscene Publications Act 1959.

Technology and the internet will continue to evolve, and how these changes should be managed and regulated will be on Parliament’s agenda for years to come.

This article is part of Key Issues 2017 – a series of briefings on the topics that will take centre stage in UK and international politics in the new Parliament.

How is content driven to users?

Social media platforms and internet search engines use algorithms (sequences of instructions) to personalise the content that each individual user sees.

Content on social media platforms includes posts generated by users and third parties, and adverts. The order in which it is displayed is typically determined by these algorithms and is based on the user’s activity and connections. Material produced by the people or organisations that a user interacts with most, and content which garners the most interactions, will appear higher in the list.
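How such ranking might work can be illustrated with a toy model. The Python sketch below is purely illustrative – the `Post` fields, the `affinity` weights and the scoring formula are assumptions invented for this example, not any platform’s actual algorithm – but it shows the basic idea: each post is scored by combining the user’s affinity for its author with the post’s overall engagement, and the feed is sorted by that score.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    likes: int
    comments: int

def rank_feed(posts, affinity):
    """Order posts by a simple relevance score.

    `affinity` maps each author to how often the user interacts
    with them (higher = more interaction); 0.1 is a default for
    accounts the user never engages with. All weights here are
    illustrative assumptions, not a real platform's formula.
    """
    def score(post):
        # Engagement: posts attracting more interactions score higher.
        engagement = post.likes + 2 * post.comments
        # Affinity: accounts the user interacts with most are weighted up.
        return affinity.get(post.author, 0.1) * engagement

    return sorted(posts, key=score, reverse=True)

# A user who interacts with "alice" far more than "bob" sees
# alice's less popular post first.
posts = [Post("bob", likes=120, comments=10),
         Post("alice", likes=30, comments=5)]
print(rank_feed(posts, affinity={"alice": 5.0, "bob": 0.5}))
```

Because the score depends on each user’s own interaction history, two users following the same accounts can see the same posts in quite different orders.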

Internet search engines use algorithms to find and order the results that are likely to be most relevant to the user. The ranking depends on numerous factors, including the user’s previous searches and location.
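A real search engine combines many more signals, but the same toy approach can illustrate how relevance and personalisation might be blended. In the sketch below, the scoring weights and the `history_boost` and `location_boost` terms are invented for illustration and do not reflect any actual search engine’s formula.

```python
def rank_results(query, pages, history, user_location):
    """Toy search ranking: textual relevance plus two
    personalisation signals (all weights are illustrative)."""
    terms = set(query.lower().split())

    def score(page):
        words = set(page["text"].lower().split())
        # Base relevance: how many query terms the page contains.
        relevance = len(terms & words)
        # Personalisation: boost topics the user has searched before...
        history_boost = sum(1 for past in history
                            if set(past.lower().split()) & words)
        # ...and results local to the user.
        location_boost = 2 if page.get("region") == user_location else 0
        return relevance + 0.5 * history_boost + location_boost

    return sorted(pages, key=score, reverse=True)

# Two users issuing the same query can see different orderings:
# a user in Leeds with a history of restaurant searches sees the
# local restaurant page first.
pages = [
    {"text": "pizza dough recipe", "region": None},
    {"text": "best pizza restaurants", "region": "Leeds"},
]
print(rank_results("pizza", pages,
                   history=["restaurants near me"],
                   user_location="Leeds"))
```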