
Table of contents
1. What is freedom of expression and how is it treated online?
1.1 The rights and limits to freedom of expression
Freedom of expression is the right to express and receive opinions, ideas and information. Expression and exchanges of views increasingly take place online, including through social media platforms, websites and search engines.
The right to freedom of expression is balanced by the responsibilities held by government, the media, technology companies and citizens. It is not an unrestricted right and is subject to legal limits. While the UN General Assembly recognised in December 1948 that freedom of expression was a fundamental right to be universally protected, subsequent international agreements have recognised there can be limits to this right. For example, the European Convention on Human Rights (ECHR), adopted in 1950, was clear that the right may be limited by law. Article 10 of the convention states that “everyone has the right to freedom of expression” but that this freedom may be subject to restrictions for a variety of reasons, including to protect the rights of others:
The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.
In the UK, the Human Rights Act 1998 brought Article 10 of the ECHR into domestic law.
UK criminal or civil law applies both online and offline and can be relevant to online communication and activity. The right to freedom of expression is subject to a range of restrictions in UK law, including the:
- Malicious Communications Act 1988 and the Communications Act 2003, which criminalise “indecent or grossly offensive” messages and threats
- Public Order Act 1986, which contains offences for stirring up hatred on the grounds of race, religion or sexual orientation
- Terrorism Act 2006, which criminalises the publication and dissemination of material that could be seen as encouraging acts of terrorism
Further information on laws that may be relevant to freedom of expression can be found in chapter 2 of the House of Lords Communications and Digital Committee report ‘Free for all? Freedom of expression in the digital age’.
1.2 Regulation of freedom of expression online
Various regulators have roles in relation to different forms of online activity, including Ofcom, the Competition and Markets Authority and the Advertising Standards Authority. Their remits cover matters such as the use of individuals’ data and information, and misleading or age-inappropriate online advertising.
However, there are concerns about the lack of overarching regulation in the UK, and of regulation specifically covering social media and search services. This issue has been highlighted by the government in the explanatory notes to the Online Safety Bill, which state that “at present, most user-to-user and search services operating in the United Kingdom are not subject to any regulation concerning user safety”. An exception to this is the statutory video-sharing platform regime, which requires service providers to protect individuals from certain harmful content and is enforced by Ofcom.
The government introduced the online safety legislation to improve regulation, reasoning that “in light of the serious harm that content online can cause to users, more wide-reaching and comprehensive regulation of online services should be introduced”. Further details on the Online Safety Bill are set out below.
In addition, social media companies self-regulate through community standards and terms of use. These focus on material that, while not illegal, may be harmful, such as bullying, misleading or indecent content. The government has also published a statutory code of practice it expects social media companies, and other hosting websites, to adhere to. It explains that the code:
[…] sets out actions that the government believes social media platforms should take to prevent bullying, insulting, intimidating and humiliating behaviours on their sites. This code of practice does not affect how illegal or unlawful content or conduct is dealt with.
Further information on digital regulation can be found in the House of Lords Library briefing ‘Digital regulation’ (18 July 2022) and the House of Commons Library briefing ‘Regulating online harms’ (15 March 2022).
1.3 Policies on freedom of expression online
Under Boris Johnson’s premiership, the government sought to address the issue of freedom of expression online through the Online Safety Bill. The bill aims to:
- preserve and enhance freedom of speech online
- increase user safety online
- improve law enforcement’s ability to tackle illegal content online
- improve users’ ability to keep themselves safe online
- improve society’s understanding of the harm landscape
The legislation addresses online content that is:
- illegal
- legal but harmful to adults
- harmful to children
The legislation was published in draft on 12 May 2021 and introduced into the House of Commons on 17 March 2022. The bill has not yet completed its passage through the House of Commons.
The House of Lords Communications and Digital Committee launched its inquiry on freedom of expression online on 29 October 2020. It concluded its report while the draft bill was still being considered by the Joint Committee on the Draft Online Safety Bill, and before the introduction of the bill in Parliament.
Boris Johnson’s government also set out its strategy on regulating freedom of expression in the digital landscape in several policy documents, including:
- ‘Digital regulation: Driving growth and unlocking innovation’, published in July 2021. The document set out the government’s overall vision for governing digital technologies. It stated three objectives: promoting competition and innovation, keeping the UK safe and secure online, and promoting a flourishing, democratic society.
- ‘A new pro-competition regime for digital markets: Consultation document’, published in July 2021. The government set out its plans for reforming digital markets. It included proposals on the Digital Markets Unit’s (DMU) objectives and how it should work with other regulators. It also proposed giving the DMU statutory powers. The DMU was set up in a non-statutory form within the Competition and Markets Authority in April 2021 to “support the rapid establishment” of the statutory competition regime for digital markets.
- ‘Online media literacy strategy’, published in July 2021. The strategy and accompanying action plan set out the government’s three-year plan to coordinate media literacy education and give users the skills needed to make safe choices online.
2. What did the committee find?
2.1 Overview
On 22 July 2021, the House of Lords Communications and Digital Committee published its report ‘Free for all? Freedom of expression in the digital age’. The report focused on how public policy could protect the right to freedom of expression online and how it should be balanced with other rights. It considered:
- how competition between platforms could be increased to the benefit of freedom of expression
- how the government should ensure illegal content is removed and legitimate opinions are not censored
- how the design of online platforms and digital citizenship education could support inclusive debate and exchange of information, ideas and opinions
The committee took the view that while freedom of expression was the “hallmark of free societies”, it was not an “unfettered right”. It argued that one person’s “abuse of their right to freedom of expression” could have a “chilling” effect on others, “leaving them less able to express themselves freely”.
The committee found that the internet had given people an “unprecedented” ability to express and share their views. It welcomed this and wanted freedom of expression strengthened online.
However, the committee found that the digital environment was dominated by a small number of private companies that had too much control. The committee criticised the companies’ freedom to ban whoever and whatever they wished, as well as to design their platforms to encourage and amplify certain types of content over others. The committee argued that social media platforms were not “neutral means” of communication. Instead, the way they are designed “shapes what users see, what they say, and how they interact”. The committee suggested that the platforms were “too often” guided by “concern for their commercial and political interests”. It stressed that the “benefits of freedom of expression must not be curtailed by these companies”.
The committee argued the government should create a new regulatory approach:
The rights and preferences of individual people must be at the heart of a new, joined-up regulatory approach, bringing together competition policy, data, design, law enforcement, and the protection of children. The UK has an opportunity to lead the world and set a standard to which fellow democracies can aspire.
2.2 Committee recommendations
The committee stated that the harm users could suffer online had received increased attention in recent years and welcomed several of the proposals put forward in the draft Online Safety Bill to address the issue. These included the requirement for platforms to remove illegal content and the government’s focus on protecting children from harm.
However, it disagreed with some of the other proposals in the draft legislation, such as measures to remove legal content that may be harmful to adults. The committee argued there were alternative approaches which would be more effective and better at protecting freedom of expression online. The committee made recommendations for measures in three areas:
- regulating content on social media platforms
- empowering users
- promoting choice
2.3 Regulating content
The committee found that the largest platforms had “monopolised” the market and had “unprecedented control” over what the public could say online with their power to censor views. It acknowledged that moderating views on social media was difficult, particularly since algorithms could not understand context. However, the committee argued that moderation decisions were “often unreasonably inconsistent and opaque” and sometimes influenced by commercial and political factors.
The committee made several recommendations:
- Platforms should have duties to be impartial in their moderation of political content. Any definition of “content of democratic importance” should ensure that contributions to all political debates are covered. Therefore, the definition would not be limited to debates initiated by politicians and political parties about public policy. The protections should cover the content of the platform’s terms and conditions.
- Platforms should not look to be “arbiters of the truth”. Content should only be removed in exceptional circumstances.
- Freedom of expression should be one of the underlying principles in a duty of care approach. The framework for digital regulation should be flexible and able to adapt quickly to changes. The framework should set “clear expectations” for platforms.
- Platforms should be required to remove illegal content. Ofcom should set strict timelines within which platforms should remove this kind of content. A platform would not be compliant if it systematically failed to remove illegal content or systematically removed legal content.
- The government should make sure all pornographic websites are in scope of the online safety regime and held to the highest standard. The committee argued that a “supposed inability to enforce age verification [was] no excuse”. It said that technological advances in age verification tools “suggest it has been a missed opportunity for the government to make clear on the face of the draft [online safety] bill” that websites hosting pornographic content will be blocked for children.
- Content that is seriously harmful to adults should be defined and criminalised through primary legislation. Content which is legal but may be harmful to adults should be addressed through strong regulation of the design of platforms, digital citizenship education and competition regulation. The committee did not support placing a duty on platforms to remove content that was legal but may be harmful to adults. It argued this could lead to “unprecedented interference” in freedom of expression.
- A joint committee of Parliament should be established to scrutinise the work of the digital regulators, the independence of Ofcom and statutory instruments on digital regulation.
2.4 Empowering users
The committee heard evidence that the design of platforms shaped how their users behaved online. Witnesses argued that in some cases it was in the platforms’ interest to encourage “heated and uncivilised exchanges to hold users’ attention”. The committee welcomed design changes that encourage users to behave responsibly, such as Twitter prompting users to read articles before retweeting.
The committee recommended the following measures:
- Platforms should be required to show they have taken “proportionate” steps to make sure their platform design mitigates the risk of encouraging and magnifying uncivil content. The duty should apply to new and existing services.
- Platforms should be required to share information about their design with accredited researchers and have systems to share best practice with competitors.
- Platforms should give users more control over the content they see. Larger platforms should give users easily accessible toolkits of settings that allow users to decide what content they see and from whom. They should make the safest settings the default. Ofcom should have oversight of the toolkits and allow other platforms to opt in to the requirements.
- Digital citizenship should be a central part of the government’s media literacy strategy, with “proper funding”. Education in schools should cover both digital learning and appropriate behaviour online. It should promote civility and inclusion and teach how these can be practised online. This should be included in computing, personal, social, health and economic (PSHE) education and citizenship education lessons.
- The government should commission Ofcom to research the motivations and consequences of trolling. The research should be used to inform public information campaigns that highlight the upset online abuse causes.
- Ofcom should be required to assist in coordinating digital citizenship education between civil society organisations and the digital sector. Social media companies should also highlight digital citizenship campaigns on their platforms.
2.5 Promoting choice
The committee argued that increasing competition was crucial to promoting freedom of expression online. It believed that in a more competitive market platforms would have to be more responsive to users’ concerns about freedom of expression and other rights.
The committee made the following recommendations in this area:
- The government should give statutory powers to the Digital Markets Unit (DMU). The committee suggested this was more important than the Online Safety Bill because of the “impact of competition on freedom of expression and privacy standards”.
- The DMU should have power to make structural interventions to increase competition, including working with international partners to block mergers and acquisitions, forcing companies such as Google to share click and query data with rivals and preventing companies from paying to be the default search engines on mobile phones. It also said the DMU should include human rights in its assessment of consumer welfare.
- Ofcom should be required to consider and report on competition in the market.
- The government should introduce a category in its regulatory regime for platforms which are based abroad or have very few UK users, such as local newspapers and message boards for people with niche interests. There would be less onus on these platforms to proactively prove their compliance with the new safety regime. This would reduce the risk of smaller platforms blocking access from the UK to escape the burden of the regulations.
- A mandatory bargaining code should be introduced to ensure fair negotiations between platforms and publishers. The code should cover how platforms use publishers’ content. The code should include the possibility of independent arbitration.
2.6 Government response
The government published its response to the report in October 2021. It said it welcomed the report and was “committed to maintaining a free, open and secure internet” in line with “our democratic values”.
Setting out its strategy to increase user safety and protect pluralism online, the government highlighted several measures. These included:
- the Online Safety Bill, designed to tackle illegal content, protect children and empower users, while protecting freedom of expression online
- a media literacy strategy and safety by design guidance, giving users the skills to make safe choices online while supporting companies to build safety into their platforms
- a consultation on the pro-competition regime for digital markets, which the government said would seek views on the powers of the DMU and would legislate when parliamentary time allowed
Addressing harmful content accessed by adults, the government said that companies would not be required to remove legal content. It said the Online Safety Bill would require larger platforms to set out clearly in their terms of service whether they allowed legal content that posed a risk of harm, and how that content was treated on the platform.
On the issue of protecting children from online pornography, the government said it expected companies to use age verification technologies to prevent children from accessing services which posed the highest risk of harm. It said that companies would need to put in place these technologies or show that their alternative approach gave the same level of protection for children.
The government also stressed that companies would need to have clear systems in place to protect content that was intended to contribute to democratic political debate in the UK, whether the creator of the content was from the government or an individual political campaigner.
3. What has the government said recently?
The Online Safety Bill is currently awaiting its final day of report stage in the House of Commons. It was originally scheduled to take place on 20 July 2022. However, this was postponed following Boris Johnson’s announcement on 7 July 2022 that he was resigning as prime minister. The new prime minister, Liz Truss, has since sought to give assurances about the progress of the bill:
I can assure my right hon and learned friend that we will be proceeding with the Online Safety Bill. There are some issues that we need to deal with. What I want to make sure is that we protect the under-18s from harm and that we also make sure free speech is allowed, so there may be some tweaks required, but certainly he is right that we need to protect people’s safety online.
The Financial Times reported on 7 September 2022 that officials were working to change the definition of what was “legal but harmful”. The intention was to give people greater scope to say online what would be acceptable in person, even if someone found it offensive.
The government published the outcome of its consultation on a new pro-competition regime for digital markets in May 2022, after the committee published its report. It confirmed plans to place the DMU on a statutory footing, with the core objective to “promote competition in digital markets within and outside the UK for the benefit of consumers”. The government said it would bring forward legislation to implement the reforms when parliamentary time allowed.
The government also published its digital strategy in June 2022, setting out its vision for “building a more inclusive, competitive and innovative digital economy”. As part of the strategy, the government published an initial outcome monitoring framework for digital regulation and invited views from stakeholders on how to measure regulatory outcomes. The outcomes it is seeking to assess include:
- promoting competition and innovation
- keeping the UK safe and secure online
- promoting a flourishing democratic society, with digital technologies that support democratic engagement and preserve freedom of expression and human rights
4. Further reading and commentary
Some have raised concerns about the possible impact of the Online Safety Bill on freedom of expression. The following sources provide commentary on this matter, and the subject more generally:
- House of Commons Library, ‘Analysis of the Online Safety Bill’, 8 April 2022
- Ofcom, ‘Online Safety Bill: Ofcom’s roadmap to regulation’, 6 July 2022
- Politico, ‘New UK internet law raises free speech concerns, say civil liberties campaigners’, 29 June 2021
- Big Brother Watch, ‘Online Safety Bill latest: What can we expect to happen next?’, 1 September 2022
- Prospect, ‘Why the Online Safety Bill threatens free speech’, 16 June 2022
- Freedom House, ‘Freedom on the net 2021: United Kingdom’, accessed 17 October 2022
- Policy Institute at King’s College London, ‘Freedom of speech in the UK’s “culture war”’, May 2022