Striking the Balance between Freedom of Expression and Online Safety

2nd May 2023

The Government describes the Online Safety Bill as its way of making the UK “the safest place in the world to be online while defending free expression”, but could the Bill’s attempts to protect freedom of expression actually provoke censorship?  

Exempting “news publisher content”

Under the Bill, “news publisher content” is afforded a number of special protections, intended to safeguard freedom of expression and the free press. “News publisher content” is defined as:

  1. Content published by “recognised news publishers” on their own websites. Recognised news publishers are (i) the BBC; (ii) Sianel Pedwar Cymru; (iii) holders of a licence under the Broadcasting Act 1990 or 1996 who publish “news-related material” (defined under section 50 of the Bill as material consisting of news or information about current affairs, opinion about current affairs, or gossip about celebrities, other public figures or other persons in the news); or (iv) any other entity which, amongst other requirements, has as its “principal purpose” the publication of news-related material;
  2. Content posted by “recognised news publishers” on Category 1 services (which, although not defined under the Bill, are understood to be regulated user-to-user services with a large user base, for example, Meta or Twitter);
  3. Content posted by other users that was originally published by a “recognised news publisher”.

How does the exemption work in practice?

Recognised news publishers are under no obligation to comply with the legal duties imposed under the Bill regarding either their own published content or ‘below-the-line’ comments contributed to their own websites. This means, for example, that nothing published on the BBC website comes within the scope of the Bill.

Further, Category 1 services are not required to moderate or remove “news publisher content” published on their platforms. For example, if the BBC posts an article on Twitter and another Twitter user (who is not a recognised news publisher) shares that article on the platform in full, Twitter will have no legal obligation under the Bill to moderate it. The exemption only applies where content is reproduced in full, however: screenshots, excerpts or photographs of an article or recording would not benefit from it.

What about specialist publishers’ content?

Despite this carve-out, these protections have been criticised for being too narrow. The Professional Publishers Association, for instance, has raised concerns that other publishers – such as specialist business magazines or academic journals – whose principal purpose is not the publication of “news-related material” (as defined above and under section 50 of the Bill) are unlikely to fall within the scope of this protection. In practice, tech companies will be required to moderate specialist publishers’ content and remove it should it breach the proposed legislation. It may be, however, that such content could be afforded protection under section 15 of the Bill as “journalistic content”.

The wider duty to protect freedom of expression

Under section 18 of the Bill, all tech companies are required to “have particular regard” to the importance of free expression. Under sections 15 and 13, Category 1 services have a further duty to put in place systems and measures to ensure that they have “taken into account” the importance of free expression of both “journalistic content” and “content of democratic importance” when making moderation decisions about content on their platforms. This means that tech companies should have policies in place that “counterbalance the importance of giving journalism free expression against other objectives which might otherwise lead to it being moderated, and implement this policy consistently”. However, “have regard” and “take into account” are vague terms, and it is unclear how far online platforms will have to (or will decide to) go in acknowledging freedom of expression in their decision-making processes.

The definition of “journalistic content” under the Bill is also somewhat ambiguous. It is defined as “news publisher content” or “regulated user-generated content” created “for the purposes of journalism” which is “UK-linked”. While the Government has confirmed that this will encompass citizen journalism, neither the current Bill nor the Government provides any clarification as to what is meant by “for the purposes of journalism”. Defining and identifying “journalistic content” will therefore likely be left either in the hands of Ofcom – the proposed legislation’s independent regulator – or the Category 1 companies themselves.

As for “content of democratic importance”, this must be “content specifically intended to contribute to democratic political debate in the United Kingdom or a part or area of the United Kingdom”. In practice, working out whether a user “specifically intended to contribute” to democratic political debate will be a tall order for the tech companies tasked with moderating content, and it is perhaps unsurprising that Meta has branded this “unworkable in practice”.

A risk of over-censorship?

In passing through the various Parliamentary stages, the Bill has gone through several rounds of review and amendment. Despite this, it remains far from user-friendly: readers must sift through numerous definitions, many of which remain unclear. In light of the Bill’s interpretive challenges and the penalties for failing to comply with its duties (which include imprisonment and fines of up to 10% of a tech company’s global turnover), companies may take an overly cautious approach when moderating content in order to ensure compliance, especially if these terms remain vague and open to interpretation. Freedom of expression advocates – and tech companies themselves – have raised concerns over the risk of over-censorship that this may bring about, running counter to “the Bill’s ambition to protect freedom of expression”.

Whilst the Bill’s expedited appeals process does offer some protection to recognised news publishers whose content is at risk of removal (for example, by requiring tech companies to give them notice and a chance to respond before taking action), other users who are not recognised news publishers but who post journalistic content or content of democratic importance on social media platforms still run the risk of having their content removed or their accounts banned before they are able to appeal the decision. Given the speed of news cycles, a post-removal right of appeal will likely be considered insufficient by such users and may stifle journalism.

A full understanding of how this new legislation will operate in practice is unlikely to be possible until Ofcom ‘fills in the gaps’ through its Code of Practice and the Government provides further clarity through secondary legislation. Whilst a degree of flexibility in protecting freedom of expression is welcome, the risk of over-censorship and its potential impact on free expression should not be overlooked.

Simons Muirhead Burton LLP provides advice to companies regarding the Online Safety Bill. If you have any questions, please do not hesitate to contact Emma Linch or Jeffrey Smele.