Rather than debate the merits of the Act and whether the obligations and duties it imposes are proportionate to its overall aim, the time has come for businesses to understand, in practical and actionable terms, what they need to do to ensure that they don’t fall foul of the new legislation.
The first thing to determine is whether you offer a service which falls within the scope of the OSA. The OSA covers two types of services: user-to-user services (services which allow users to generate, upload or share content which may be encountered by other users) and search services.
Before breathing a sigh of relief too quickly, recognise that these are broad definitions, and that they capture commonplace features such as chat rooms or bulletin boards which appear on sites and platforms where that functionality isn’t the core service. So have a look through your website and determine whether you do, in fact, have any such features. If yes, read on.
If you do find that your business – or a particular bit of functionality offered on your website/platform – is caught by the scope of the OSA, it doesn’t mean that it is time to panic just yet. The OSA is designed to impose the most significant obligations on the companies with the most significant reach. We await secondary legislation to specify exactly which businesses fall within Category 1, but for now we know that Category 1 is reserved for the highest-risk, highest-reach user-to-user platforms. SMB predicts that this will be companies such as Meta, TikTok and X (that prediction does not require a crystal ball). Category 2A will cover the highest-reach search services (Google, Bing, etc.) and Category 2B will cover other platforms with high-risk functionality (i.e. all of the other user-to-user services).
What that means is that most companies offering functionality of the kind described above will fall into Category 2B: they are within the scope of the OSA, but will only be obliged to comply with its lower tier of obligations.
Appoint a member of staff to be responsible for understanding the legislation and for driving any changes required to ensure the business is compliant. That person’s responsibilities will include facilitating the training of other staff members, carrying out risk assessments with input from any other relevant members of staff (see point 3 below for further information), monitoring the guidance and secondary legislation which continues to be published and, depending on the size and risk profile of your business, reviewing content and/or managing the complaints procedures (see points 4 and 5 below). This should be a senior member of staff, and the role should be given considerable weight and importance – responsible members of staff can be found personally liable if their business is found to be in breach of the OSA.
Once you’ve determined that your business is in scope and you have appointed someone to be responsible for compliance, this individual should take proactive action to understand your company’s risk profile. The OSA sets out 15 categories of harm that it is seeking to prevent, so you should consider whether your service is likely to facilitate the creation or sharing of any content which falls within those 15 categories. You should also consider who your user base is, and whether it is likely to include children (which for the purposes of the OSA means anyone under the age of 18) or vulnerable adults. If so, your risk profile will be higher.
Its guidance remains a work in progress, but Ofcom has published initial guidance on how to carry out these risk assessments, which involves following four steps:
The process is analogous to carrying out a data protection impact assessment, as required under the UK GDPR. Businesses should, in the first instance, carry out a review of what is happening on their service and set up a system for revisiting that assessment at a frequency proportionate to the risk of harm identified.
For further information, please see: https://www.ofcom.org.uk/online-safety/information-for-industry/guide-for-services/risk-assessments
Given that you have a legal duty to take down harmful content, it is critical that you have a procedure for identifying it. This should comprise both an internal mechanism for regular reviews of the content available on your site (which can be carried out by software, or by giving someone within your organisation the duty of conducting regular reviews of the website) and a public-facing mechanism which allows visitors to your website to flag content they believe to be harmful quickly and easily.
However, keep in mind that reported content should not automatically be removed. Users of your service have a right to bring a claim against you if you remove their content, or ban them from your service, for allegedly sharing harmful content where it transpires that there was, in fact, no breach. This right exists to deter companies from adopting a procedure under which any reported content is removed immediately and without consideration.
That sanction is designed to protect the fundamental principle of freedom of expression (against the background of an Act which is otherwise largely concerned with policing and regulating speech) and creates an undeniably awkward tension for affected businesses. In practice, the tension between the two duties means that responsibility for removing content cannot be delegated entirely to software, nor discharged by a blanket approach of always removing or always retaining reported content. To satisfy both duties, a business will most likely need human input, and where the content is valuable (or the business’s reputation in relation to its content is valuable), final responsibility for nuanced or sensitive pieces of content should rest with someone who is familiar with the Act.
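For businesses building their own reporting tools, one way to embody this principle is to treat user reports as inputs to a human review queue rather than as triggers for automatic removal. The sketch below is a minimal, hypothetical illustration only – the class and method names are our own invention, not drawn from the Act or from Ofcom guidance:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ContentReport:
    """A user's report that a piece of content may be harmful."""
    content_id: str
    reporter_id: str
    reason: str
    reported_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


class ReviewQueue:
    """Collects reports for human moderation; never removes content itself."""

    def __init__(self) -> None:
        self._pending: dict[str, list[ContentReport]] = {}

    def submit(self, report: ContentReport) -> None:
        # Reported content is queued for review, not taken down automatically.
        self._pending.setdefault(report.content_id, []).append(report)

    def pending_review(self) -> list[str]:
        # Content IDs awaiting a human moderation decision.
        return list(self._pending)

    def resolve(self, content_id: str, remove: bool) -> bool:
        # A trained reviewer records the final decision; the return value
        # tells the calling code whether to take the content down.
        self._pending.pop(content_id, None)
        return remove
```

The design point is simply that removal is a deliberate, recorded human decision (`resolve`), kept separate from the act of reporting (`submit`).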
Depending on the risk profile you identified for your business when carrying out a Risk Assessment, you may decide that the responsible person can cover this role alone, or you may need to consider whether this is a function which is likely to be in high demand and will require a dedicated individual or team.
You will also need to establish a mechanism by which users can complain that the business has not removed unlawful content, has not responded to reported content, has breached an individual’s freedom of expression and privacy by removing their content unfairly, or has taken steps which unfairly mean that content relating to that person no longer appears in search results or is given a lower priority in them.
The procedure should be transparent and easy to use, including by a child. For example, you should be clear with users about who will be responsible for dealing with their complaint, how long you expect to take to respond, and the reasoning behind any decisions reached as a result. It would be good practice to draft a complaints policy and make it available on the website, similar to a privacy policy.
Technically, it should be straightforward to ask your web designers to add this functionality to the website; the difficulties lie behind the scenes. As set out at point 5 above, whether you will need a dedicated team to handle these complaints, or whether your responsible person can absorb the role, depends entirely on your risk profile and the volume of content you process. Businesses should establish a procedure based on the risk profile identified, and be prepared to scale up this role in the event that more complaints are received than expected.
Terms and conditions should be updated so that users of your service are aware that you have identified a risk of harm to users of your services, and that you will take proactive steps to remove any content which is harmful. Your terms and conditions should expressly state that where users share content, it must not be harmful.
As set out in point 4, you should also highlight that users have a right to bring a claim against you if their content is removed where it was not, in fact, harmful.
You may choose to incorporate your complaints policy (see point 5) into your terms, or simply refer to it there if you prefer to keep it as a standalone policy.
The new legislation has lurched into reality, but we are still in the dark about large swathes of it and how it will work. Much of the Act’s fine detail, and regulatory expectation, is due to be filled in by Ofcom as it issues guidance over the course of this year. Businesses will need to take proactive steps to comply, but should also be prepared to revisit these conversations until a routine has been established which embeds the duties set out in the new legislation into the day-to-day running of the business without it becoming commercially unviable.