Regulating social media: “there needs to be reform but it must be workable”

4th March 2019

The clear message from experts debating the regulation of Big Tech is that the sector needs reform, but that plans to regulate social media more tightly must be realistic and workable. Last week’s Media Society debate (28 February) was hosted by Simons Muirhead Burton, the leading West End law firm, and featured panellists representing not only the tech industry but also broadcasters and expert commentators.

The BBC’s Technology correspondent, Rory Cellan-Jones, made the point that the debate has come a long way from a decade ago. Then there was “arrogance” on the tech side (freedom of the internet was paramount) and “ignorance” on the part of politicians. Now politicians are much more alert to the issues but seem awed by the sheer size and economic dominance of the tech giants, whose primary aims are money and growth. Rory argued that the most effective way to regulate them might involve breaking them up and reducing their monopoly power.

Martin Moore of King’s College London, an academic expert on Big Tech, supported such a move. On the question of harm on social media, Martin underlined the sheer immensity of the task: the flood of material being uploaded every minute and the huge number of monitors required to regulate it effectively. He noted that one form of effective regulation of social media already existed in China, through the use of algorithmic computer programmes (so that “unacceptable” material was sent but never received). But this method was blunt and opaque, and would raise serious issues of freedom of expression if used by Western democracies. Martin said one way forward might be to develop the idea of social media companies owing a specific “duty of care” towards their users, akin to that owed by sports stadium owners towards spectators. He also pointed to the lack of political will in the UK to resource institutions like the Electoral Commission properly and to equip them with adequate powers to deal with “fake news” and the lack of transparency in political advertising.

Dorothy Byrne, Head of Current Affairs at Channel 4, eloquently voiced the concerns of ordinary British citizens. She said that social media companies were too often putting money before the safety of their users. Most people in the UK had a fairly clear idea of what constituted unacceptable “harmful” material when posted on social media, and were losing patience with politicians and some experts who focussed on the difficulties of tighter regulation rather than on finding solutions, some of which might be imperfect but were better than doing nothing.

Vinous Ali, Head of Policy for Tech UK, representing its more than 900 tech members, underlined that the industry was not opposed to tighter regulation. But any regulation must be practicable and flexible, suitable for tech companies large and small, and not developed as a panic response to negative media coverage, aimed at solving perceived problems at one Big Tech giant such as Facebook. The industry needed, for example, a clear definition of what “harm” was. Her comments echoed those made by the Internet Association in its letter to the government on 1 March.

There was a general consensus that the DCMS Select Committee’s proposals to force social media companies to adopt and follow a new code of practice or face large fines were not well thought through. The government’s forthcoming White Paper (drafts of which, according to panel chair Phil Harding, were already circulating around Whitehall) was awaited with great interest but considerable scepticism. As Martin Moore commented, there was no “silver bullet” to solve all these complex policy and legal issues.