Ofcom has issued a stark warning to television services about the use of ‘deep fakes’ in programmes (see Broadcast Bulletin, 3 April 2023). Although the regulator’s new Note to Broadcasters does not refer to the ‘deep fake’ images of the ‘arrest’ of former US President Donald Trump which have been circulating on social media, Ofcom’s warning is clearly linked to them. SMB consultant and former Ofcom Director Trevor Barnes explains why Ofcom has taken this step and what TV broadcasters need to do in response.
Ofcom asks all TV channels to check their compliance processes to ensure they take into account “the potential risks involved in the use of synthetic media technologies to create broadcast content.” This reflects the serious risks that ‘deep fakes’ pose to broadcasters and their audiences.
RISK ONE is mis/disinformation. ‘Deep fakes’ are used to create fake news and other disinformation that spreads online, which makes it harder for broadcast journalists and documentary makers to select authentic images from online sources. If a journalist mistakenly broadcasts ‘deep fake’ pictures in news or current affairs programming and viewers are misled, the consequences would be significant. Ofcom takes a very dim view of such cases and might fine the service (see Rules 2.2 and 5.1 of the Broadcasting Code).
It is not only ‘deep fakes’ that cause Ofcom concern. Sometimes video game footage is so realistic that it can be confused with the real thing. This happened several years ago, when the serious ITV documentary Exposure: Gaddafi and the IRA caused a furore after it was revealed to have included video game footage (captioned ‘IRA film 1988’) that the programme falsely presented as genuine footage of the IRA shooting down a helicopter (see Broadcast Bulletin, 23 January 2013).
I personally think the greater risk in this area is of a ‘deep fake’ being presented to viewers as genuine in programmes, or on channels, funded or produced by people whose ideological or religious beliefs are at odds with mainstream opinion and who are drawn to conspiracy theories. Weak compliance, combined with a ‘deep fake’ in tune with such a channel’s prejudices, might well lead to a serious problem.
As the Exposure: Gaddafi and the IRA case shows, the harm of presenting ‘deep fakes’ or other synthetic media to viewers as authentic is that it undermines trust in news, current affairs and factual programmes.
RISK TWO is a breach of the Fairness and Privacy rules (Sections Seven and Eight of the Code). Audiences could mistake ‘deep fake’ images of a real person for genuine footage, in a way that causes unfairness to that person or unwarrantably infringes their privacy. A credible example would be a convincing ‘deep fake’ image of an unpopular UK figure suspected of shady or criminal activity being ‘arrested’, which circulates online and is then broadcast as genuine.
Ofcom’s Note to Broadcasters is timely and helpful, rightly stressing that broadcasters can of course include ‘deep fakes’ or other synthetic material in programmes; they should not be frightened of doing so. BUT, and this is the most important point, if they do they must ensure they comply with the Broadcasting Code.
To avoid problems in this area, my compliance advice to broadcasters is straightforward: “If you include synthetic media (including ‘deep fakes’) in programmes, take measures to ensure viewers do not mistake them for genuine images.”