New protections for victims of AI-generated Non-Consensual Exploitative Images (a.k.a. Deepfake Porn)

25th September 2023

This autumn the Online Safety Bill is very likely to receive Royal Assent and become the Online Safety Act 2023.

It’s a big deal: the Bill is set to bring in a lot of changes for the technology sector and its regulators. It blazes a trail in the nature and extent of the obligations it places on user-to-user services and in the regulatory regime it introduces to oversee them. It also seeks to address some very specific criminal matters that have largely arisen during the time the Bill has taken to be drafted and to pass through Parliament.

As my colleagues have previously discussed here and here, the particular issues are revenge porn and its (even more) evil technological twin: non-consensual exploitative images, which, despite the efforts of Raoul Lumb and others, are better known as “deepfake porn”. The rapid pace of technological development (where criminals are seemingly always ahead of commercial enterprise) and the launch of various generative AI tools to the public have left the criminal justice system unable to address the new and degrading ways available to abusers to humiliate and debase victims, most often women.

Emma Davey of My Trauma Therapy Limited, an SMB client that provides counselling for the victims of narcissistic abuse, either face to face or through its innovative app, MyNARA, told us:

“Revenge porn and deepfake porn, and the threat to publish intimate images, are hugely damaging to victims’ mental and physical health. The fear of being exposed so publicly causes anxiety and panic. In many cases victims feel trapped and forced to comply with the abuser’s demands, whatever those are. The victim is then isolated and often too afraid to come forward.”

The Criminal Justice and Courts Act 2015 made some attempt to criminalise revenge porn, making it an offence to publish private sexual images without consent. But it did not lead to many successful prosecutions, because it was necessary to prove that the publication was done with the intention of causing the individual distress. That intention was hard to prove where such acts could bleakly be claimed to have been done “for a laugh”.

In late June, an amendment to the Online Safety Bill was made to address these issues by introducing two new offences into the Sexual Offences Act 2003. The first is a lesser offence of publishing such images without consent; the second is an offence of threatening to publish them with the intention of causing the subject, or another person, to fear that the material will be published. Although the lesser offence, the publication offence is intended to be easier to prosecute and will carry a sentence of up to six months’ imprisonment.

In addition, the definition of “photograph” has been amended to include “an image, whether made or altered by computer graphics or in any other way, which appears to be a photograph or film”. This change catches images that are doctored to superimpose the likeness of one person onto the body (or apparent body) of another. It addresses the real risk that highly sophisticated AI tools will be used to merge and blend genuine, innocuous images with pre-existing pornographic material, or simply to generate movements, voice and mannerisms, producing a flawless and realistic fake film of the victim performing intimate acts.

Industry is also working on the problem of distinguishing reality from illusion, and last week DeepMind launched SynthID, a tool for watermarking AI-generated images:

https://www.deepmind.com/blog/identifying-ai-generated-images-with-synthid

But, as Dame Maria Miller MP has highlighted, watermarking is only likely to be adopted by “good actors”. Dame Miller would therefore like to see the new criminal legislation extended further: to make it an offence to make or take intimate images without consent, to place a burden on technology companies to identify the victims of deepfake porn, and for the fines levied to be used to support victims:

https://www.maria4basingstoke.co.uk/news/maria-miller-addresses-ai-threats-women-and-girls-government-consultation