Don’t call it Deepfake Porn, it’s NCEI

9th March 2023

I don’t often have much sympathy for attempts to mandate new language, but this time I’m making an exception. Twitch has announced a new policy under which it will no longer use the term “Deepfake Porn”, referring instead to synthetic Non-Consensual Exploitative Images (“NCEI”) going forwards. I think that’s a positive step.

I confess that “Deepfake Porn” wasn’t a term I’d thought about much before reading Twitch’s new policy document. But when I started thinking about it I realised that the usual language was wrong.

The rationale behind the policy is that it’s wrong to describe sexualised media created without the consent of its subjects as ‘pornography’ at all. Twitch (I think rightly) says that word should only be used for media featuring consenting subjects who knew they were creating a pornographic work when they were filmed or photographed. By changing the way we think about NCEI, and stopping people from talking about it as if it were just a regular sub-genre of ‘pornography’, the aim is to get people to see it in the same way as other types of abusive and unlawful content, and ultimately to stamp it out.

For someone who works at the UK’s leading law firm for defending free speech, agreeing with language policing is a bit of an unusual position to find myself in. My colleagues in our Media Litigation team spend their days defending journalists’ and publishers’ right to publish freely, and my own practice focuses on helping tech companies (businesses like Twitch) to maximise the quantity of data and digital media that they can collect and use, and increasingly on helping companies to adopt new AI-driven technologies. So while I’m familiar with advising on bans on unlawful content, I’m not normally in a place where I find myself agreeing with a ban on specific words, especially words that we use to describe AI-generated content.

But with NCEI, I really think it’s the right thing to do. It’s a small change that I’m going to get on board with, and I’d encourage you to do the same.

The practice of superimposing an unwilling third party into a scene of sexual activity without their consent is something that we should see as a sexual offence, in just the same way that we see voyeurism (the offence of recording someone without their consent, with a view to using that footage for sexual gratification) or harassment (targeting someone with communications or actions designed to distress, upset or intimidate). My colleague Emma Linch has already talked about proposals to criminalise it here.

Put simply, it’s not a material infringement of any civil liberty to stop people from talking about NCEI in a way that legitimises it as some kind of ‘pornography’. It’s a small change that helps to reframe the way we think about a serious issue for the better. So, for once, I’m happy to see a company telling us to change the way we talk about something.

Call it NCEI.

Addressing Explicit Deepfake Content (twitch.tv)