Who owns the intellectual property in AI generated images?

2nd March 2023

Every time a new tech trend comes around I get a lot of emails asking me the same question about the same new business model. It turns out that AI, and more specifically ChatGPT, is no different.

So I figured that, this time around, I would do some ‘paying it forward’ for the UK tech community and just answer your most asked AI related question in public, on the Internet, so that you could all have it for free.

But what’s the question, I hear you ask?

Well, in the last few days I’ve seen and heard an identical question about a near identical business model about half a dozen times. I’m pretty sure that it’s happening because a lot of people are opening up ChatGPT and using the same prompt, which I think must go a little bit like this:

“Yo ChatGPT, how can I get rich? And quick?

“None of this ‘generate meaningful consumer value over a five-year cycle’ business; instead, use your omniscient, god-like intelligence to give me the kind of whizz-bang business ideas that will need about, maybe, eight hours of work a week (preferably on social media) and which will let me be whacking back overpriced margaritas in a hot tub with my shiny new trophy-wife/husband/humanoid by, say, four in the afternoon this Thursday while the whole thing runs itself.

“Give your answer in the form of a pitch deck that I can send over to a VC without reading it as soon as I’m done eating lunch.”

Like I say, I don’t know the exact words you’re using, but I think I’m pretty close here.

The problem with that is that ChatGPT is telling you all to set up the exact same business, because you’re all feeding it a variation of the same prompt. I’m not going to breach client confidentiality by saying precisely what it’s telling you to do, but so far as I can tell you’re all planning on starting identical dropshipping enterprises to sell the exact same consumer item, which you all intend to festoon with one or more AI-generated images as a way of adding value to that generic product.

Which means that you’re all writing to me with the same question: “who owns an image generated by AI (which I’m about to emblazon across a common consumer item)?”

It’s a great question (and I don’t know if you can tell, but I like it a lot more than your collective business plans) and it’s only going to get more important as we use more and more generative AIs in our day-to-day lives. So, who owns an AI-generated image, or AI-generated text, or AI-generated business plans? Including business plans that may be aimed at entering a rapidly crowding market that will surely be saturated long before your hot tub has warmed to even close to room temperature.

Let me tell you the answer.

Ownership of rights generated by AI

In the UK (and, because of international treaties, this is common across most of the Western world) the default rule is that the author of an original work, be that an image, a piece of literature, or computer code, is the owner of that work. That happens automatically, without the author having to do anything to register their rights, but it only applies to actual original ‘works’; it doesn’t let you fence off mere ideas. Once an author owns a work, they can forbid reproduction of that work, or of derivatives of it, unless they choose to grant permission (by way of a licence).

Put simply: if you make it, you own it.

Unless, and this is the important bit: (a) you have made it while working as the employee of a company, in which case the default position is that your employer owns it, or (b) you have entered into a ‘written agreement’ to transfer the copyright in your work to someone else (which is what happens when you, say, engage a creative agency, or when you work as an independent contractor).

At first glance you might think that gives you a neat and easy answer to AI-related questions: surely the ‘author’ is whoever wrote the prompt, so whoever is cranking the lever on Midjourney owns the images that come rolling out of it. It would be just like any other software tool. The person who uses Photoshop to make an image is the owner of that image. So surely that’s the same with AI tools?

If that’s right, then whoever writes the prompt owns the output. To my mind that’s the most logical answer.

But it might not be quite that simple, because an AI isn’t a tool like Photoshop, where all of the creative work is done ‘user side’ by an individual. Instead, most of the heavy lifting gets done ‘server side’ by an algorithm authored by the employees of a company, which is responding to a request from a user, in a way that is analogous to a creative agency reacting to a client brief. Seen that way, putting in a prompt isn’t a work of authorship; it’s just a request for somebody else to deliver one. If that analysis is right, then an AI vendor could reserve all IP in generated outputs for itself (or, at the very least, retain a wide range of rights to use and/or re-sell them itself).

In addition, all of the work that generative AI creates is possible because those AIs have been trained by viewing millions of images created, and owned, by other artists online. Their outputs, while hugely impressive on a technical level, are very sophisticated attempts to copy the forms, styles and techniques on which they have been trained. Which means that, where those outputs look suspiciously like images created by other artists, they could well be challenged as derivatives.

On that analysis, whoever wrote the code that makes up the generative AI would own the image.

So, that’s two options (and I have a clear personal preference for the former being correct). Why, then, can’t I just give you a clear answer as to which one is right?

What does that mean for AI-reliant businesses today?

Because generative AI is so new that we don’t yet have anything from the courts (or Parliament) which comes to a definitive view on ownership rights one way or the other. While I prefer the analysis which says that the prompt writer owns the output, either analysis can be logically sustained under current copyright law. What that means is that we are almost bound to see intense litigation on the topic before we get a final answer. That, or timely, proactive legislation from Parliament to clarify the position, but don’t hold your breath.

So I’m going to give you the same advice, for free, that I have given to everyone who has asked me this question over the last few weeks. If you are about to launch a business which relies on selling or licensing AI-generated works, then it’s imperative that you read the terms under which the generative AI you are using is provided. Work on the assumption that you own the outputs, but do the diligence and inform yourself as to whether your vendor agrees with that idea.

Check your chosen vendor’s terms to find out whether it is giving you the IP in prompt-generated works (or, at least, being silent on the subject) or whether it is purporting to reserve those rights for itself. With that knowledge in hand, you can look to either (a) switch products to something more permissive, or (b) flow down that same position to your customers.

Then, get ChatGPT to build you a bit of code that will automatically alert you every time your chosen vendor updates their user terms so that you can react immediately if they ever try to pull the rug out from under your feet by changing the IP position overnight.
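If you’d rather not wait for ChatGPT to write it, here’s a minimal sketch of the sort of watcher I mean, in Python. The terms URL and the hash file path are placeholders you’d swap for your vendor’s actual terms page and wherever you want to keep state, and you’d replace the print statements with an email or a Slack ping. Run it on a daily schedule and it will shout when the page changes.

```python
# A minimal sketch of a terms-of-service change watcher.
# TERMS_URL is a placeholder: point it at your vendor's actual terms page.

import hashlib
import urllib.request
from pathlib import Path

TERMS_URL = "https://example.com/terms-of-service"  # placeholder URL
HASH_FILE = Path("terms_hash.txt")  # where we remember what the page looked like last time


def fetch_terms(url: str) -> bytes:
    """Download the current contents of the terms page."""
    with urllib.request.urlopen(url, timeout=30) as response:
        return response.read()


def main() -> None:
    # Hash the page so we only have to store and compare a short fingerprint.
    current_hash = hashlib.sha256(fetch_terms(TERMS_URL)).hexdigest()

    previous_hash = HASH_FILE.read_text().strip() if HASH_FILE.exists() else None
    HASH_FILE.write_text(current_hash)

    if previous_hash is None:
        print("First run: stored a baseline hash of the terms page.")
    elif current_hash != previous_hash:
        # Swap this print for an email, Slack webhook, or whatever actually gets your attention.
        print("ALERT: the vendor's terms page has changed. Go and re-read the IP clauses.")
    else:
        print("No change to the terms page since the last check.")


if __name__ == "__main__":
    main()
```

In practice you’d probably want to strip out any dynamic page furniture (dates, tracking tags and the like) before hashing, or you’ll get false alarms, but the principle holds: know the moment the IP position moves, not six months later.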

Also, don’t forget that copyright doesn’t let you own mere ‘ideas’. Which is why ChatGPT gets away with recommending the same one to every person who asks it how to be their own boss while working under eight hours a week. So don’t be surprised if you find everyone else doing the same thing.