Nick Clegg says asking artists for use permission would 'kill' AI industry

33 points by olyellybelly | 63 comments | 5/26/2025, 1:46:11 PM | theverge.com

Comments (63)

OgsyedIE · 1d ago
The title (at time of writing) omits the full quote in a way that becomes a glaring misquote. His quote in full has a very different meaning:

> "And by the way if you did it in Britain and no one else did it, you would basically kill the AI industry in this country overnight."

This is important to keep in mind. There are competing AI R&D orgs in dozens of different countries, not just America, China and France.

kulahan · 1d ago
A lot of people seem to think AI development happens in a vacuum, but there are a lot of international implications to consider. Even if the US had some incredibly thorough process for ensuring everyone got paid for their AI contributions, many nations would just straight-up steal the data anyway. So maybe in ChatGPT you can’t ask for a Studio Ghibli-style photo, but in ChinaGPT (or what-have-you) it’ll work just fine.
jimmydoe · 13h ago
China might do it, so we did it first. I love this logic; another place to apply it is deciding when to use nuclear bombs.
Gud · 9h ago
The key difference is that if China uses their nukes, so will we. On them.
karaterobot · 1d ago
I'm the kind of creative whose job is threatened by AI, but if I'm being honest and cold-blooded, I'm not actually sure that I'd call what AI training does stealing—apart from any illicit acquisition of the data, that is. I'm open to being persuaded. I just have not read a really persuasive argument that when a model is trained, what it's doing is closer to stealing than it is to being influenced and then recapitulating the general form of those influences.

The latter is something we have no problem with in our culture. Every generation of artists does this. Every artist does this, in fact. It may be how creativity itself works, for all I know. But we never call it stealing, or when we do we say things like "great artists steal" and acknowledge it as a process rather than a problem.

It feels to me like when people make the "AI is stealing" argument, they're really arguing about something else and using this as a proxy for it. Not exactly sure what that other thing is. Fear of losing work and being displaced is the obvious one, and it's a valid fear. I share it! But, if that's what it is, I think we should be having that discussion directly, rather than talking around it. We always do the thing where we piss around talking about proxies, rather than going right to the agon.

birn559 · 22h ago
AIs are not creative; that's anthropomorphism. They use lossy compression to store training data. Later on, the compressed data is used to generate things that are potentially copyright violations (because they can be very close to the original).
therealpygon · 3h ago
People don’t like honesty because they want to be special snowflakes, but sorry, humans aren’t that creative. The entire copyright and patent system is predicated on that fact: you can only make money by making a thing exclusive and proprietary. And that is a direct consequence of the fact that the work is not only NOT original, but that others absolutely will invent/create the same thing eventually, even without exposure to the original material, which is why people want to protect their exclusivity.

“I’m original because I add the little curls to the lines, make the lines softer, and make faces pointy rather than round” isn’t originality, nor is any of the art that people claim copyright for.

Additionally, it is a fundamental misunderstanding of AI models if you think they are simply lossy compression of data. That is like saying the alphabet is just lossy compression of The Odyssey.

pj_mukh · 22h ago
I don’t understand, generating (and then monetizing) things that are copyright violations is already illegal.

What we’re talking about is suing the tool that could potentially be used to generate copyright violations. Which seems silly?

Steelmanning your argument a bit more: OpenAI is selling the fact that you could potentially generate copyrightable material as a subscription, because the artwork is somewhere in the compressed data?

Abstracting the underlying tech a bit: imagine a printer that could faithfully render whatever you told it to. If you told it to reproduce Da Vincis that you later sold as originals, while others told it to (successfully) produce original forms of art, who’s at fault here, you or the printer?

Seems obvious to me it’s the user of the printer, not the printer itself, even if the printer maker is making money.

Earw0rm · 19h ago
In a strict reading, the tool itself is a copyright violation. In that an LLM is the code and its weights, and the weights contain a lossy compressed copy of a copyrighted work.

Now you might say that's an overly rigid reading of copyright law, and I might agree, but that is nevertheless the reading of it applied to other domains - music, film and so on - where incorporating even a fractional, lossy and heavily processed portion of another work is enough to trigger copyright law. There is no blanket de minimis rule that says "well, it's only 5%, so it's fair use".

pj_mukh · 13h ago
The legality of copyrighted material being present in the latent space of a commercialized foundation model would definitely be an interesting case!

Though most of these cases aren't litigating this. They are charging that the ability to recreate copyrightable material is the fault of the tool and not the tool user[1].

[1]: https://www.darrow.ai/resources/ai-copyright-law

birn559 · 8h ago
If using the work as training data was lawful (which, as we know today, hasn't always been the case) I tend to agree with you, but it might depend: creating an alteration of the work might be prohibited altogether, or you might need to give proper credit to the original authors, and that might already be violated when you present certain output to a user. In general, the user is responsible for obeying the license of the original work. The problem is that the LLM does not know which works its output is (mostly) based on. So in theory, no output to which copyright law might apply should be used whatsoever.
const_cast · 15h ago
> OpenAI is selling the fact that you could potentially generate copyrightable material as a subscription, because the artwork is somewhere in the compressed data?

Yeah, that's the express and sole purpose of the tool. You use OpenAI image generation because you want something that looks like a stock image for your website. But you don't want to pay for a stock image because you're cheap. So you'll get a close enough approximation for free. Or, uh, less.

pj_mukh · 13h ago
"express and sole purpose of the tool"

Categorically untrue. 100% of all my usage of image generation is for artistic purposes that have nothing to do with the reproduction of or even the mimicry of existing art.

This is a gaping hole in the legal case if the lawyers need this to be true.

const_cast · 13h ago
> 100% of all my usage of image generation is for artistic purposes that have nothing to do with the reproduction of or even the mimicry of existing art.

How? You're creating artwork that is based on the training data and is meant to replicate a subset of it.

I think that just because you cannot pinpoint the exact art piece(s) you are recreating doesn't mean that isn't what you're doing.

pj_mukh · 10h ago
Sorry, when you said "sole purpose", I assumed you meant ChatGPT users' sole purpose and OpenAI's expectation of their users' sole purpose. My sole purpose as a ChatGPT user is not to create/recreate stock photos.

"just because you cannot pinpoint the exact art piece(s)"

Precisely, and neither can the owner of the copyright. Unless of course I run into classical copyright issues ("these two art pieces look too similar"), which goes back to my original question: why isn't just enforcing existing copyright law on bad ChatGPT users enough?

If I "use" your artwork, and generate another artwork that looks nothing like yours (either from my brain or by automating it) why do you care?

tokai · 1d ago
It feels to me like you are the one arguing about something else. It doesn't matter whether AI copies work wholesale, or whether it can be viewed as analogous to how artists take inspiration.

AI companies use work that they do not have the rights for, to construct their products.

popularrecluse · 23h ago
You don't need rights to view a work and caption it. You can walk into a library or gallery today and take your own notes on everything in there.
const_cast · 15h ago
But this is very obviously not what AI is doing. In essence, an LLM is taking works and compressing them and storing them in a big database, to later be reproduced with a touch of non-determinism.

It's closer to archival and compression than captioning. But they're selling it. That's a big problem - you can't just take someone else's work, sprinkle some god knows what on it, throw it in a big pot with other shit you stole, and then reproduce it later. Even if your reproduction is super convoluted and not 100% accurate.

mentalgear · 1d ago
In other words: it would punish the ones who steal. Which is normally considered a good thing in society, unless you're a Big (Tech) Company whose entire business model rests on the extraction of value (i.e. free labor) from less powerful entities.
yetihehe · 1d ago
When you extract a lot of value from small number of people, it's called theft. If you extract small amounts from millions of people, it's called banking.
ndegruchy · 1d ago
Where's the outcry and rage from the MPAA and RIAA that the infringement of their copyrights is worth more than all the money in the world combined?

It's fine to nail a $10m verdict against a guy with no money, but dare to tread on the beloved and bespoke * aI iNdUsTrY * and suddenly you've got better things to do?

vrighter · 1d ago
Asking people for permission to kill them would kill the hitman business.

That is not a bad thing.

pj_mukh · 1d ago
Sincere question: Do we need new laws for this?

If I use Photoshop to create a straight reproduction of someone else's artwork and monetize it, copyright law handles this.

If I use AI to do the same, same deal? At no point did anyone suggest suing Adobe for building Photoshop and thereby "enabling" people to recreate art and sell it.

So what changed? The ease of use?

birn559 · 22h ago
AI uses lossy compression of data to generate output that is potentially a copyright violation. Photoshop doesn't recommend a (mostly) straight reproduction of existing work, but an AI potentially does.
pj_mukh · 22h ago
“Photoshop doesn't recommend”

Neither does OpenAI? What about an empty chat prompt “encourages” copyright abuse?

const_cast · 15h ago
The express and sole purpose of an LLM is to create output that can be found in its input. That's what it does.

When you ask for a picture of a cup of wine, you're reproducing an image of a cup of wine that was used in training. Or, more accurately, hundreds of images sort of averaged together.

You can't say this isn't the intention, because this is literally the singular use of an LLM. They don't do anything else. They don't change your oil. They reproduce input. Just because you don't know exactly what the input was originally (and you, in fact, cannot find out) does not mean that isn't what it's doing.

jeisc · 7h ago
Curious that the stealing of, and profiteering from, human intellectual property coincides perfectly with a lawless administration in Washington DC.
rifty · 17h ago
Copyright is long, but not infinite. In my opinion there's little point in requiring permission to train unless your purpose is simply to slow and postpone the inevitable.

I'd be more interested in requiring AI companies that train on publicly scraped datasets to make those datasets available to other AI trainers for free. The load scrapers put on the web is out of control.

GPerson · 1d ago
I’m in complete support of anything that will kill the ‘AI’ industry.
kulahan · 1d ago
Nothing will kill it, but you might hamper one nation’s ability to work with it.
KaisoEnt · 1d ago
They tried to make this guy prime minister when I was a kid.
GardenLetter27 · 1d ago
I voted for him!

I actually agree with him on this, though: AI is a bigger net benefit to humanity than some artists' IP.

guiriduro · 1d ago
When in politics, in exchange for a ministerial car, he reneged on his pledge on student tuition fees, and in coalition enabled a punishing austerity programme and the prime minister who delivered the Brexit referendum. It would be wise to treat everything he says or supports with the utmost circumspection.
GPerson · 1d ago
AI is clearly a net negative to humanity in its current form, and almost certainly all future forms.

The fact that you refer to an actual human being as “some artist” betrays the absolute disdain you hold for humanity and in my opinion should disqualify you from polite discourse of any kind.

sach1 · 1d ago
> AI is clearly a net negative to humanity in its current form, and almost certainly all future forms.

I'm going to make an equally supported and valid statement here and say AI is clearly a net positive to humanity in its current form, and almost certainly all future forms.

You should consider providing a less hyperbolic point to argue if you are looking to have any kind of productive or healthy discussion on the internet.

GPerson · 19h ago
Well thanks for the advice, but no, you are incorrect.
sillyfluke · 1d ago
>some artists' livelihood

There, I fixed it for you. If it's such a net benefit to humanity, governments should provide a six-months-behind-the-state-of-the-art model for free to the public to make up for the highway robbery of its citizens' lifelong digital output.


pintxo · 1d ago
If that’s true, can we agree to slap a 10% tax on every AI company’s revenue? To account for the externalities?
tokai · 1d ago
They could on blank CDs, so it should be doable.
rainsford · 1d ago
If your premise was right, it makes even less sense as a justification for using other peoples' IP without compensation.

AI companies are for profit organizations in a largely capitalist world. If what they're providing is so massively more valuable to humanity than artists' IP, then the AI companies should easily be able to monetize that value in a way that allows them to fairly compensate the artists for the crucial role they're apparently playing in AI development.

Of course scale is the real problem here, because we're not just talking about AI companies compensating "some artists", but likely "all artists". This probably is financially untenable since the current value AI is providing to society seems unlikely to be greater than the value of all IP that exists in the world.

But that's not some critical feature of AI; it just speaks to the way current AI models work, where you take every bit of text and every image that exists and throw it into a data center full of GPUs to burn through a large city's worth of electricity until something approximating human text and image generation pops out. Arguably that approach is driven by the fact that GPUs and electricity are affordable and the input data is "free" if you're willing to steal it. I see no reason to believe less wasteful models couldn't produce good results without requiring all human knowledge as input, and you'd save some electricity and resources in the meantime.

Edit: I'm not an AI researcher, but I also think incentivizing more efficient approaches to AI model development and training might yield more human-like AI (i.e. actual AI rather than turbocharged autocomplete). Humans also develop our cognitive abilities by absorbing external information, but we manage to do it by absorbing a fraction of the information a large-scale AI model does, with significantly better results. Focusing on whatever difference is behind that, rather than building ever larger data centers with ever more powerful GPUs, might be an interesting shift.

bediger4000 · 22h ago
We did not get to use a nuanced argument against punitive "Intellectual Property" enforcement from 1995 to 2010. What's different now?
seanhunter · 1d ago
For people who don’t know the UK backstory, Nick Clegg is a widely derided figure in the UK. He was the leader of the third-biggest party (the Liberal Democrats). By making an alliance with the Conservatives to let David Cameron become prime minister, and then going along (as deputy prime minister) with a number of policies [1] that supporters of his party saw as anathema to its basic principles, he saw his party utterly destroyed in the next election, to the extent that they went from being a credible political force to having few enough members of parliament that they could all have fit into a US-style SUV and not been too cramped. He didn’t care, because he peaced out immediately and took the Zuck dollar.

All of which is to say: if Nick Clegg says X, there are a very large number of people who think “not X” by default. He’s even a lame duck at Meta, because they’re ditching him to replace him with Joel Kaplan, who they think will help in their attempts to cosy up further to the Trump administration.

[1] Tuition fees for tertiary education in particular https://blogs.lse.ac.uk/politicsandpolicy/libdems-tuition-fe...

chrischen · 1d ago
We have copyright laws to protect innovation because we as a society have (for the most part) deemed it important, but there is no fundamental right to the protection of information. So the question now is: is it worth protecting the rights of innovators at the expense of whatever benefits AI provides?
tristanlukens · 1d ago
Good. Ask 'em
bachmeier · 1d ago
Imagine if someone stole the Windows source code, started a business selling their own version of Windows, and then claimed that the requirement to get a license from Microsoft would kill their business model.
GPerson · 1d ago
The main argument the AI defenders use nowadays to promote the development of this clearly harmful technology is that “if we don’t make the thing that almost certainly harms all humanity then China will!”

This is just Mutually Assured Destruction all over again. The only correct solution is for AI pacifists on both sides (meaning the people who support humanity and oppose this nonsense) to do as much as they can to sideline the AI hardliners (the people who think AI is a benefit) on both sides. Each country needs to believe that the AI pacifists in the other country have relatively more support than the hardliners.

SonOfKyuss · 1d ago
If your industry relies on stealing to exist, maybe it doesn’t deserve to live
barbazoo · 23h ago
Most industries basically do this by “stealing” from the environment, without paying their fair share for the environmental damage they’re doing.
chrischen · 1d ago
That "industry" is everyone's industry if AI models that rely on public data is available to the public. Then everyone can benefit.
SonOfKyuss · 23h ago
Your point would be kind of valid if the tools they build were free to use
morninglight · 20h ago
If we hired guards and implemented security, it would kill our banking industry.
CrankyBear · 1d ago
This is a feature, not a bug.
billy99k · 1d ago
Funny how copyright infringement has made a comeback in AI and it's somehow a bad thing. What happened to all of the pirates fighting for our right to not pay for anything?
thaumiel · 1d ago
From what I remember, that was never the issue people were fighting for. It was more about the right to own what you paid for and bought, and the right to create a copy of it for yourself. Which is still a big issue. Why do I get to keep a physical book for as long as I want, but when I "buy" a digital book it is only a license?
didntcheck · 1d ago
That's a generous recollection. The right to create private backups was generally a very small part of people's arguments. Most discussions tended to be about justifying torrenting media that they never owned, nor had any real intention of paying for.
AlecSchueler · 1d ago
I'm as big a believer in IP reform as someone could be but the current situation is that regular people still have to follow the rules, the artists themselves still have to follow the rules, while big businesses with endless VC funding get to ignore them.

If it were the same for everyone I would have much less of a problem with this, but it's the glaring disparity in how different parties are expected to behave that is truly repugnant.

opan · 23h ago
I don't like AI and I don't like copyright, but if I had to pick, I would definitely prefer copyright die out as it has more of a personal impact on me and my interests than AI does. Emulators and game projects getting threatened and/or shut down is a real shame. It takes some of the magic out of the world of software when we have to worry about this stuff. I do tend to use free as in freedom software, but proprietary software still manages to have an unfortunate presence around me.
mathgradthrow · 1d ago
No one has ever taken the philosophical position that they have the right to sell someone else's work as their own.
olddustytrail · 1d ago
That's literally the entire foundation of Capitalism.

Working for yourself, and selling what you produce yourself, is called Cottage Industry and is what Capitalism replaced.

techpineapple · 1d ago
I didn’t know that OpenAI employed all of the artists they were training models on. Problem solved!
olddustytrail · 1d ago
They don't and I never said they did. And even if they did, would one company being the sole employer of artists and paying them minimum wage, really be "problem solved"?

It seems people don't like the reality of where we currently are. Don't shoot the messenger!

jll29 · 1d ago
In most areas of law, whatever is not forbidden, is permitted.

Licensing law is the opposite: you are not permitted to do anything with a protected work unless you have been granted a prior written license.

(exceptions: fair use for scientific, non-commercial research, and copyright expiry e.g. seven decades after the death of the author, depending on the jurisdiction, of course.)

OgsyedIE · 1d ago
Law is downstream of politics in every place it has ever existed, however. The truth about 'Rule of Law' is that it is something that we make, and could just as easily make differently.