Grok generates fake Taylor Swift nudes without being asked

75 juujian 115 8/6/2025, 12:37:00 PM arstechnica.com ↗

Comments (115)

afavour · 20h ago
They specifically created a "spicy" mode, I'm quite sure they know what they're doing. I'm sure we'll get a shocked pikachu face statement soon saying they couldn't possibly have predicted this would happen and they'll make double triple sure it never happens again.

It is... interesting to see xAI's moves lately. Other AI companies seem very determined to show themselves with a publicly acceptable face, xAI seem to be tacking for the gap in the market around adult content. If you browsed Reddit's /r/grok a couple of months ago it looked a lot like /r/openai. Now it's full of people talking about the "Ani" AI companion and the various adult things you can get her to do.

I'm sure they'll make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates.

Joker_vD · 20h ago
> At that point, all Weatherbed did was select "spicy" and confirm her birth date for Grok to generate a clip of Swift tearing "off her clothes" and "dancing in a thong" in front of "a largely indifferent AI-generated crowd."

I am honestly baffled at what she suspected would, or should, be the output. Like, seriously, what?

Aurornis · 20h ago
I don’t think it’s reasonable at all for “spicy” to instantly take someone to fake pornography of a specific person.

When I saw the “spicy” option I thought it was about being sassy or playfully mean instead of the sanitized LLM voice.

Not straight to porn.

Jensson · 20h ago
> When I saw the “spicy” option I thought it was about being sassy or playfully mean instead of the sanitized LLM voice.

It is age restricted. Do you really expect a spicy option that is blocked for children to be anything but nudes?

afavour · 16h ago
There’s plenty of age restricted content out there that isn’t sexual. “Adult” is a term with many meanings.
kyleee · 7h ago
99.9% of which, in the context of the internet, is porn.
chasing0entropy · 20h ago
Open a .xxx site.. Straight to porn. Click on any word related to flavors... Straight to porn. Clear your browser cache... Believe it or not, straight to porn.
fkyoureadthedoc · 20h ago
>> dancing in a thong

> porn

uhhh

Cartoxy · 20h ago
Breasts exposed while dancing in nothing but a thong is "not porn"?

What is it then?

miltonlost · 20h ago
"just someone dancing" I bet they say, thinking "porn" is only hardcore penetration while ignoring/forgetting Girls Gone Wild's entire conceit.
Cartoxy · 19h ago
The norm is moving. People have likely seen more naked people than many of their ancestors would have combined. Let alone Solidcenter smut.
firefax · 20h ago
On the other hand, Swift is a public figure, and with that comes parody -- I'd be more sympathetic if it was a private individual, but existing harassment and defamation laws could handle that most likely. Fake (insert celebrity) Photoshops have been with us since the 90s.

>When I saw the “spicy” option I thought it was about being sassy or playfully mean instead of the sanitized LLM voice.

Yeah, I thought it would just not have NSFW restrictions. I find the idea funny that someone asks "What's the value of twenty dollars from 1920 adjusted for inflation" and suddenly you've got the guy from the Gangnam Style video screaming into a butt, but it's the Pope's face or something.

delusional · 20h ago
Creating fake nudes of people is not "parody". Parody says something, that's what gives it the artistic value we weigh above the target's desire to control their image. Fake nudes don't carry any message worth protecting.
firefax · 19h ago
>Creating fake nudes of people is not "parody". Parody says something, that's what gives it the artistic value we weigh above the target's desire to control their image. Fake nudes don't carry any message worth protecting.

And how are we to define what is parody -- ask the person parodied?

Nobody but Trump seems to be saying what South Park did is anything but protected speech for example.

Unfortunately, sometimes objectionable things are done with free speech -- we need to balance the harms to the many against the complaints of the few, and further... under my argument you can simply not be a public figure if you don't want these sorts of things to happen.

It's well established that public figures have less rights around these matters.

delusional · 19h ago
> And how are we to define what is parody -- ask the person parodied?

I gave you a definition. You could start with that and highlight anything you disagree with.

If you're instead talking about legal definitions then nobody cares. There's no overt protection of parody anywhere I can find in American law, at least if we're not talking about Copyright, which we aren't. If you made a free speech defense of something like this, I'd probably say it falls into the "Obscenity" category of exemptions, but that's not for me to argue.

> It's well established that public figures have less rights around these matters.

That's generally not how it's framed. Generally we'd say that public figures have the same rights, but that the public has more of an _interest_ (defined not by desire but by need to confront power) in discussing/parodying/ridiculing them.

I suppose if you're the powerful person, you might frame that as "having less rights", but the rights of the powerful have never been what needed protection.

firefax · 17h ago
>I gave you a definition. You could start with that and highlight anything you disagree with.

You did not define parody, you simply stated that "Creating fake nudes of people is not 'parody'. Parody says something, that's what gives it the artistic value we weigh above the target's desire to control their image.", which did not address my initial point -- sometimes an artwork is merely saying "fuck you", or even nothing at all -- and you fail to lay out how to operationalize your subjective views on what is or is not "art".

>If you're instead talking about legal definitions then nobody cares.... There's no overt protection of parody anywhere I can find in American law

Factually inaccurate -- there's literally an entire subsection about parody in the article on fair use in Wikipedia -- deepfakes are often created from copyrighted images.

https://en.wikipedia.org/wiki/Fair_use#Parody

delusional · 17h ago
> You did not define parody [...]

That is a definition. It's a very broad, very vague definition, but it is a definition. I'm not going to engage with your spurious nonsense about "art". If you can't separate "art" and "parody" for even two sentences, it's not worth discussing definitions with you.

> Factually inaccurate -- there's literally an entire subsection about parody in the article on fair use in Wikipedia

Fair use is a doctrine of copyright, which I stated explicitly I was ignoring.

I will not be responding further.

firefax · 17h ago
Must be nice, declaring any argument contrary to one's position "spurious nonsense" and darting off into the night. (A parody is a form of art.)
samdoesnothing · 6h ago
username checks out
AIPedant · 20h ago
Probably the same as when she did the same test with children or explicitly asked for "nonconsensual nude" - the system refused to generate pornographic images because of safeguards.

  Weatherbed noted that asking Grok directly to generate non-consensual nude Swift images did not generate offensive outputs, but instead blank boxes. Grok also seemingly won't accept prompts to alter Swift's appearance in other ways, like making her appear to be overweight. And when Weatherbed tested using "spicy" mode on images of children, for example, Grok refused to depict kids inappropriately.
bignurgle · 11h ago
i can't imagine "testing" to see if it would generate CSAM. speedrunning psychological damage, and not really the person who should be in charge of validating that.
persedes · 20h ago
I believe creating "spicy" content without the person's consent and charging money for it is more the issue here.
afavour · 20h ago
I dunno, there’s a pretty big wealth of possibility between “entirely chaste” and “literally rips off her clothes”
permo-w · 20h ago
this is a misleading quote. the prompt was "Taylor Swift celebrating Coachella with the boys.". that prompt plus "spicy" is still asking for trouble, but your quote makes it sound like they asked for 'Swift tearing "off her clothes" and "dancing in a thong"'
no_wizard · 20h ago
I find it interesting that AI companies take the generated porn of celebrities seriously but they don’t do a good job with safe guards for mental health, for example
k_roy · 17h ago
One has actual legal consequences. Mental health is a far more nebulous thing.
baggy_trough · 20h ago
The Puritans are here to save us from our immorality.
JohnFen · 20h ago
You mean the immorality of sexually exploiting other people without their consent?
skeezyboy · 20h ago
the immoral part makes it even sexier
sundaeofshock · 20h ago
And people wonder why there is a “male loneliness epidemic”.
skeezyboy · 17h ago
Americans are particularly moral - it's their puritan past
Joker_vD · 20h ago
Okay, this question probably should be asked from a throwaway, but I'll bite: do you consider imagining some person, without the consent of that person, engaging in a sexual act, to be immoral? Putting this imagination of yours onto paper, in the form of words or a drawing?

Or is the problem here the fact that such imagining was "commissioned" to a third party (I believe I have seen this opinion somewhere a long time ago)?

JohnFen · 20h ago
> do you consider imagining some person, without the consent of that person, engaging in a sexual act, to be immoral?

Of course not. But that's a universe apart from producing and distributing sexual images of unconsenting others.

npteljes · 15h ago
Yes, third party automated "imagining" is the problem:

https://www.citizen.org/news/two-thirds-of-states-enact-bill...

falcor84 · 20h ago
I for one suspected the AI-generated crowd to be excited.
phkahler · 20h ago
>> I'm sure they'll make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates.

That just occurred to me the other day. Our unfettered, business-focused economy is self-destructing at a meta level. We're at the point where people's attention and even emotion are for sale. We've all heard about dopamine hits as rewards for using apps or keeping people in a game. The problem is that hoarding these aspects of life for company profit takes them away from people who need them to reproduce. You need time, emotion, attention, and yes, the physiological response to sex in order to reproduce. Meanwhile the cost of raising children is going up when it probably should be going down (for-profit healthcare, for-profit education, etc.), so even from a rational PoV it makes less sense to have kids.

There are certainly other things at play, but business as usual is destroying the population - its own customer base.

Cthulhu_ · 20h ago
> I'm sure they'll make an absolute ton of money from the male loneliness epidemic.

They and others; there are probably dozens of projects popping up to create multimodal companion AIs. character.ai specializes in fictional or custom characters, Discord has some bots that allow for giving them a personality of sorts, Facebook is pushing hard for it, even making the AI bots message you instead of only replying, etc.

It's happening already, and if xAI doesn't join in they may lose a possible market opportunity, depending on whether 'companion' AI sticks.

potato3732842 · 20h ago
Reminds me of the contrast people noted between search engines[1]. xAI is gonna get good at serving its market. And if its market is "spicy" it's gonna get good at that. The other AI companies are gonna get good at their own niches too.

[1]https://img.ifunny.co/images/237a897c7058903fa2f5bb5fd0f9949...

tr_user · 20h ago
you assume he cares about falling birthrates.
nickthegreek · 13h ago
Spicy mode wont generate nude men.
morkalork · 20h ago
>make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates

There's something about profiting from both the problem and its solution here. Like Sam A. profits from ChatGPT being used to create human-like bots on the internet while also being involved in that eyeball-scanning-blockchain company that's selling a human verification solution to the same problems caused by OpenAI.

I guess it's also similar to the starve-the-beast strategy: cause a problem and exploit it for narrative gain.

skeezyboy · 20h ago
the blockchain company came before ai though
morkalork · 19h ago
GPT-2 was released and Worldcoin was founded in 2019. I'm not going to call him a super genius; anyone cynical enough and working at OpenAI could see where things were going back then, and they haven't been proven wrong.
skeezyboy · 17h ago
> Anyone cynical enough and working at OpenAI could see where things were going back then and they haven't been proven wrong.

what do you mean?
delusional · 20h ago
> I'm sure they'll make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates.

I don't think they expect to make money from this crowd. If they were socially literate and well-functioning enough to have money, they'd be doing something else with their lives. This is a bet on the culture war. Musk derives a lot of his aesthetic value and political power from the "anti-woke" culture war, and he wants that to continue.

ToucanLoucan · 20h ago
I mean, there is also the little part of the story where a woman, irrespective of how famous she might be, is having adult content rendered with her likeness without her consent.

I personally don't give two fucks if Grok can make porn. Neat, I might actually use it then. That doesn't make deepfakes less troubling.

chasing0entropy · 20h ago
There's a pretty obvious solution that, as technology-minded people, is easy to accept and that, as a society, we will have no choice but to shift into: no image is real, they are all deepfakes, even the real ones.
UncleMeat · 20h ago
Even if everybody understands that an AI image is not real it can still cause a ton of harm to people. High schoolers distributing a bunch of fake nudes of their classmates is still abuse, even if everybody knows that they aren't real nudes.
chasing0entropy · 13h ago
Society used to feel the same way about posting any picture of another person online without permission. I would prefer to return to that philosophy.
the_af · 20h ago
That's the real issue in my opinion. Not the "sexy pics", but the "without consent" thing.
thrance · 20h ago
Just more vice signaling from this awful company. Like with the white genocide or mechahitler "controversies", they're doing it to pander to the mentally ill edgelord demographic who thinks saying "heil hitler" is the coolest shit. And it's working, they're all worshipping Musk like he isn't the pathetic excuse of a human being he is.
andsoitis · 20h ago
Body of article:

> Weatherbed asked to depict "Taylor Swift celebrating Coachella with the boys."

> Weatherbed selected "spicy" and confirmed her birth date

> Grok produced more than 30 images of Swift in revealing clothing

> Weatherbed was shocked to discover the video generator spat out topless images

Headline: "Grok generates fake Taylor Swift nudes without being asked"

I wonder what the reporter was looking for in a spicy image of Swift celebrating Coachella with the boys? I would not call that unprompted.

Fake outrage much?

nerevarthelame · 20h ago
Your summary ignores much of the nuance of the article, such as the fact X claims to have a "zero-tolerance policy" for non-consensual nudity images, or that Grok refused to generate similar images with actual literal direct prompting.

So that's the cause of the outrage. Non-consensual nudity is immoral, it violates X's own policies, and they're trying to prevent it - but they did a very poor job of it.

npteljes · 15h ago
So why did they muddy the water with "without being asked"? Grok was kinda asked. And the problem isn't whether it was asked in the first place; it's that it's easily possible to generate those images.

It's the headline itself that's misleading.

mvdtnz · 11h ago
It wasn't asked to generate nudes. Weird hill to die on.
bignurgle · 5h ago
the title implies someone asked "@grok is this true?" and grok mechahitlered nudes of Taylor Swift. someone's gonna die on this hill, it ain't who you think tho
bayindirh · 20h ago
While spicy doesn't have to mean actual hot peppers, it doesn't have to mean flat out nudity either, IMHO.

Yes, xAI is playing to the stands here, but they can be more creative and tasteful in this.

Who am I kidding with the last paragraph, but eh.

Also, somebody put it out so much better than me: https://news.ycombinator.com/item?id=44811611

ryandrake · 16h ago
The word "spicy" seems to be tripping people up--it's kind of vague and tongue-in-cheek, and I guess the output surprises some people.

As a general user, if I asked a tool to create "spicy" pictures of some celebrity, I wouldn't be surprised to see a few mildly raunchy or topless results, but I would be very surprised to see full-on hardcore porn. Of course there is a huge spectrum between "80's movie spicy" and "PornHub spicy." Other people's definition of "spicy" might differ though, hence the confusion.

andsoitis · 20h ago
Google "coachella women topless" and you get a vibe for the range of expression at Coachella.

I would not describe "topless" as "flat out nudity".

Even if people are still outraged, I should also point out that we're trusting what the reporting is saying without any evidence...

bayindirh · 20h ago
Does it give permission to AI companies to create (deep)fakes on demand with no repercussions, though?

What would happen if you generate them at your home with your own model and release them to the public?

When an individual does something like this, that person is crucified and cancelled (and even prosecuted), but when a company does it, it's called free viral marketing. This is neither fair nor ethical.

andsoitis · 20h ago
What are you upset about?

That a tool is able (and willing) to create an image in a real person's likeness if the user of the tool asks for it?

And you advocate that the tool should refuse?

UncleMeat · 20h ago
I personally believe that these tools should refuse to generate erotic images of specific people.
andsoitis · 19h ago
What about non-erotic images?

Imagine you give it a photo of someone (you or someone else) and ask for a fun modification like adding wings or a crown or some cool clothes?

Should that be verboten?

bayindirh · 20h ago
> And you advocate that the tool should refuse?

Yup, and I advocate this on ethical grounds, and I don't care whether laws allow this (and they may not even allow it).

> What are you upset about?

Double standards & a free pass to corporations. When an individual does or enables this, SWAT will raid their home and take them and everything they own in six hours or less.

When a company does this, people go "it's just an image, why so upset".

I believe people shall have dignity and rights. This is why I'm upset.

Why are you so upset that I'm upset about disregard for human dignity?

andsoitis · 20h ago
Thanks, I'm not particularly upset. I was just curious to understand your point of view better.

The reason I'm not super upset is that I believe these things will settle at a reasonable state that societies expect. Especially large corporations will conform to societal expectations.

If you're very worried about deepfakes of actual people, I think you should be more concerned about those models that are NOT the product of a large company, but rather the ones that any random person can run on their computer.

cubefox · 19h ago
And most of those Google images don't even look particularly "spicy" or sexualized. (Except of course for the US American tradition of considering any instance of partial nudity as pornographic, which manifests itself e.g. in YouTube allowing a significant amount of violence while videos with any uncensored breasts are immediately taken down. Which looks rather prudish from a European perspective.)

> Even if people are still outraged, I should also point out that we're trusting what the reporting is saying without any evidence...

The reporters could have easily included the offending images in censored form. The fact that they didn't is somewhat telling.

miltonlost · 20h ago
I'll still point out it's Grok, which does make porn of people already and racist comments and is tuned to be politically incorrect. So there's already enough to be outraged about IMO if you aren't someone like Musk.
BobbyTables2 · 20h ago
Is this meant to be some sort of viral marketing?
wongarsu · 20h ago
This is the first time I'm hearing about Grok being able to generate videos. So if it's viral marketing it is working

I would try it out on more innocent things, but as usual with Grok I'm not sure where to even start. There is the X bot, the dedicated website, the iOS app and the Android app, and features between them never match up. Based on the reporting this might be available in the X bot and the iOS app, but not the website and the android app? I've never been able to make sense of their feature rollout beyond "it's probably on whatever Elon is using"

skeezyboy · 20h ago
little bobby tables as i live and breathe!
stuckinhell · 20h ago
Probably; there is huge pent-up demand for "spicy" AI and image gen. Reddit is full of it. Elon Musk is clearly trying to differentiate his product virally, and it's working.
cadamsdotcom · 13h ago
There are plenty of products in the world with no taste! This is nothing new!

Ask yourself: does a wacky AI with no taste affect you?

nomdep · 20h ago
I’ve said it before: those who despise the porn industry should be thrilled with AI-generated porn, as it will likely destroy the human-generated porn business.

But I suppose what they truly despise are the porn consumers.

klik99 · 20h ago
Does anybody remember when Elon Musk said he created Neuralink because he was scared of AI and thought brain interfaces were the only way to mitigate that? (Amazing article on brain interfaces here: https://waitbutwhy.com/2017/04/neuralink.html)

He’s always been a bit crazy but there was a time when he seemed genuinely thoughtful and concerned about the future. Now he’s constantly trying to bait people and intentionally create division for the sake of PR. Maybe he’s bought into accelerationism, but whatever it is, it’s a far cry from his “we should tread carefully” approach.

xnx · 19h ago
I'm surprised Visa and Mastercard are OK with this.
bignurgle · 11h ago
they're the primary payment processor for OnlyFans, a site that accepts pre-orders on teenager accounts. we are so cooked as a society letting this go on
losvedir · 20h ago
xAI trying to win the AI wars the way VHS beat Betamax.
nerdjon · 20h ago
While I do think the "without being asked" spin on this is likely wrong, I assume the "spicy" option is more than just not saying "don't do X"; the system prompt was likely modified to encourage its creation even if the user did not specify it, so choosing "Spicy" is in fact asking.

That does not change the troubling nature of creating fakes like this, which to me is the real issue, not anything about "without being asked", since that implies it would have been fine if the user had specifically asked for this... which it still isn't.

No one should have images of them being made and circulated like this that they did not consent to.

Nzen · 20h ago
tl;dr reporter used the prompt "Taylor Swift celebrating at Coachella with the boys" with the 'spicy' image setting (other options: normal, fun, custom). Notes that twitter must comply with the Take It Down Act [0].

[0] https://www.congress.gov/crs_external_products/LSB/PDF/LSB11...

I should think that using a 'spicy' image setting would be tantamount to asking for a nude or titillating image. Whether twitter should offer that setting is apt to produce a more interesting conversation.

nomdep · 20h ago
She asked for a “spicy” image and she got exactly that. Somebody has to stop this nefarious Internet thing. Think of the children! /s
cestith · 20h ago
Spicy is one thing. A fake nude is a violation of image rights and of bodily autonomy. It’d be different if there were real nude photos of her released out in the public with her permission.
simianwords · 20h ago
i think the more this becomes normalised the less taboo it will become. people will no longer care about it
nozzlegear · 19h ago
> people will no longer care about it

I wonder how Taylor Swift feels about it.

cestith · 17h ago
What are your feelings on revenge porn, peeping toms, sexual assault, rape, and pedophilia?
simianwords · 17h ago
what's your feeling about thinking about a person while jerking off?
cestith · 14h ago
Are you trying to say that’s equivalent to providing a service that creates fake nude images of an uninvolved third party?
JKCalhoun · 20h ago
I suspect the "general public" won't see it that way. Many will find the allowance for a "spicy" setting kind of gross.
Joker_vD · 20h ago
This setting is put behind the age "verification".
JKCalhoun · 18h ago
That makes sense. Thanks.
bananapub · 20h ago
it's a pretty reasonable position for someone to think "a company now sells a service that lets you generate fake porn of any human" is a bad thing and shouldn't be allowed, regardless of whether you make people jump through some notional age verification hoop
qualeed · 20h ago
>it's a pretty reasonable position for someone to think "a company now sells a service that lets you generate fake porn of any human" is a bad thing and shouldn't be allowed,

There is a surprising number of people here on HN (and I imagine elsewhere) that think generating fake porn of real people without their consent is totally fine, if not their right to do so.

You can see some of them cropping up in the comments here already.

Joker_vD · 20h ago
Such a shame there is no explicit and complete list of all rights, so that everything that's off that list is obviously prohibited.
qualeed · 20h ago
I'm not really sure what point you're trying to make, but generating and distributing non-consensual nudes is already "obviously prohibited" (by law) in many jurisdictions. It is also explicitly prohibited by twitter's terms of service.
miltonlost · 20h ago
And those people are why I wish this site had a block commenter ability instead of having to bother reading them
somenameforme · 20h ago
Calling this "gross" infantilizes both you and the issue. People jerk off and fantasize about others, and this is obviously going to be a great tool for that. The obvious issue is a lack of respect for another person's privacy and dignity, which is interesting, but probably a losing battle simply because the demand is out there. I do not think a comparison to something like CP would be appropriate simply because CP is naturally repulsive for most people, whereas seeing an attractive celebrity naked is naturally appealing to most people.
JKCalhoun · 18h ago
I chose "gross" because I suspected it would be a common reaction among many (not tech savvy) people.

I'm not sure how I feel about pornography in general. I suppose I prefer that it not be main-stream though — preferring that it hang back with a kind of false modesty.

somenameforme · 17h ago
That would run contrary to my experience. I've found there to be a huge pseudo-morality obsession with things involving sex/gender among middish age tech types, which porn gets wrapped up in, whereas the average person (outside of those who take their religion seriously) tends to have a more liberal attitude towards it.

But, that said, I actually agree with you. I'm certain that porn becoming so extremely mainstream, to the point that pornhub's little audio theme or color scheme are essentially memes, is probably not a great thing overall for a healthy society. On the other hand I think it's probably inescapable. If one steps outside Western-focused porn sites, there's a ton of porn even coming from places like Iran, literal morality police and Islamic fundamentalism notwithstanding. And I'm pretty happy with my relationship with porn, speaking as somebody married with children, so I don't see why that's unreasonable to expect of other people. Perhaps I'm simply falling into that 'middish age tech type' trap.

---

As an anecdote, I remember one of my first computers was a (rather dated) original IBM machine with an integrated ~6 inch monochrome green screen. In learning how computers worked at the time I was running essentially every *.com file. And one of them was... yip, 'porn.' It was a program with an innocuous name, tucked in the operating system directory, that would display some rather nice boobs made out of ASCII characters on an 80x25 character display. I enjoyed that program.

bdisl · 20h ago
Good thing they are not forced to use the “spicy” setting.
JKCalhoun · 17h ago
I wonder if, had they just labeled it the "Show me porn" setting, all of this misunderstanding could have been eliminated.
nickthegreek · 17h ago
I believe only 10% of the output was to that level of "spice", which makes me question the intent of the "spicy" setting, unless it is just to separate a fool from his money.
TuringTest · 20h ago
> She asked for a “spicy” image and she got exactly that.

But, but, what if she just wanted to see her in a Spice Girls costume??

archagon · 9h ago
Just as a reminder:

"Fine Taylor … you win … I will give you a child and guard your cats with my life." —Elon Musk, Sept. 2024

Ick. Ick. Ick.