They specifically created a "spicy" mode, I'm quite sure they know what they're doing. I'm sure we'll get a shocked pikachu face statement soon saying they couldn't possibly have predicted this would happen and they'll make double triple sure it never happens again.
It is... interesting to see xAI's moves lately. Other AI companies seem very determined to present a publicly acceptable face; xAI seems to be tacking toward the gap in the market around adult content. If you browsed Reddit's /r/grok a couple of months ago it looked a lot like /r/openai. Now it's full of people talking about the "Ani" AI companion and the various adult things you can get her to do.
I'm sure they'll make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates.
Cthulhu_ · 3m ago
> I'm sure they'll make an absolute ton of money from the male loneliness epidemic.
They and others; there are probably dozens of projects popping up to create multimodal companion AIs. character.ai specializes in fictional or custom characters, Discord has bots that can be given a personality of sorts, Facebook is pushing hard for it, even having its AI bots message you instead of only replying, etc.
It's happening already, and if xAI doesn't join in they may lose a possible market opportunity, depending on whether 'companion' AI sticks.
Joker_vD · 24m ago
> At that point, all Weatherbed did was select "spicy" and confirm her birth date for Grok to generate a clip of Swift tearing "off her clothes" and "dancing in a thong" in front of "a largely indifferent AI-generated crowd."
I am honestly baffled at what she expected the output would, or should, be. Like, seriously, what?
Aurornis · 15m ago
I don’t think it’s reasonable at all for “spicy” to instantly take someone to fake pornography of a specific person.
When I saw the “spicy” option I thought it was about being sassy or playfully mean instead of the sanitized LLM voice.
Not straight to porn.
firefax · 5m ago
On the other hand, Swift is a public figure, and with that comes parody -- I'd be more sympathetic if it was a private individual, but existing harassment and defamation laws could handle that most likely. Fake (insert celebrity) Photoshops have been with us since the 90s.
>When I saw the “spicy” option I thought it was about being sassy or playfully mean instead of the sanitized LLM voice.
Yeah, I thought it would just not have NSFW restrictions. I find funny the idea that someone asks "What's the value of twenty dollars from 1920 adjusted for inflation" and suddenly you've got the guy from the Gangnam Style video screaming into a butt but it's the Pope's face or something.
chasing0entropy · 7m ago
Open a .xxx site.. Straight to porn.
Click on any word related to flavors... Straight to porn.
Clear your browser cache... Believe it or not, straight to porn.
fkyoureadthedoc · 9m ago
>> dancing in a thong
> porn
uhhh
persedes · 1m ago
I believe creating "spicy" content without the person's consent and charging money for it is more the issue here.
AIPedant · 12m ago
Probably the same as when she did the same test with children or explicitly asked for "nonconsensual nude" - the system refused to generate pornographic images because of safeguards.
Weatherbed noted that asking Grok directly to generate non-consensual nude Swift images did not generate offensive outputs, but instead blank boxes. Grok also seemingly won't accept prompts to alter Swift's appearance in other ways, like making her appear to be overweight. And when Weatherbed tested using "spicy" mode on images of children, for example, Grok refused to depict kids inappropriately.
permo-w · 4m ago
this is a misleading quote. the prompt was "Taylor Swift celebrating Coachella with the boys". that prompt plus "spicy" is still asking for trouble, but your quote makes it sound like they asked for 'Swift tearing "off her clothes" and "dancing in a thong"'
afavour · 18m ago
I dunno, there’s a pretty big wealth of possibility between “entirely chaste” and “literally rips off her clothes”
baggy_trough · 23m ago
The Puritans are here to save us from our immorality.
JohnFen · 9m ago
You mean the immorality of sexually exploiting other people without their consent?
skeezyboy · 6m ago
the immoral part makes it even sexier
tr_user · 10m ago
you assume he cares about falling birthrates.
morkalork · 15m ago
>make an absolute ton of money from the male loneliness epidemic. All the while Musk decries falling birth rates
There's something about profiting from both the problem and its solution here. Like how Sam A. profits from ChatGPT being used to create human-like bots on the internet while also being involved in that eyeball-scanning blockchain company that's selling a human-verification solution to the same problems caused by OpenAI.
I guess also similar to starve the beast strategy: cause a problem and exploit it for narrative gain.
skeezyboy · 2m ago
the blockchain company came before ai though
ToucanLoucan · 17m ago
I mean, there is also the little part of the story where a woman, irrespective of how famous she might be, is having adult content rendered with her likeness without her consent.
I personally don't give two fucks if Grok can make porn. Neat, I might actually use it then. That doesn't make deepfakes less troubling.
chasing0entropy · 4m ago
There's a pretty obvious solution that, as technology-minded people, is easy for us to accept, and that as a society we will have no choice but to shift into: no image is real, they are all deepfakes, even the real ones.
the_af · 13m ago
That's the real issue in my opinion. Not the "sexy pics", but the "without consent" thing.
thrance · 17m ago
Just more vice signaling from this awful company. Like with the white genocide or mechahitler "controversies", they're doing it to pander to the mentally ill edgelord demographic who thinks saying "heil hitler" is the coolest shit. And it's working, they're all worshipping Musk like he isn't the pathetic excuse for a human being he is.
andsoitis · 22m ago
Body of article:
> Weatherbed asked to depict "Taylor Swift celebrating Coachella with the boys."
> Weatherbed selected "spicy" and confirmed her birth date
> Grok produced more than 30 images of Swift in revealing clothing
> Weatherbed was shocked to discover the video generator spat out topless images
Headline: "Grok generates fake Taylor Swift nudes without being asked"
I wonder what the reporter was looking for in a spicy image of Swift celebrating Coachella with the boys? I would not call that unprompted.
Fake outrage much?
fourseventy · 4m ago
Probably Tesla short sellers or people with Elon Derangement Syndrome.
skeezyboy · 2m ago
remember when elon literally cried about the short sellers
bayindirh · 18m ago
While spicy doesn't have to mean actual hot peppers, it doesn't have to mean flat-out nudity either, IMHO.
Yes, xAI is playing to the stands here, but they could be more creative and tasteful about this.
Who am I kidding with the last paragraph, but eh.
Also, somebody put it so much better than me: https://news.ycombinator.com/item?id=44811611
Google "coachella women topless" and you get a vibe for the range of expression at Coachella.
I would not describe "topless" as "flat out nudity".
Even if people are still outraged, I should also point out that we're trusting what the reporting is saying without any evidence...
bayindirh · 8m ago
Does it give permission to AI companies to create (deep)fakes on demand with no repercussions, though?
What would happen if you generate them at your home with your own model and release them to the public?
When somebody does something, that person is crucified and cancelled (and even prosecuted), but when a company does this, it's called free viral marketing. This is neither fair nor ethical.
andsoitis · 1m ago
What are you upset about? That a tool is able (and willing) to create an image in someone's likeness if the user of the tool asks for it?
And you advocate that the tool should refuse?
BobbyTables2 · 33m ago
Is this meant to be some sort of viral marketing?
wongarsu · 19m ago
This is the first time I'm hearing about Grok being able to generate videos. So if it's viral marketing it is working
I would try it out on more innocent things, but as usual with Grok I'm not sure where to even start. There is the X bot, the dedicated website, the iOS app and the Android app, and features between them never match up. Based on the reporting this might be available in the X bot and the iOS app, but not the website or the Android app? I've never been able to make sense of their feature rollout beyond "it's probably on whatever Elon is using"
skeezyboy · 33m ago
little bobby tables as i live and breathe!
stuckinhell · 27m ago
probably, there is huge pent up demand for "spicy" ai and image gen.
Reddit is full of it.
Elon Musk is clearly trying to differentiate his product virally, and its working.
nerdjon · 7m ago
While I do think the "without being asked" spin on this is likely wrong (I assume the "spicy" option does more than just relax a "don't do X" rule; the system prompt was likely modified to encourage this kind of output even when the user didn't specify it, so choosing "spicy" is in fact asking).
That does not change the troubling nature of creating fakes like this, which to me is the real issue, not anything about "without being asked", since that implies it would have been fine if the user had specifically asked for this... which it still isn't.
No one should have images of themselves made and circulated like this that they did not consent to.
losvedir · 11m ago
xAI trying to win the AI wars the way VHS beat Betamax.
nomdep · 13m ago
I’ve said it before: those who despise the porn industry should be thrilled with AI-generated porn, as it will likely destroy the human-generated porn business.
But I suppose what they truly despise are the porn consumers.
Nzen · 12m ago
tl;dr reporter used the prompt "Taylor Swift celebrating at Coachella with the boys" with the 'spicy' image setting (other options: normal, fun, custom). Notes that Twitter must comply with the Take It Down Act [0].
I should think that using a 'spicy' image setting would be tantamount to asking for a nude or titillating image. Whether Twitter should offer that setting is apt to produce a more interesting conversation.
nomdep · 31m ago
She asked for a “spicy” image and she got exactly that. Somebody has to stop this nefarious Internet thing. Think of the children! /s
cestith · 20m ago
Spicy is one thing. A fake nude is a violation of image rights and of bodily autonomy. It’d be different if there were real nude photos of her released out in the public with her permission.
simianwords · 14m ago
i think the more this becomes normalised the less taboo it will become. people will no longer care about it
olalonde · 14m ago
> A fake nude is a violation of image rights and of bodily autonomy.
Disagree.
aredox · 2m ago
Of course you disagree, you are not the one being targeted.
I wonder how long you would stand being impersonated in various situations. Or your parents. Or your significant other. Or your (adult) kids. Would you still tell them to their face there is "no violation"?
Barrin92 · 5m ago
Personality rights are a thing; publishing someone's likeness without their consent, particularly in a commercial context, is illegal in just about every jurisdiction on earth.
JKCalhoun · 26m ago
I suspect the "general public" won't see it that way. Many will find the allowance for a "spicy" setting kind of gross.
Joker_vD · 22m ago
This setting is put behind the age "verification".
bananapub · 18m ago
it's a pretty reasonable position for someone to think "a company now sells a service that lets you generate fake porn of any human" is a bad thing and shouldn't be allowed, regardless of whether you make people jump through some notional age-verification hoop
qualeed · 9m ago
>it's a pretty reasonable position for someone to think "a company now sells a service that lets you generate fake porn of any human" is a bad thing and shouldn't be allowed,
There is a surprising number of people here on HN (and I imagine elsewhere) that think generating fake porn of real people without their consent is totally fine, if not their right to do so.
You can see some of them cropping up in the comments here already.
bdisl · 20m ago
Good thing they are not forced to use the “spicy” setting.
TuringTest · 18m ago
> She asked for a “spicy” image and she got exactly that.
But, but, what if she just wanted to see her in a Spice Girls costume??
[0] https://www.congress.gov/crs_external_products/LSB/PDF/LSB11...