Key point that makes Meta more culpable than usual:
> While many were created by users with a Meta tool for building chatbots, Reuters discovered that a Meta employee had produced at least three, including two Taylor Swift “parody” bots.
dlock17 · 1h ago
Somehow I don't think Visa, MasterCard and PayPal are going to shut down Meta's accounts over these pornographic (and pedophilic) images.
The real problem is that Meta can test the waters with crap like this and get away scot-free, with maybe a few settled lawsuits at worst.
add-sub-mul-div · 1h ago
The story about the Kendall Jenner chatbot that lured the old guy to his death is wild.
It was supposed to be dinosaurs that science would create and be unable to stop from killing us. It's so much less cool to be killed by a simulation of an influencer who has nothing worthwhile to simulate in the first place.
betterhealth12 · 6m ago
I watched this story and couldn't believe it when the actor responds to the anchor:
> "That I know of that have lost money (to scammers impersonating him)? It's in the 100s... I see people come to my appearances and look at me like we've had a relationship online for a couple of years and I'm like, no, I'm so sorry, I don't know who you are. It's so sad, and you see the devastation."
youtube.com/watch?v=ghmvOP6Daso
At 1:52:00 in this DOAC video Steven says his team spends 30% of their time sorting through deepfake ads, to the extent he had to hire someone whose exclusive job is to spot scam videos and report them to FB etc.:
https://youtu.be/JMYQmGfTltY?si=ntuDgXuhMYj2fh5z&t=6706
I feel like there's a big undercurrent brewing, but because the individual damages are not high enough and victims have limited recourse, nothing significant happens.