I miss the old days when Facebook was simply a fun way to reconnect with friends and family who lived far away. Unfortunately, those days are gone. It feels like an over-engineered, attention-hogging system that collects vast amounts of data and puts people's mental health at risk along the way.
cornfieldlabs · 49m ago
I am building one with a chronological feed and no public profiles.
You need to already know someone to find them here.
Check out the waitlist! https://waitlist-tx.pages.dev/
Edit: Here are some rough layout designs https://drive.google.com/drive/folders/1uLwnXDdUsC9hMZBa1ysR...
It's intentionally simple.
Your landing page talks about all the right goals. postx is a fine placeholder name, but I'd recommend ideating a better one for launch.
Looking forward to it; wish you the best.
cornfieldlabs · 26m ago
Thank you, Anand, for the encouragement and for joining the waitlist!
It really means a lot to us.
We are working on a better name and the site!
I'll send you the welcome email manually soon!
cornfieldlabs · 21m ago
New users will face the empty-feed problem, since by design you can't find anyone without their code.
No "People you may know" or "select at least N interests or follow N accounts to continue".
I think early adopters will invite their friends to join and that is the only way.
Got any suggestions?
msgodel · 2h ago
From the very beginning Facebook has been an AI wearing your friends as a skinsuit. People are only just starting to notice now.
d_watt · 1h ago
Perhaps it's naive to say, but I think there was the briefest moment, back when your status updates started with "is", feeds were chronological, and photos and links weren't pushed over text, when it wasn't an adversarial actor to one's wellbeing.
smeej · 16m ago
There was an even briefer moment where there was no such thing as status updates. You didn't have a "wall." The point wasn't to post about your own life. You could go leave public messages on other people's profiles. And you could poke them. And that was about it.
I remember complaining like hell when the wall came out, that it was the beginning of the end. But this was before publicly recording your own thoughts somewhere everyone could see was commonplace, so I did it by messaging my friends on AIM.
And then when the Feed came out? It was received as creepy and stalkerish. And there are now (young) adults born in the time since who can't even fathom a world without ubiquitous feeds in your pocket.
Call me nostalgic, but we were saner then.
prisenco · 1h ago
The early, organic days of social networking are always fun. They never would have pulled in billions of users if they had started off the way they are now.
cornfieldlabs · 18m ago
Couldn't have said it better.
Nothing is a social network anymore.
Everything is a content-consumption platform now.
People just want to scroll and scroll.
mysterydip · 1h ago
Well they had to grow the userbase before they could abuse it :)
labster · 1h ago
Nah, not from the very beginning. Before the News Feed, The Facebook was great for finding people and keeping in contact. Following someone's page too often was called "Facebook stalking" and was socially discouraged.
Unfortunately parasocial behavior is good for engagement.
xyst · 43m ago
These days I treat Facebook as a marketplace for offloading lightly used items.
Social media is dead to me.
idiotsecant · 5m ago
What a coincidence, I use it as a market for buying lightly used items.
npalli · 57m ago
So Feb 4, 2004 (founding) to September 6, 2006 (newsfeed). LOL.
droopyEyelids · 2h ago
This is a real Rip Van Winkle style take (posted with gentle humor)
ants_everywhere · 3h ago
This is why I asked family not to post pictures of my children on Facebook.
They will get to decide what to do with their likenesses when they're older. It seemed cruel to let Facebook train a model on them from the time they were babies until they first start using social media in earnest.
mitthrowaway2 · 1h ago
Some cultures long avoided being photographed, because they believed the camera would steal their soul.
It took the rest of us much longer to realize they were right.
chii · 4m ago
> the camera would steal their soul.
It wasn't the camera doing the stealing, but the holder of the photo (Facebook, in this case)! And it wasn't the soul being stolen, but money!
qntmfred · 1h ago
tf are you talking about. cameras don't steal souls.
phyzix5761 · 52m ago
Maybe by "soul" they mean identity, which is kind of what's happening here.
heavyset_go · 26m ago
It's a metaphor.
jwr · 2h ago
In some countries (notably Poland) Facebook is so burned into people's brains that you can't avoid this, and if you try, people and institutions will consider you a tinfoil hat weirdo and put pressure on you.
Basically every kindergarten, primary school and high school will want to post pictures.
throwacct · 1h ago
I don't care if they label me a weirdo. I agree with OP. Please refrain from posting any pictures of my children. Simple as that.
sebmellen · 2h ago
Since Facebook is pulling from the camera roll, not posting is not an adequate defense.
zhivota · 2h ago
The only logical thing to do personally is to take it completely off your mobile devices. You still get caught in the dragnet if friends and family post photos of you.
Also, in many places WhatsApp is practically a requirement for daily life, which is frustrating. What I need is some kind of restricted app sandbox in which to place untrustworthy apps, where they see a fake filesystem, fake system calls, etc.
latentsea · 2h ago
On Android you can just make a separate user profile for it and do that I suppose.
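For anyone wondering what that looks like in practice, here is a rough sketch (an illustration, not a hardened setup) that drives adb from Python to create a separate Android user for untrusted apps; the profile name is arbitrary, and exact pm/am behavior varies across Android versions and OEMs:

    import re
    import subprocess

    def adb(*args: str) -> str:
        """Run an adb command against the connected device and return its output."""
        result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
        return result.stdout.strip()

    def create_isolated_user(name: str = "Untrusted") -> int:
        """Create a separate Android user so untrusted apps can't see the main profile's data."""
        out = adb("shell", "pm", "create-user", name)  # e.g. "Success: created user id 11"
        match = re.search(r"user id (\d+)", out)
        if match is None:
            raise RuntimeError(f"unexpected output from pm create-user: {out}")
        return int(match.group(1))

    if __name__ == "__main__":
        user_id = create_isolated_user()
        adb("shell", "am", "switch-user", str(user_id))  # jump into the isolated profile
        print(f"Created and switched to user {user_id}; install untrusted apps only there.")

Apps installed under that user get their own app data and their own media storage, which is roughly the closest stock Android gets to the "fake filesystem" idea above (a work profile managed by a profile-owner app is the other common option).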
dzhiurgis · 55m ago
Caught in what tho?
dangus · 2h ago
Recent iOS versions have granular controls over library access to prevent this.
bnjms · 2h ago
It isn’t nice to use though.
You select your pictures, then when you need to add more you have to go back into the settings for that app, select the new picture, and then add it.
I'm grateful, though. Back in the day we would have called Meta malware.
what · 1h ago
The built-in camera roll widget lets you edit which pictures are allowed without going into Settings. Maybe it's a new change, or the apps you use have a custom photo picker, I dunno.
dangus · 1h ago
It’s not that clunky anymore. You can limit access to the library to pick media from and it’ll give you the full library with this message:
Limited Access to Your Library
"App" can only access the items that you select. The app can add to your library even if no items are selected.
dzhiurgis · 51m ago
I try to use web versions of everything (fb, insta, x). If it’s shitty enough I’ll use it less.
E.g. messenger.com is possible to use if you request the desktop version, change the font size, and deal with all sorts of zoom issues. Of course, fb doesn't support actual calls or notifications, just because, so I don't use it.
Instagram is even sneakier: via mobile web you can't post stories to "close friends", post videos, or view them from instant messages.
bigfatkitten · 32m ago
There’s some irony in the fact that the company which spawned React has also produced some of the world’s least usable React apps.
sneak · 9m ago
The way you prevent this is by deleting your Facebook account and uninstalling the app.
huhkerrf · 18m ago
I did the same. And then my mother-in-law decided to ignore my requests. And then my mother got angry. And then I caved.
goku12 · 3h ago
This is truly egregious. Facebook and Instagram are installed by default on many android phones and cannot be fully uninstalled. And even if asked for consent, many people may choose the harmful option by mistake or due to lack of awareness. It's alarming that these companies cannot be held to even the bare minimum standards of ethics.
As an aside, there was a discussion a few days back where someone argued that being locked in to popular and abusive social/messaging platforms like these is an acceptable compromise if it means retaining online contact with everyone you know. Well, this is precisely the sort of apathy that gives these platforms the power to abuse their market share so blatantly. And it doesn't affect only the people who choose to be irresponsible about privacy; it also drags ignorant and unwilling participants under the influence of this spyware.
ethan_smith · 2h ago
You can use ADB (Android Debug Bridge) to disable pre-installed Facebook/Instagram apps without root via `pm disable-user` commands, effectively preventing them from running or collecting data.
baobun · 34m ago
Better make it a script or an Ansible playbook from the start, since you will need to reapply it after system updates.
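A minimal sketch of such a script, assuming the stock package names below (these are common preload names but vary by OEM; check `adb shell pm list packages | grep facebook` on your own device) and that `pm disable-user` works without root on your build:

    import subprocess

    # Assumed package names for common Facebook/Instagram preloads; adjust for your device.
    PACKAGES = [
        "com.facebook.katana",      # Facebook
        "com.facebook.appmanager",  # Facebook App Manager
        "com.facebook.services",    # Facebook Services
        "com.facebook.system",      # Facebook App Installer
        "com.instagram.android",    # Instagram
    ]

    def disable(package: str) -> None:
        """Disable a preinstalled package for the primary user (no root required)."""
        subprocess.run(
            ["adb", "shell", "pm", "disable-user", "--user", "0", package],
            check=False,  # ignore packages that aren't present on this device
        )

    if __name__ == "__main__":
        # Re-run after every system update, since updates can silently re-enable preloads.
        for pkg in PACKAGES:
            disable(pkg)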
goku12 · 1h ago
That's what I did. But as others point out, how many know about this? And modifications are getting harder by the year. They are relying on these factors to ensure that the majority of the population remains exploitable.
dylan604 · 2h ago
which what, 0.5% of users will know and be able to do?
esseph · 2h ago
That number is way way way way way too high
Jackson__ · 39m ago
Curious, is this really necessary? I'd assume the total number of public images posted on Meta services is in the trillions.
ipsum2 · 1m ago
I imagine many people will react only to the headline and not read the article, but:
"Meta tells The Verge that, for now, it’s not training on your unpublished photos with this new feature. “[The Verge’s headline] implies we are currently training our AI models with these photos, which we aren’t. This test doesn’t use people’s photos to improve or train our AI models,”
Speaking as someone familiar with the ML space, I agree it seems unlikely that adding private photos would significantly improve the models, as you mentioned.
toofy · 3h ago
how long until we find out that the brand new government/palantir deal is using these photos as well against citizens?
i give it a year or less.
Yesterday.[1]
[1] https://www.cincinnati.com/story/news/2025/06/26/jd-vance-me...
According to the thread on /r/europe that person smoked weed and lied about it on their immigration form.
hedora · 44s ago
So, they engaged in behavior that’s legal at Facebook HQ?
In other news, FB has been using whatsapp metadata to coordinate genocide campaigns in Gaza. What’d all those dead civilians (including infants) do, again?
Presumably they signed a TOS, so it’s OK.
bigiain · 1h ago
I look forward to the schadenfreude I will feel when someone makes the right FOI request and we discover this "feature" was built by Meta at the request of the NSA or the FBI or some other government TLA.
dzhiurgis · 49m ago
If you have that much trouble with the government, I don't think deleting Facebook will change anything.
aetherspawn · 2h ago
iOS -> Settings -> Privacy and Security -> Photos -> Facebook -> Set limited access
msgodel · 2h ago
You'd have to block nearly every app from ever seeing any image you don't want Facebook getting ahold of, including apps made by other companies. Almost everyone uses their libraries; they practically have a shell on your phone (which is funny, because you're not allowed one on your own device, for "security").
AJ007 · 3h ago
Very helpful for ad targeting. As Apple kills tracking and ramps up its own ad business, Meta will need to collect as many signals as possible.
Maybe this will finally convince people to throw out their smartphones.
IncreasePosts · 2h ago
I wonder how many pieces of code at facebook there are with guards like
    if (userId == 1) {
        // don't add mark's data to training set
    }
polyomino · 2h ago
Mark's user id is 4
samlinnfer · 2h ago
Don’t worry, I upload Zuck’s photos to facebook for him.
SoftTalker · 2h ago
LOL at the idea that he uses Facebook. None of the silicon valley bigwigs or their kids have anything to do with social media tech except in perhaps very controlled, orchestrated ways. The normal users are just "dumb fucks."
kevingadd · 2h ago
This seems like a liability nightmare. If they're just scanning all the image files on people's devices and using them for training, they're inevitably going to scoop up nudes without permission, not to mention the occasional CSAM or gore photo, right? Why would you want to risk having stuff like that sneak into your training set when you already have access to all people's public photos?
heavyset_go · 11m ago
It's simple, they don't care.
latentsea · 1h ago
The purpose of a system is what it does. To that end it could actually be a plot by the CIA to find targets with this type of material on their devices, which can then be used against them to turn them into assets.
sebmellen · 2h ago
I’m sure they use a provider like Hive to scan all the photos before processing them.
xyst · 45m ago
Data and people are the commodities in this AI gold rush. The primary beneficiaries are big tech.
deadbabe · 47m ago
Would it be any better if Facebook hired photographers to walk around cities and major events and just photograph random people doing stuff? AI will get hungrier.
dzhiurgis · 48m ago
Wonder if you could DDoS it by taking selfies with AI-generated faces in the background.
The non-paywalled TechCrunch story shows the consent screen that people agree to before Facebook uses the photos in this way: https://techcrunch.com/2025/06/27/facebook-is-asking-to-use-...
The Verge's clickbait headline makes it sound like Facebook is using private photos without the user's knowledge/consent. The paywalled next paragraph explains that this is not the case.
I encourage everyone to look at that screenshot and decide for yourself if the media coverage is reasonable here.
eviks · 45m ago
> The Verge’s clickbait headline makes it sound like Facebook is using private photos without the user’s knowledge/consent.
Nah, it's the company's reputation that attaches malice in your mind to an innocent headline.
alex1138 · 2h ago
Facebook has, though, historically been less than honest about consent
I bet "agree to" is "we clicked the box for you anyway"
dylan604 · 2h ago
Oops, we totally didn't mean to, but an undiscovered bug did not obey the check box and slurped in everything anyways.
bigiain · 1h ago
"Somebody moved fast and broke things. We have no idea why they thought that was appropriate behaviour on production systems, it's completely against company policy."
It's surprising(not) how that class of error always seems to fall on the side of Facebook grabbing more data without consent, and never on the side of accidentally increasing user privacy.
ashdksnndck · 1h ago
How would you know? If Facebook has a bug that accidentally increases user privacy, does The Verge write an article about it?
jiggawatts · 2h ago
My KPIs? I don’t see what my new Lamborghini has to do with anything!
ashdksnndck · 1h ago
Maybe you should get a job at The Verge!
I’m sure if you log the Facebook app’s network traffic on your phone and show that it uploads photos without you clicking on the agree button, they’ll happily publish an article about your findings.
JKCalhoun · 2h ago
Curious about accounts that have been deactivated/deleted.
bigiain · 1h ago
Mine has been deleted for almost 10 years now. I fully assume they've retained and are mining every post I made, every photo I uploaded, and every interaction I ever had on FB, and are still using FB tracking pixels on every website running them to feed more data about me into my profile - and are not only selling that to advertisers but are now training their AI on it without consent at every opportunity.
shakna · 2h ago
They trained on LibGen without qualms. There's little reason to expect they'll treat their users with any more respect.
wat10000 · 2h ago
The plans were available in the basement, behind the door that says “beware of the leopard.”
Nothing on that screen says they’re using your photos for training. I’m sure it’s in the linked terms, but Facebook knows those won’t be read.
ashdksnndck · 1h ago
The consent screen says "upload it to our cloud on an ongoing basis" and "analyzed by Meta AI". To me that seems like a reasonable level of explanation for non-technical users. Most people don't know what it means to "train" an AI, but reading that Meta is processing the photos in the cloud and analyzing them with AI gives them some picture.
This isn’t buried. The user has to see the screen and click accept for their photos to be uploaded.
Compared to the usual buried disclaimers, vague references to "improving services," and consenting to 1,000 things when you sign up for an account, this is pretty transparent. If someone is concerned, they at least have a clear opportunity to decline before anything gets uploaded.
It's just surprising to me that people look at this example of Facebook going out of their way not to do the bad thing and respond with a bunch of comments about how they're doing the bad thing.
basilgohar · 36m ago
This is a pretty generous take. You even highlight that most people won't know what this means, then handwave away the concerns of people who DO know what it means, and assert most people won't accept it if they did understand it.
ashdksnndck · 2m ago
> assert most people won't accept it if they did understand it
I didn’t make that assertion. I think most people don’t care if their photos are used to train an AI model as long as Facebook doesn’t post the photos publicly where people can see them. But I’m aware some people dislike AI and/or have strong beliefs about how data should be used and disagree. It makes sense to give those people an opportunity to say no, so it seems like a good thing that the feature is opt-in rather than an opt-out buried in a menu.
paulnpace · 2h ago
What does something like this look like from the other side? Do users just agree to everything put in their face? The copy there sounds like it's a really convenient fun new thing.
msgodel · 2h ago
Have you ever watched a "normal" person interact with a modal dialog? They don't even read it; they just spam whatever button they think will make it go away.