Third party cookies must be removed

430 points by pabs3 | 219 comments | 5/2/2025, 1:03:10 AM | w3ctag.github.io


growthwtf · 20h ago
What a weird piece of writing. Is this like just chicken scratch? Or is this seriously some kind of part of the W3C working process?

Section 2: Third party cookies have gotten bad. Ok.

Section 3: There are legitimate use cases that third party cookies currently cover. Also ok. Then they throw in, "Be aware that a set of new technologies which carry minimal risk individually, could be used in combination for tracking or profiling of web users." Yes? Huge scope increase in the document though and all of a sudden we're now talking about tons of tracking technologies in aggregate? The authors move on without further comment.

Section 4: I think the first half is essentially saying that new technology coming online in the web platform will make the third-party cookie problem worse, so we should fix it soon. OK, I'm back with you. Then the document suddenly pivots to proposing general standards for web privacy again, saying that the burden of proof is on the people originating a proposal, before concluding (apparently without irony?) that the impact on businesses of removing third-party cookies is outside the scope of the document.

I'm missing a ton of cultural context here about how the W3C works, so I'm guessing this probably amounts to rough notes that somebody intends to clean up later, that I'm being overly critical of, and that they didn't expect to get any traction on Hacker News.

bilekas · 19h ago
It's the W3C... They've never been the most coherent with standards, ironically.
IshKebab · 11h ago
Isn't W3C fairly irrelevant these days?
FinnKuhn · 4h ago
The EU uses the WCAG 2.0 to define web accessibility in multiple acts and directives, some of which were passed pretty recently.[1]

[1] https://www.w3.org/WAI/policies/european-union/

hackrmn · 8h ago
They're very far from irrelevant; it depends on what kind of web development you do, I would say -- I have been writing WebAssembly by hand (I mean, a lot can be said about that, but it's a thing) and the spec is authored by the W3C. There's plenty of other things they author, like, you know, any one of the many _CSS_-related specifications.

It's just that with the modern Web 7.0 (or whatever version we're on now), it's WHATWG that's most prominent, since there's that one spec that defines 90% of what happens on the web, called "The HTML Standard" or some such. Then you have Google de facto authoring specs, which may or may not find their way back into the HTML document, but even if they don't, they do make you feel like the W3C is left behind.

motorest · 18h ago
...or it's a design by committee thing, and some people in the room are doing their best to preserve current and future tracking technology.
bilekas · 17h ago
It's exactly this: there is a group who come together and never agree on rules, but when they do, they never enforce them. It's, I believe, the definition of a paper tiger, sadly. A great idea executed horribly.
__alexs · 15h ago
Standards bodies rarely enforce rules themselves.
squigz · 14h ago
Is it really on the W3C to enforce standards? How would that even work?
chasd00 · 6h ago
If they had a clear test that declares a site W3C-compliant or not, with no wiggle room, then they could work with something like the ADA or other accessibility-related standards and make W3C compliance required for ADA compliance.
lukan · 13h ago
By shipping their own reference browser...
squigz · 13h ago
In what way would that enforce standards?
lukan · 12h ago
Well, the same way Google can enforce their standards via Chrome.

(I did not say it is a realistic goal for a theoretical committee)

victorbjorklund · 9h ago
The only reason Chrome can do that is because it has a huge chunk of the market. It does not work for a browser with no users.
echoangle · 11h ago
So not at all? Shipping something in chrome isn’t enforcing a standard in my opinion. Enforcing a standard would be a regulatory thing, like having to use USB-C in certain situations.
lukan · 11h ago
Chrome is in a monopoly position. If they decide to ship a new feature, then all the other browsers need to implement it as well, or their users assume their browser is broken.
squigz · 9h ago
Okay but that's still not the same as enforcing a standard, in any way... You're suggesting the W3C should simply roll a "reference browser" that supplants Chrome so they can force standards on users themselves. That really doesn't seem like a great way to do it.
motorest · 16h ago
> A great idea executed horribly.

No. It's sabotage.

milesrout · 15h ago
Never attribute to malice etc.
consp · 14h ago
Design by committee is more likely malice than accident or stupidity. Some actors work towards goals which are good for them but malicious for the majority.
1oooqooq · 5h ago
Exactly.

I don't understand how everyone ignores that the W3C is mostly staffed by companies in adtech.

Their goal is to keep adtech viable and profitable. Microsoft with IE, and then Google with Chrome, are just extra pushes to this end, but the main effort is the W3C.

Disclaimer: I was one of the aforementioned grunts in a more naive life.

andrewla · 6h ago
What are the legitimate use cases for third-party cookies?

The only one I can think of is if there is a single logical site spread across multiple domains, and you want to be able to maintain a single session across both domains, but are not willing (for aesthetic reasons or technical reasons) to encode this information in the links while moving between domains.

Are there others?

As far as I'm concerned I don't even want first-party cookies to be available when accessed through a third-party context (i.e. if I have a cookie on a.com, and b.com fetches an image from a.com, I don't want that image request to send my a.com cookie).

My preference for this entire discussion is that we eliminate cookies entirely, but use self-signed client certificates very aggressively. When you navigate to a URL, the user agent sends a client certificate specific to that domain. Any subsidiary fetches use a (deterministic) scoped client certificate specific to the subsidiary domain. All other state tracking is required to be server-side and linked to the certificate. The user can instruct their user agent to reset any of these.
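A rough sketch of the deterministic derivation I have in mind, assuming a per-profile master secret and Node's built-in HKDF (turning the seed into an actual TLS client certificate is left out):

  import { hkdfSync, randomBytes } from "node:crypto";

  // Hypothetical: one master secret per browser profile. Resetting it
  // (or bumping the per-domain counter) gives the user a fresh identity.
  const masterSecret = randomBytes(32);

  // Same profile + same domain -> same seed, so example.com always sees
  // one stable identity, while every other domain sees an unlinkable one.
  function perDomainSeed(domain: string, resetCounter = 0): Buffer {
    return Buffer.from(
      hkdfSync("sha256", masterSecret, `reset:${resetCounter}`, `client-cert:${domain}`, 32),
    );
  }

  // A real user agent would use this as the private-key seed of a
  // self-signed certificate presented in the TLS handshake.
  const seed = perDomainSeed("example.com");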

svieira · 4h ago
> What are the legitimate use cases for third-party cookies?

Embeds

1. As a user in BigCorp I want to embed this new AI reporting software into my larger internal portal for certain groups so they can try it out. The embedded software needs to know who the user is for authorization purposes.

2. As an end user of multiple appliance websites I would like to not have to enter my information multiple times into the support chat bot, since all of these various companies are part of the same overarching corporation and they already know my information.

3. As an end user of multiple blogs all using the same third-party commenting system I would like to already be logged in on foo.blog if I logged in earlier on bar.weblog.

These are all nice convenience features that builders and end users have gotten used to. They can all be achieved in other ways. The other ways are more painful to integrate and require a lot more work, both on the part of the integrator and on the part of the end user (at least from a machine perspective - consider how much more work a pre-authenticated (no end-user-visible interstitial screens) OAuth flow is from the browser's perspective than a single request with an attached cookie).

andrewla · 3h ago
Cases 1 & 2 are manageable through URL parameters, since there is shared infrastructure on both ends. These can be done without any user-visible workflow changes.

Case 3 feels the most like a legitimate use case. Even with URL parameters, there's no way for Disqus (or whatever) to know that a.com's user is the same person as b.com's user, since those are independent entities. It is still solvable, but not in a zero-click way, by having a round-trip redirect (like an OAuth flow) to associate my user on the blog with my user on the commenting platform. But that does require that the blog track my session, which is not a requirement in the current world.

On the other hand, I'm not sure how much I like having #3 work in general -- that a blog commenting platform can implicitly know what blogs I'm visiting, and potentially even having a blog on which I have no account implicitly having access to my account on other blogs, feels a bit intrusive to me. I'd rather have that integration at the level of a browser extension that modifies user agent behavior rather than something that happens without my permission.

svieira · 1h ago
> Cases 1 & 2 are manageable through url parameters

Any URL parameter that isn't signed is going to be modified (look at me, I'm the CEO now). And if it is signed, then it can be leaked (web app logs generally log query string parameters; if it's in a link that can be copy-pasted, it can be widely shared by accident; etc.)
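To illustrate, a sketch of the kind of signed, expiring parameter we're talking about (all names hypothetical); the signature stops tampering, but as noted the token still leaks anywhere the URL does:

  import { createHmac, timingSafeEqual } from "node:crypto";

  const SHARED_SECRET = "embedder-and-embed-shared-secret"; // hypothetical

  // The embedding site mints ?user=alice&exp=...&sig=...
  function signParams(user: string, expiresAt: number): string {
    const payload = `user=${user}&exp=${expiresAt}`;
    const sig = createHmac("sha256", SHARED_SECRET).update(payload).digest("hex");
    return `${payload}&sig=${sig}`;
  }

  // The embedded app verifies the signature and expiry before trusting it.
  function verifyParams(payload: string, sig: string): boolean {
    const expected = createHmac("sha256", SHARED_SECRET).update(payload).digest("hex");
    const exp = Number(new URLSearchParams(payload).get("exp"));
    return (
      expected.length === sig.length &&
      timingSafeEqual(Buffer.from(expected), Buffer.from(sig)) &&
      Date.now() < exp
    );
  }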

> On the other hand, I'm not sure how much I like having #3 work in general

Yeah, that is another potential architecture for the web (kind of like Project Xanadu: https://en.wikipedia.org/wiki/Project_Xanadu). It isn't how the web is currently built, so the real question is "how much of the web should break?"

dietr1ch · 6h ago
> Single logical site spread across multiple domains.

Is there really a need for this? I get that subdomains can help routing, but beyond that, sites spreading over multiple domains are chaotic and phishing-prone. People get used to jumping from foo.com to foo.net or scammyfoo.tk and entering their credentials if they look similar. I think that a big part of how password managers help is by keeping passwords from their users and not sharing them with any random domain that merely looks similar or misleading.

horsawlarway · 5h ago
A common need for this is during an acquisition or merger.

It's fine and all to assume that domain is identity, but that doesn't actually map too well to relatively complex organizational hierarchies.

Ex - Bank A and Bank B merge. There is going to be a period where they have to navigate the fact that two domains represent a single organization. It's often a fairly high level of effort to move to a completely new domain, and it won't be done overnight.

Yes - eventually you want to be back on a single domain, and I think there is definitely a world where this leads to some very bad patterns (HR and healthcare are two examples: you've probably seen a login need to bounce between like 5 different domains because they've refused to actually do the technical work to consolidate back onto a single domain, and treat the domain as marketing).

But it's a really valid spot to end up in, and is the most common cause of having a single entity spread out over multiple domains in my experience.

miki123211 · 4h ago
More common are multiple sites which use their own domains for aesthetic/brand reasons, but are actually hosted by the same SaaS provider and could therefore share authentication infrastructure.

Imagine an easy-to-use website builder for restaurants where each restaurant gets a memorable domain, and they let you order things online. It would be great for customers if they didn't have to enter their payment details and shipping address for each new restaurant they order from. Maybe they could even see opening hours and product availability for the closest restaurant to their address. There's no privacy risk here, as all these websites are actually on a single provider anyway. They're just multiple entries in some SQL database, each with a `domain` associated with them.
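A sketch of the multi-tenant lookup described above, with the SQL table stood in by a map and all names hypothetical:

  // One backend serves every restaurant; the Host header picks the tenant.
  import http from "node:http";

  const sites = new Map([ // stands in for rows with a `domain` column
    ["marios-pizza.example", { name: "Mario's Pizza" }],
    ["thai-corner.example", { name: "Thai Corner" }],
  ]);

  http.createServer((req, res) => {
    const site = sites.get(req.headers.host ?? "");
    if (!site) {
      res.writeHead(404);
      res.end();
      return;
    }
    // Same provider, same session store: in principle one login and one
    // payment profile could follow the customer across all these domains.
    res.end(`Welcome to ${site.name}`);
  }).listen(8080);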

anon7000 · 4h ago
Oh yeah, there are definitely valid use cases. If you're a web host, you host many sites under users' domains. There are plenty of features you might want to offer your users - for example, if they're logged into their hosting account, visiting their domain shows them some kind of status or site-editing bar. Or maybe there are social features, like liking posts or commenting on other people's sites without having to create new accounts and log in everywhere. 3rd-party cookies make this possible. Alternatives (at least as of a couple years ago) are often still worse user experiences.

For a modern example, restaurants have online ordering systems, and a lot of them use the same service under the hood (e.g. Toast). If you want to use the same credit card you used somewhere else, you have to log in on every single restaurant site using an SMS code (e.g. "link pay"). Allowing 3rd-party cookies would make that flow faster, since you could visit other restaurants' domains while still being logged into the 3rd-party payment domain (and specifically, logged in inside an iframe, so the restaurant site can't read your payment info).

andrewla · 3h ago
These flows all feel very dangerous to me because they potentially allow a site to access information about me that I have not explicitly allowed.

Take the web hosting example; naively if I visit any site hosted by that company, can they detect that I have an account and am logged in to my hosting account? That feels like a dangerous amount of leakage, and you're relying on the hosting website to make the correct restrictions rather than having it structurally embedded in the user agent.

The shared payment system feels even worse -- is it then possible for a random website to get a payment through this system, or extract information about my payment account?

worik · 3h ago
> What are the legitimate use cases for third-party cookies?

None, looking at it from a web user's perspective

You can make up scenarios that require them, but these are artificial and contrived, and boil down to "I can extract more value"

When you are the one extracting value, and you are an articulate, intelligent person, I expect you will have screeds and screeds of logical-sounding reasons why third-party cookies are good for me as a web user.

You would be wrong

I have been deleting them for years.

dbushell · 19h ago
The "replacement" is already being penned: https://www.w3.org/TR/privacy-preserving-attribution/

Which is just going to be in addition to 3rd-party cookies. Google's own study concluded removing 3rd-party cookies loses revenue and "privacy-preserving" tracking increases revenue: https://support.google.com/admanager/answer/15189422 So they'll just do both: https://privacysandbox.com/news/privacy-sandbox-next-steps/

surajrmal · 17h ago
There are regulatory agencies which have specifically told Google it is not allowed to remove 3rd-party cookies without a replacement: while Google would be able to continue to function fine, their competitors would take a major loss.
JoshTriplett · 13h ago
Sounds like a great argument for running a different browser not developed by an advertising company, and thus not constrained by that.
chrisweekly · 10h ago
Agreed. Curious what HNers feel is the most viable replacement. I'm experimenting with Arc this week...
barnabee · 8h ago
Firefox with uBlock Origin and Privacy Badger at a minimum, other extensions to taste[0]

I’ve also been experimenting with Zen[1], which is Firefox based, recently and it seems quite promising in terms of a nicer default UI.

[0] I like Tab Stash, Vimium C, SponsorBlock, Decentraleyes, DeArrow, Archive Page, among others

[1] https://zen-browser.app/

aftbit · 9h ago
Firefox is alright. I keep around a script called `chrome-new` for those rare cases where I still need Chrome.

  #!/bin/sh
  # Pick a Chrome-like binary if $CHROME isn't already set
  # (later matches take priority, so the dev channel wins if installed).
  if [ -z "$CHROME" ]; then
      test -e "$(which chromium)" && CHROME="chromium"
      test -e "$(which google-chrome)" && CHROME="google-chrome"
      test -e "$(which google-chrome-stable)" && CHROME="google-chrome-stable"
      test -e "$(which google-chrome-dev)" && CHROME="google-chrome-dev"
  fi
  # Throwaway profile in RAM, deleted when the browser exits.
  TMPDIR=$(mktemp -d /dev/shm/chrome-XXXXX)
  "$CHROME" --user-data-dir="$TMPDIR" --no-first-run --no-default-browser-check "$@"
  rm -rf "$TMPDIR"
connicpu · 9h ago
I've been on Firefox for years, it's extremely good these days
move-on-by · 7h ago
I'm unhappy with Firefox's new privacy policy, so I jumped over to Waterfox. It's working well for now, but I'm anxiously awaiting the Ladybird browser.
pas · 14h ago
Do you have links for this? I'm curious about which bodies and what was their argument.
diogocp · 13h ago
dbushell · 13h ago
Seems like the CMA are concerned for other advertisers who profit from 3rd-party cookies, with no concern for users' privacy. That poor billion-dollar industry, how will it cope?
blibble · 9h ago
their mandate is to regulate competition

not privacy

josefx · 12h ago
Another "trusted" third party based tracking system. All I need to know to avoid it even when it is printed on toiletpaper.
dbushell · 10h ago
Yep, definitely "trusted third party". For example:

https://blog.mozilla.org/en/mozilla/mozilla-anonym-raising-t...

Owned by Mozilla, run by ex-Facebook employees. I'm sure it's entirely coincidental that this W3C draft was written by Mozilla and Facebook employees.

red_admiral · 14h ago
I just want someone to explain how I can edit my own privacy preserving attribution database. Is it a local SQLite database or something?

I feel like storing my "preferences" locally without letting me edit them is a stupid move.

jeroenhd · 14h ago
Google's design stores the tracking data locally. Chrome already has a UI to manage topics of interest (chrome://settings/adPrivacy).
worik · 3h ago
> "privacy-preserving" tracking

Wow.

sedatk · 19h ago
If third-party cookies are removed, the tracking parties will just ask web sites to include the script on their web server, so their cookies become "first party" again. I don't understand how this helps the web unless protections against tracking itself, not the methods used, are established.
throwawayqqq11 · 5h ago
Embedding 3rd-party JS as first-party is a security nightmare, or a wet dream for malvertising, and it's already here, e.g. via extra DNS records lifting ad servers into the first party.

https://www.freepatentsonline.com/8990330.html

The next step will be to restrict JS. I'm for it.

fastest963 · 27m ago
Including the script on the publisher's site doesn't allow them to track you from site A to site B which is what third-party cookies let them do.
Dwedit · 19h ago
It's about trust: the third-party ad companies don't trust that the first party will be honest with them and not generate fake impressions or clicks.
thayne · 17h ago
There are also trust issues the other way. I've seen a lot of contention between developers and security teams and marketing about putting third party code or proxying third party domains on the first party site for analytics, tracking, ad attribution, etc.
lolinder · 9h ago
There's all kinds of cryptography available for solving trust problems. I guarantee you that within six months of third party cookies being removed someone will have built an impression signing system that is satisfactory to both the ad companies and the server owners.
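As a sketch of what such a scheme might look like (not an existing system): the publisher's server signs every impression it renders, and the ad company verifies before billing. Note this proves which server minted an impression, not that a human saw the ad, so fraud detection would still sit on top:

  import { generateKeyPairSync, randomUUID, sign, verify } from "node:crypto";

  // The publisher registers its public key with the ad company out of band.
  const { publicKey, privateKey } = generateKeyPairSync("ed25519");

  // Publisher side, once per ad rendered: a unique, signed impression record.
  function mintImpression(adId: string) {
    const record = JSON.stringify({ adId, id: randomUUID(), ts: Date.now() });
    const signature = sign(null, Buffer.from(record), privateKey);
    return { record, signature };
  }

  // Ad company side: count only records that verify (and dedupe on `id`).
  function isBillable(record: string, signature: Buffer): boolean {
    return verify(null, Buffer.from(record), publicKey, signature);
  }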
sedatk · 18h ago
I doubt that. Their script could just as well be "fetch that script from that URL and run it". They would have fraud detection already in place on their side regardless of which script runs on the client.
chii · 16h ago
> "fetch that script from that URL and run it"

But if you cannot have a third-party cookie, the tracker's remote site cannot be sure that the script was actually downloaded, let alone executed.

kevin_thibedeau · 9h ago
Generate dynamic, short lifetime URLs that are locked to the client IP.
littlecranky67 · 15h ago
Sure you can, if their script makes a 3rd-party XHR request to that tracker.
chii · 15h ago
But this request could be faked if the first party wanted to fake the traffic (for example, to make ad revenue). The third-party cookie is what prevents this faking at the moment.
blacksmith_tb · 19h ago
That's old hat; the future is server-to-server calls from sites to vendors: profile the client but don't try to run any tracking JS on it.
kstrauser · 18h ago
That's vastly more expensive, though. Now you have to run extra servers to make outbound connections to the ad tracker's API server instead of offloading all the work onto visitors. It would be enough to significantly affect the ad market.
SoftTalker · 8h ago
You also get to do it on your fast cloud backend infrastructure instead of the end user's home computer and ISP. They will appreciate the increase in page load speed and overall responsiveness, and as a bonus they can't use ad blockers or hosts-file tricks anymore.
Griffinsauce · 16h ago
Oh no!
ars · 16h ago
I don't think it's that expensive to do. All it takes is one well-written package that is easy to install and this will become standard.

I could even see a data broker centralizing this and distributing tracking to all of their clients. The client would just need to communicate with the central broker, which is not hard at all.

kstrauser · 8h ago
As long as your scale is tiny, sure. At a point you'd need to turn that into an async task queue etc etc.

BTW, I see this as a feature, not a bug. I'm glad it would be harder and more expensive to violate my privacy.

secondcoming · 9h ago
This setup already exists, they're called Supply Side Platforms.
sedatk · 18h ago
That's also quite the possibility, and supports my point.
fiddlerwoaroof · 17h ago
I think many adtech companies (at least in affiliate marketing) use redirects because third party cookies are unreliable and redirects make all the cookies first party. As mentioned elsewhere, they’ve also been switching to proxies and other such techniques to make it even harder to block their tracking endpoints.
coffeefirst · 12h ago
This doesn’t actually help. If you consider Prebid, Criteo already has js running on the site serving the ads, but that js has no mechanism to figure out whether the user has something in their cart and is eligible for retargeting.

The workaround is looking more and more like IP, fingerprinting, and AI. I’d argue this is worse than 3p cookies, which were at least dumb and easy to clear.

parrit · 17h ago
Proxies for analytics are already a thing; e.g. Plausible shows you how to set one up. A 3rd-party cookie, however, can be the same value sent again and again from the same browser from different sites to the central server tracking you across the web. The global "who you are" is in the cookie.
rajnathani · 8h ago
This should be the top-most comment.
timewizard · 17h ago
> include the script on their web server, so their cookies become "first party" again.

That script would execute with the origin of the server. Its access to resources and /shared state/ would be hampered by this. So as a cross-site tracking strategy I don't think this works.

> I don't understand how this helps the web unless protections against tracking itself, not the methods used, are established.

Which is why I think state partitioning[0] and CHIPS[1] are good technologies. They allow previously existing standards, like cookies, to continue to exist and function mostly as expected, but provide the user a good amount of default security against cross-site trackers and other malware.
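For reference, a site opts a cookie into CHIPS with the `Partitioned` attribute; the browser then double-keys it by top-level site, so the same widget embedded on two different sites sees two unrelated cookie jars:

  Set-Cookie: __Host-session=abc123; Secure; Path=/; SameSite=None; Partitioned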

[0]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...

[1]: https://developer.mozilla.org/en-US/docs/Web/Privacy/Guides/...

littlecranky67 · 15h ago
Your point is pretty useless, as you assume the web server admins want to be more secure. The opposite is the case: usually they deliberately open up their security model to accommodate 3rd-party tracking scripts. For example, Content-Security-Policy headers can effectively prevent all sorts of XSS attacks, but they will also prevent 3rd-party tracking scripts etc.
timewizard · 14h ago
You've misunderstood my point. It's not what the server admins want; it's what the security policy will allow. If two sites, on two different domains, both use the same script, served directly from their domains, it creates absolutely no workaround for third-party cookies. This is because the two sites have different origins. CSP does not create a bypass in this case.
freeamz · 20h ago
Feels like this whole cookie thing is just whitewash, when, if you enable JS, they can track you whether you have cookies or not!

Nothing is private: https://nothingprivate.gkr.pw

More effort ought to be put into making the web specs unable to track the user even if JS is turned on.

Browser vendors like Brave and Firefox, supposedly privacy browsers, are NOT doing anything about it.

At this point, do we need to use a JS-disabled browser to really get privacy on the web?

littlecranky67 · 15h ago
Any other tracking methods are way more obvious, and way harder to implement for the advertising industry. We shouldn't think in black/white here - the more difficult it is to track a user, the less likely it is implemented. It is okay if 30% of tracking sites disappear because the cost/value ratio doesn't work for them. We don't have to sit in silence and do nothing just because we can't have 100% privacy.
matthewdgreen · 11h ago
I do think there is a point here: any technical means to block tracking is going to be overrun by technical means to overcome the anti-tracking tech. There are simply too many dollars at stake for anything else to happen. If anti-tracking stops some players, that just means the industry will consolidate into a few large and well-resourced players.

While I am all in favor of continuing the technical battle against tracking, it’s time to recognize that the war will only be won with legislation.

idle_zealot · 19h ago
> At this point, do we need to use a JS-disabled browser to really get privacy on the web?

My thoughts are that we need a distinction between web pages (no JS), which are minimally interactive documents that are safe to view, and web apps (sites as they exist now), which require considerable trust to allow on your device. Of course, looking at the average person's installed app list indicates that we have a long way to go culturally with regards to establishing a good sense of digital hygiene, even for native software.

wtallis · 17h ago
It doesn't help that web browsers aren't even trying to help users make the distinction. They have an ever-growing list of features and permissions that sites can take advantage of, with no attempt to coalesce anything into a manageable user interface. Instead, it takes a hundred clicks to fully trust or distrust a site/app.
freeamz · 13h ago
More UI/UX distinction is needed - just like the green lock for security! The browser should indicate the level of privacy of the page. If the page uses no JS or anything GPU-compromising (CSS, I'm looking at you), then it gets a green badge. For every privacy/security-compromising feature you add, it turns yellow. Once it starts to ask for WebUSB or MIDI, then it should be in some kind of Native Mode. More of a UI/UX issue for the major browser makers!
iggldiggl · 8h ago
The problem is that there is a lot of grey area between pure document-style pages and full-on apps (take online shops for example) and even for the former category of pages a lot of UI niceties are only possible with scripting.
GCUMstlyHarmls · 19h ago
https://nothingprivate.gkr.pw seems to (not) work fine in Firefox... I am running uBlock Origin though, no other special things.
Diti · 15h ago
Same here, it’s not just you. Judging by the other comments, it only seems to “work” on Blink-based browsers.
red_trumpet · 9h ago
Same, they were "fooled" by a private window. I was recognized when just using a different Multi-Account Container[1] though.

[1] https://addons.mozilla.org/en-US/firefox/addon/multi-account...

Kovah · 15h ago
Also not working on Brave, without uBlock or similar extensions. Brave says it blocked one request, probably the one for fingerprinting.
karl-j · 14h ago
The site also fails to track on mobile Safari with "Prevent Cross-Site Tracking" turned on.
brookst · 19h ago
It's an interesting question: is it possible for JavaScript to be Turing-complete, able to read/write the DOM, and somehow prevent fingerprinting / tracking?

My gut says no, not possible.

Maybe we need a much lighter way to express logic for UI interactions. Declarative is nice, so maybe CSS grows?

But I don’t see how executing server-controlled JS could ever protect privacy.

Enginerrrd · 19h ago
I've always thought there should be a way to use the browser like a condom. It should obfuscate all the things that make a user uniquely identifiable. Mouse movement/clicks/typing cadence should be randomized and sanitized a bit. And no website should have any authority whatsoever to identify your extensions or other tabs, or even whether or not your tab is open. And it certainly shouldn't allow a website to overrule your right click functionality, or zoom, or other accessibility features.
JSteph22 · 18h ago
The obfuscation makes you more easily identifiable.
victorbjorklund · 9h ago
I think their idea was that it would be in the browser everyone uses.
Enginerrrd · 4h ago
Exactly. My thought was this should be the default configuration in the browser.
teo_zero · 17h ago
How so?
codyvoda · 17h ago
Eldo Kim

you stand out when you obviously hide

chii · 16h ago
Only if you are the only one doing the obfuscation.

It's why the Tor browser is set to a specific window dimension (in terms of pixel size), has the same set of available fonts, etc.

klabb3 · 12h ago
And yet you still stand out if you use tor.
chii · 12h ago
Yes, and it's because not enough people use the Tor browser (I mean the browser, not the network).

But if privacy is truly the desired goal, the regular browser ought to behave just like the Tor browser.

freeamz · 5h ago
Tor Browser's safe mode. That is one of the few ways to defeat that fingerprinting thing.
febusravenga · 14h ago
Yes, it is.

Just create a _strict_ content security profile which doesn't allow any external requests (fetch) and only allows loading resources (CSS, images, whatever) from a predefined manifest.

The app cannot exfiltrate any data in that case.

You may add permission mechanisms of course (local disk, some cloud user controls, etc.).

That's a big challenge in standards, and I'm not sure anyone is working on such a strongly restricted profile for web/JS.
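CSP can already get part of the way there today. A strict policy like the following (a sketch, not a complete no-exfiltration profile) blocks fetch/XHR/WebSocket and restricts all resource loads to the page's own origin; navigation remains a side channel, which is part of why this would need new standards work:

  Content-Security-Policy: default-src 'none'; script-src 'self'; style-src 'self'; img-src 'self'; connect-src 'none'; form-action 'none'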

chongli · 19h ago
> It's an interesting question: is it possible for JavaScript to be Turing-complete, able to read/write the DOM, and somehow prevent fingerprinting / tracking?

Yes, of course: restrict its network access. If JS can't phone home, it can't track you. This obviously lets you continue to write apps that play in a DOM sandbox (such as games) without network access.

You could also have an API whereby users can allow the JS application to connect to a server of the user's choosing. If that API works similarly to an open/save dialog (controlled entirely by the browser) then the app developer has no control over which servers the user connects to, thus cannot track the user unless they deliberately choose to connect to the developer's server.
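No such API exists today, but a purely hypothetical sketch, modeled on the browser-controlled file pickers, might look like this; the page never learns which server the user picked:

  // Entirely hypothetical API, analogous to showOpenFilePicker(): the
  // browser shows a picker of servers the *user* has configured.
  declare function requestServerConnection(): Promise<{
    send(data: string): Promise<void>;
    receive(): Promise<string>;
  }>;

  async function saveGame(state: object) {
    const conn = await requestServerConnection(); // browser-controlled dialog
    await conn.send(JSON.stringify(state)); // the app can't choose the host
  }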

This is of course how desktop apps worked back in the day. An FTP client couldn't track you. You could connect to whatever FTP server you wanted to. Only the server you chose to connect to has any ability to log your activity.

adrr · 16h ago
There's no point. If you disable JS, they can track you in other ways: fingerprinting your DNS packets via timestamp clock skew and other things. With IPv6 they can assign you a unique IP address per DNS lookup that can function like a cookie.

Don't want to be tracked? Don't go on the internet.

HumanOstrich · 15h ago
Websites can't fingerprint my DNS packets by their clock skew, nor can they assign me a unique IP address for a DNS lookup (what?). "Don't go on the internet" isn't a great starting point to improve things.
adrr · 5h ago
We used to fingerprint TCP packets when I built a large neobank. We could easily tell if you were behind a proxy, falsifying your user agent, and more via SYN numbers. We used it to detect bots, but it could easily be used to fingerprint individual users. The DNS trick is already used for DNS-based CDNs; you can just keep refining it down to more specificity: a CDN edge for each individual user.
waynesonfire · 18h ago
Why does it have to be a technological solution? That's what the media industry tried to do with DRM and it failed. The solution is legislation. We need the equivalent of the DMCA for our privacy. Make it illegal to fingerprint.
chongli · 12h ago
I’m completely unsold on legislation. Another headline that recently hit the top of HN is about how Apple flagrantly ignored a court order. The judge has recommended the case for criminal contempt prosecution [1].

The comments on the story are completely unconvinced that anyone at Apple will ever be convicted. Any fines for the company are almost guaranteed to be a slap on the wrist since they stand to lose more money by complying with the law.

I think the same could be said about anti-cookie/anti-tracking legislation. This is an industry with trillions of dollars at stake. Who is going to levy the trillions of dollars in fines to rein it in? No one.

With a technological solution at least users stand a chance. A 3rd party browser like Ladybird could implement it. Or even a browser extension with the right APIs. Technology empowers users. Legislation is the tool of those already in power.

[1] https://news.ycombinator.com/item?id=43856795

chii · 16h ago
> The solution is legislation. We need the equivalent of the DMCA for our privacy

And how does one know their privacy has been invaded? How does the user know to enforce the DMCA-for-privacy law?

I think the solution has to be technological. Just like encryption, we need some sort of standard to ensure all browsers are identical and unidentifiable (unless the user _chooses_ to be identified - like logging in). Tor-browser is on the right track.

jenadine · 17h ago
That'd be the GDPR
cluckindan · 17h ago
Which is only applicable in the EU
6510 · 18h ago
I don't know what it is called, but if you try to open a window from a timeout it won't work. The user has to click on something; the click event then grants the permission.

You could make something similar where fingerprint-worthy information can't be POSTed or used to build a URL. For example, you read the screen size, then add it to an array. The array is "poisoned" and can't be posted anymore. If you use the screen size for anything, those things and everything affected may stay readable but are poisoned too. New fingerprinting methods can be added as they are found. Complex calculations and downloads might temporarily make time into a sensitive value too.

degamad · 17h ago
In the old days, something similar to what you're calling "poisoned" was called "tainted" [0].

In those scenarios, tainted variables were ones which were read from untrusted sources, so could cause unexpected behaviour if made part of SQL strings, shell commands, or used to assemble html pages for users. Taint checking was a way of preventing potentially dangerous variables being sent to vulnerable places.

In your scenario, poisoned variables function similarly, but with "untrusted" and "vulnerable" being replaced with "secret" and "public" respectively. Variables read from privacy-compromising sources (e.g. screen size) become poisoned, and poisoned values can't be written to public locations like URLs.

There's still some potential to leak information without using the poisoned variables directly, based on conditional behaviour - some variation on

    if poisoned_screenwidth < poisoned_screenheight then load(mobile_css) else load(desktop_css)
is sufficient to leak some info about poisoned variables, without specifically building URLs with the information included.
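A userland sketch of the idea (illustrative only - real enforcement would have to live in the engine, since a page could simply unwrap the value): sensitive reads return a wrapped value, derived values stay wrapped, and sinks like URL building refuse them:

  class Poisoned<T> {
    constructor(readonly value: T) {}
    // Any derived value stays poisoned.
    map<U>(f: (v: T) => U): Poisoned<U> { return new Poisoned(f(this.value)); }
  }

  // Privacy-sensitive source: reading it yields a poisoned value.
  const screenWidth = new Poisoned(window.screen.width);

  // Network-visible sink: refuses anything poisoned, directly or derived.
  function buildUrl(base: string, param: string | Poisoned<unknown>): string {
    if (param instanceof Poisoned) throw new Error("poisoned value in URL");
    return `${base}?q=${encodeURIComponent(param)}`;
  }

  const doubled = screenWidth.map((w) => w * 2); // still poisoned
  // buildUrl("https://tracker.example", doubled); // would throw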

[0] https://en.wikipedia.org/wiki/Taint_checking

gkbrk · 14h ago
Doesn't work on Brave. It says to check it in private mode, but when I switch to private mode it just asks for my name again.
IMTDb · 9h ago
For me it had the opposite effect of what was intended:

I opened the website in a non-anonymous Safari session: it asked my name. Then I opened another new non-anonymous window in the same browser: it showed my name as expected. I then opened the same browser in incognito mode: it asked my name again. I then opened Chrome (non-anonymous) and again it asked my name.

Exactly what I expected to see; everything seems to be working as intended. Anonymization online seems to be working perfectly fine.

FridgeSeal · 14h ago
Also doesn’t work on iOS (for me).
deadbolt · 19h ago
Just tried this with Brave and it didn't seem to work, assuming the site working means that it can remember me in an incognito browser. I gave the site a name, and then opened it in incognito (still using brave), and it acts as if I visited the site for the first time.

What am I supposed to witness?

cptskippy · 19h ago
It didn't work on Firefox mobile either... Why are all these browser companies breaking the web!
emsign · 19h ago
Web Browsers Must Be Removed

They run arbitrary code from sketchy servers called "websites" on people's hardware with way too many privileges. Meanwhile, free and open source standalone web applications exist that use only minimal JS code to access the same web resources, with a much better user experience - without trackers, without ads and third parties.

Kiro · 18h ago
I want a browser to be able to run arbitrary code. That's the whole point. I want to play a game or use a complex application in the browser without having to install anything.
afavour · 18h ago
It won’t happen because people don’t care enough.

I don't mean to sound glib. But people derive a ton of utility from the web as it stands today. If they were asked if they supported the removal of web browsers they would absolutely say no. To them, the gains are worth the privacy costs. If you want change you have to tackle that perception.

hi_hi · 15h ago
I think this is a bit overblown. Brave and Safari were both private when I just tested. Chrome not so much, but that's expected.
antihipocrat · 18h ago
Unmodified server request headers contain enough information for tracking even if JS is disabled. If you're keen to modify http headers while browsing, then you could also modify any JS run on your system that snoops system information (or strip the info from any request sent to the server) and continue with JS enabled.
myHNAccount123 · 19h ago
Works as advertised on Edge but not on Safari
xiaomai · 4h ago
hmm, this didn't recognize me in a private window in either firefox or brave.
kstrauser · 18h ago
I can't get that site to work on Safari on my Mac, with JS enabled.
sensanaty · 13h ago
The more egregious and frankly disgusting one is https://fingerprint.com

IMO this service should straight up be made illegal. I love the tagline they have of supposedly "stopping fraud" or "bots", when it's obvious it's just privacy invasive BS that straight up shouldn't exist, least of all as an actual company with customers.

alkonaut · 15h ago
I have almost no hope that this is a matter that has a technical solution. The GDPR shows that law - even if not global, and even if not widely enforced - is pretty good at getting people to act. And most importantly, it will make the largest players the most afraid, as they have the most to lose. And if just a handful of the largest players online are looking after people's privacy, then that is a huge win for privacy.

Doing what this demo shows is clearly a violation of the GDPR, if it works the way I assume it does (via fingerprints stored server-side).

matheusmoreira · 19h ago
They can track you just fine via CSS and countless other ways. They'll even fingerprint the subtle intricacies of your network stack.

What we need to do is turn the hoarding of personal information into a literal crime. They should be scrambling to forget all about us the second our business with them is concluded, not compiling dossiers on us as though they were clandestine intelligence agencies.

hobs · 19h ago
I by default block JS on the web and only allow it for domains I accept. It's a tiny bit of work for a whole lot of safety.
switch007 · 8h ago
I've tried this recently and I found it very difficult. Cloudflare bot protection is everywhere, other anti-scrape protections, many 'document' sites using JS to render with no fallback, basic forms requiring JS, authentication requiring JS, payments requiring JS etc

Not intending to sound snarky but do you just not use the web much? Or if you're adding allows all the time, what's the net gain?

hobs · 1h ago
I use the web fairly constantly and yeah, if I am visiting a new site and I want to see the content, there's a 50/50 chance I have to press a button in NoScript (like 2-3 clicks) - but when you set up your initial set (usually takes me about a week) you'd be surprised how few net-new properties you add in a week - maybe 100 or less?

I also set temporary permissions for any site I don't think I will be spending a lot of time on, because they might change what's running and I don't have any trust or insight into their process - so I might authorize that site 3-4x a year sometimes before I say it can stay.

jeroenhd · 14h ago
Google won't implement this spec. Currently, they're legally not allowed to, because advertisers called in the industry watchdog, asserting that without third party cookies to stalk users, they could not compete. Google extended their privacy sandbox, opened and closed it, talked about it, and eventually backed down from their plan to block third party cookies ASAP.

Maybe Chrome can get away with "the spec says it, sorry advertisers" but I doubt the courts will accept that.

nine_k · 14h ago
That is, Firefox can reject third-party cookies because it's not made by a company that deals in online advertising, but Chrome cannot, because Google is the biggest online ads dealer and thus would have an unfair advantage over other ads dealers, correct?
j16sdiz · 18h ago
> Some of the use cases that are important enough to justify the creation of purpose-specific solutions include federated identity, authorizing access to cross-site resources, and fraud mitigation.

Unpopular opinion: there is no privacy-preserving way to do "fraud mitigation".

Either you accept fraud as a cost of running a business, or you do away with privacy. Most business owners don't want the fraudulent user to come back, ever. If we value the privacy of users, we need to harm some businesses.

omeid2 · 18h ago
In theory it is possible via "blind attestations" by a 3rd party. In an indirect way, that is what you get with Cloudflare: they monitor traffic from an "agent" using their own heuristics for identity, without sharing that identity with you.
RainyDayTmrw · 17h ago
This is kinda hollow while Google controls Chrome, and Chrome has majority market share[1]. And, if regulators get their way, and Google divests Chrome[2], I'm not expecting that the new highest bidder would do any better with it.

[1] The exact figure may depend on which source you use, and there is some indication that ad and tracker blocking may artificially deflate Firefox and friends. https://gs.statcounter.com/browser-market-share

[2] https://www.wired.com/story/the-doj-still-wants-google-to-di...

JoshTriplett · 13h ago
As long as the new steward of Chrome is not an advertising company, they will no longer be restricted from removing third-party cookies.
codeqihan · 20h ago
I have always blocked third-party cookies. The only problem I've encountered (there may be others, but I haven't come across them) is that some embedded videos on certain web pages won't play and prompt me to enable cookies.
xnx · 20h ago
Careful what you wish for. Removing third party cookies without a replacement will make aggressive fingerprinting ubiquitous.
Springtime · 19h ago
I've always assumed fingerprinting was already ubiquitous. I look at the absolute absurdity of tracking/fingerprinting permission dialogs on sites, stating up front their data sharing with 'trusted partners' in the hundreds (thingiverse.com with over 900, theverge.com on mobile with over 800), and find it more surprising that the default state of all clients isn't to block everything.

Edit: for clarity, I believe anything with the ability to analyze the user environment via JavaScript/etc. on major sites is likely fingerprinting regardless. Blocking, environment isolation and spoofing are already necessary to mitigate this.

deadbolt · 19h ago
Do you believe that while third party cookies exist, tracking companies aren't using other fingerprinting methods?
Macha · 5h ago
There's an entire sub-industry of companies doing cross device targeting and attribution.

Guess how they're doing it. It's not cookies. It's also why the GDPR is not a "cookie law" and accepting the prompts but blocking cookies is not really a substitute.

xenator · 20h ago
I have a feeling that it is all related. When you see a request to accept cookies with a list of over 9000 trackers, it doesn't mean that the page will have zillions of JavaScripts included. It just means that the site owners fingerprint the user and pass user interactions to third parties server-side.

The only reason we see this movement is that advertisers feel confident about removing third-party cookies.

bennettnate5 · 19h ago
...thus raising the bar for privacy-preserving techniques in client-side browsing. Aggressive fingerprinting arrived years ago; if we can move beyond cookies altogether and focus on it as the next issue to tackle, I would think that's a net win. Saying that we should keep 3rd-party cookies alive and healthy because it will keep websites using them against users rather than fingerprinting is just throwing the majority of users, who don't know to block them, under the bus. Plus it still leaves the door open for even privacy-conscious users to be defeated by fingerprinting anyway if a server is keen on tracking particular individuals.
Terr_ · 19h ago
Yeah, the only way third-party cookies will block creepier fingerprinting crap is if the creepy stuff is prohibitively more expensive.

But once anyone gets a creepy fingerprinting system working, the barriers drop, and it becomes cheaper to resell the capability as a library or service.

It may offer some minor benefits in terms of enabling companies that "want to be more ethical than the competition", but that too seems like a long-shot. :p

xnx · 10h ago
Fingerprinting-defeating technology is just the kind of thing I wish Firefox spent its effort developing, instead of reimplementing features from Chrome like tab groups.
Funes- · 8h ago
Adopting JavaScript universally, and consequently making it a de facto standard, was a terrible mistake. I think the only way out of this nightmarish, privacy-less state of things (in this regard) might be something like the EU putting out an extremely severe law banning all these bad practices. Banning all practices that undermine privacy is the only morally valid option; it's not only about 3rd-party cookies. It'd be an incontrovertible measure for everyone but bad actors, just like USB-C or user-removable batteries (February 2027).
oliwarner · 11h ago
Sure, but this neither makes an attempt to list the valid uses of third-party cookies, nor suggests what magical definitely-not-a-third-party-cookie unicorn is going to ride in and offer us the safety we need. Pretty fluffy through and through.

I suggest that we do just need to keep third-party cookies, but make them explicitly opt-in. That could just be allowing (once) a third party to be present everywhere (like an SSO) and browsers making it known when a third party is accessing data.

johnmiroki · 20h ago
Replacement solutions must be provided before it's mandatory to remove third party cookies. Otherwise, it's doomed to fail.
recursive · 20h ago
Replacement for what use case? The whole point is to eliminate the behavior, not provide another feature that has the same problems. What does failure mean? It's a problem for ad networks, not for regular humans.
svieira · 20h ago
The use case of not having to log in to system A which is being embedded within system B because you already logged in to system A? Without needing to introduce a third party SSO C? That's pretty "regular human", even if it's "medium sized corporation" instead of "Joe Regular" (but even Joe likes it if he doesn't have to log into the comment box on every site that uses THE_COMMENT_SYSTEM_HE_LIKES.)
koolba · 20h ago
This exists already. You can have cookies at a higher level of the same domain. So foo.example.com and bar.example.com can share cookies at example.com. You can also use CORS to interact with a truly third-party site. None of these require third-party cookies.
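Concretely, a response from foo.example.com can set a parent-domain cookie (assuming example.com is not on the public suffix list), and bar.example.com will then receive it on subsequent requests, with no third-party cookie involved:

  Set-Cookie: session=abc123; Domain=example.com; Secure; HttpOnly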
nwalters512 · 19h ago
A use case this doesn't address is embedding across two completely different domains, which is pretty common in the education space with LMS platforms like Canvas (https://www.instructure.com/canvas) embedding other tools for things like quizzes, textbooks, or grading. I ended up in a Chrome trial that disabled third-party cookies which broke a lot of these embeds because they can no longer set identity cookies that they rely on from within their iframe.
svieira · 19h ago
As nwalters also points out, this isn't the same at all. System A and System A' both from Source Α are not the same as System A (Source Α) and System B (Source Β).

Which you know, because you say "you can also use CORS to interact with a truly third party site". But now, I invite you to go the rest of the way - what if the third party site isn't Project Gutenberg but `goodreads.com/my-reading-lists`? That is, what if the information that you want to pull into System A from System B should only be available to you and not to anyone on the net?

cuu508 · 17h ago
Use OAuth2 to get system B's access token, then use authenticated server-to-server API requests to pull needed information from system B.
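Something like the standard OAuth2 token exchange, with system A's backend calling system B directly (endpoint URLs and client registration hypothetical; for per-user data the authorization-code grant, consented by the user, would be the right fit rather than client credentials):

  // System A's backend: exchange credentials for a token, then call
  // system B's API server-to-server; nothing flows through the browser.
  async function fetchReadingLists(userId: string): Promise<unknown> {
    const tokenRes = await fetch("https://systemb.example/oauth/token", {
      method: "POST",
      headers: { "Content-Type": "application/x-www-form-urlencoded" },
      body: new URLSearchParams({
        grant_type: "client_credentials",
        client_id: "system-a",
        client_secret: process.env.SYSTEM_B_SECRET ?? "",
      }),
    });
    const { access_token } = await tokenRes.json();

    const res = await fetch(`https://systemb.example/api/users/${userId}/lists`, {
      headers: { Authorization: `Bearer ${access_token}` },
    });
    return res.json();
  }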
svieira · 8h ago
BINGO! The issue here of course is that now instead of _two_ components (Front End A and Embed B) you now have four (the back ends must communicate and if A didn't need a back end ... well, now it does).

Now, if you meant "Use OAuth2 in the browser", that's just the original case (you can't authorize if you can't authenticate and it's the ambient authentication that's being stripped when you eliminate third party cookies).

namdnay · 9h ago
This multiplies the cost of the integration by at least an order of magnitude
jfengel · 20h ago
The use case is web sites that want to earn income with as little user overhead as possible. Targeted ads have many downsides, but they do pay websites without requiring any money at all from the user, or even the creation of an account.

So the problem for regular humans is the disappearance of features that they've grown used to having without paying any money. Finding a better way to support themselves has proven remarkably difficult.

deadbolt · 19h ago
I feel like many people here wouldn't care if those websites simply stopped existing.
jfengel · 9h ago
Certainly a lot of people would care if Facebook disappeared.

There are also a billion other ad-supported web sites, each of which makes ten people happy. Not a single one of them would be widely mourned, but 5 billion people would each be saddened by the loss of one of them.

bittercynic · 18h ago
Many people would, though.

For a long time I thought pinterest was search spam that no human could possibly want to see, but then I met real people in the world who like it and intentionally visit the site. I bet there are people who like ehow and the rest, too.

int_19h · 15h ago
The viability of their business model shouldn't be everyone's problem.
jfengel · 9h ago
It is their problem when a feature that they like disappears.

They don't care about what happens to the business itself. But they do care about the things the business provides.

If they don't in fact care, then indeed, nothing is lost. But a lot of people will miss a lot of things. Whoever comes up with an alternative that suits the case will make a lot of people happy.

etchalon · 17h ago
People made money on advertising before the existence of cookies and ubiquitous tracking. Nature will heal.
JoshTriplett · 13h ago
And people had websites before the existence of Internet advertising. Let's set our expectations higher for how much healing is needed.
petesergeant · 20h ago
The article explicitly calls out that there are valid use cases (although doesn’t enumerate them). Federated sign-on and embedded videos seem like obvious examples
p_ing · 20h ago
Google/Chrome just declared that they won't be moving forward with removing 3rd party cookie support.

https://privacysandbox.com/news/privacy-sandbox-next-steps/

> Taking all of these factors into consideration, we’ve made the decision to maintain our current approach to offering users third-party cookie choice in Chrome, and will not be rolling out a new standalone prompt for third-party cookies.

svieira · 20h ago
Ah, now _that_ explains why this got published then. Glad to see that common sense prevailed. The day may come when all the use cases for third-party cookies that aren't "track Joe Regular all around the web" can be satisfied with other widely available web features, but until we have all those features I think taking a page from Linus' book and ensuring "we don't break userland" is important (and something I've always loved about the web and am glad to see continuing).
somenameforme · 19h ago
Which use cases? I use Brave, which has a built-in toggle to disable 3rd-party cookies, which I have set as the default, and at least my experience of 'the entire internet' works fine.
asddubs · 14h ago
Embedded iframes that need to authenticate logins but don't trust the parent domain to store the login data there are a problem. You can somewhat work around it with the Storage Access API if the browser supports it (Brave doesn't), but it does mean every embed requires a click by the user before it works properly.
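For reference, the flow inside the embedded iframe looks roughly like this; requestStorageAccess() only resolves when called from a user gesture, which is exactly where the required click comes from:

  // Runs inside the third-party iframe (e.g. the login embed).
  async function ensureCookieAccess(): Promise<boolean> {
    if (await document.hasStorageAccess()) return true; // already granted
    try {
      // Only allowed in response to a user gesture, e.g. a click handler.
      await document.requestStorageAccess();
      return true;
    } catch {
      return false; // the user or the browser policy said no
    }
  }

  document.querySelector("#login")?.addEventListener("click", async () => {
    if (await ensureCookieAccess()) {
      // Our origin's cookies are attached to this frame's requests again.
    }
  });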
hedora · 18h ago
Same here, but other browsers. I’ve had zero issues since well before the dot com crash.
Nevermark · 20h ago
"Company whose market cap reflects pervasive non-requested surveillance announces that after serious consideration they won't be removing technologies that enable pervasive non-requested surreptitious surveillance."

It is going to be interesting to see if antitrust enforcement manages to separate Google from its financial and practical hold on web standards/browsers.

The opportunity to increase ethical norms of web browsing would be welcome to me.

pests · 20h ago
Google wants to remove third-party cookies, but they can't, as the government sees it as anticompetitive toward their competition. They don't need third-party cookies; everyone else does.
svieira · 19h ago
Precisely - removing third-party cookies doesn't stop Google from tracking anyone. It just prevents anyone who doesn't own a browser and have one of the three major email providers from tracking everyone.

Well, it doesn't prevent them, but it does make it a little bit harder ...

pests · 19h ago
I personally think this decision hurts users more than anything else. We must let Google's competitors continue tracking us or else it won't be fair to them?

I don't even understand how being forced to divest Chrome will even help. Once another company owns Chrome and can remove third party cookies, Google gets the same benefit.

Nevermark · 18h ago
Google has remarkable financial influence across the four major commercially backed browsers.

So limiting Google's control over browsers will create more competition. More competition on implementations. And also more competition in terms of features and user-centric service.

--

Question: Does Google really not gather information from anything but its search engine and first-party apps? That would seem financially non-optimal for any advertising-funded business.

I would think that, sure, they log everything people use their search for.

But also that they would find a way to track post-search behavior as well. Google leaving money on the table seems ... unusual if there isn't some self-serving reason they would forgo it.

I am happy to become better informed.

nemothekid · 16h ago
There are only 3 effective browsers - Chrome, Safari and Firefox. I don't see how limiting Google's control will create competition. The barrier to more browsers is the massive investment needed to create one, not any action that Google is doing.
VladStanimir · 10h ago
You are correct, although it's more correct to say there are only 3 major browser engines: Blink (used by all Chromium derivatives), WebKit (used by Safari and some minor browsers), and Gecko (used by Firefox and its derivatives). Creating a browser engine is hard - so hard that even a multi-billion-dollar company like Microsoft gave up on doing it. And we may soon witness Gecko going away as a side effect of the Google antitrust lawsuit.
youngtaff · 14h ago
Google could have removed third-party cookies ten years ago, as Safari did…

Their long wait to do it is part of why we ended up in a regulatory mess

svieira · 4h ago
Safari's choice broke portions of the web for Safari users, and is (I believe) part of the reason Chrome kept taking market share from 2015 onwards.
driverdan · 20h ago
We don't need a replacement, they're not needed today. I've been blocking them for years and I can't remember the last time it caused a problem.
jeroenhd · 14h ago
Google has set up a replacement that puts the user in control of their ad interest tracking. It has its upsides and downsides, but I think it's pretty balanced. Anti-tracking features are embedded into the API so the API can't be abused by advertisers.

Of course, ad companies scream bloody murder, and the UK market watchdog had to step in so Google wouldn't turn off third party cookies by default.

hiccuphippo · 20h ago
Do not worry, the ad networks will come up with ways to circumvent it as soon as it becomes mandatory.
tejtm · 20h ago
done. third parties can be replaced with legally culpable first parties.
kgwxd · 20h ago
I've had them turned off since Firefox added the feature. Looks like that was around 2018, though I could have sworn it was much earlier. I've never had an issue where I had to make an exception for a site. Is there still some environment where it's common for them to be needed?
g-b-r · 19h ago
I don't recall a browser that didn't let you disable third-party cookies; given how long ago cookies were introduced, I could have forgotten about it, but I'm at least sure that Mozilla always supported it.

Firefox, especially in its first versions, permitted much less control over cookies than Mozilla did, but I think it still always allowed disabling third-party cookies.

kazinator · 18h ago
> Some features of the web that people have come to expect, and which greatly improve user experience, currently depend on third-party cookies.

Idea: domains should be able to publish a text record in their DNS (similar to SPF records for mail domains) designating other domains that are allowed to peek at their cookies.

Suppose I operate www.example.com. My cookie record could say that foo.com and bar.com may ask for example.com cookies (in addition to example.com, of course). A website from any other domain may not. As the operator of example.com, I can revoke that at any time.

Whenever a page asks for a cookie outside of its domain, the browser will perform a special DNS query for that cookie's domain. If that query fails, or returns data indicating that the page does not have access, then it is denied.
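Something like this, say; the `_cookie-access` label and `v=cookie1` syntax are invented here purely for illustration, SPF-style:

    ; published by example.com: only foo.com and bar.com may read our cookies
    _cookie-access.example.com.  IN  TXT  "v=cookie1 allow=foo.com allow=bar.com"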

fastest963 · 25m ago
This already exists as "Related Website Sets" (formerly First-Party Sets): https://github.com/WICG/first-party-sets
int_19h · 15h ago
But then all the ad-supported websites will whitelist the ad tracking cookies, which is precisely what they are trying to avoid here.
kazinator · 14h ago
Ah, but in so doing they will have to publish their whitelist, which will have to exhaustively list every single affiliated domain.

Browsers and browser extensions will be able to use that info to identify shit sites, turning the whitelist into blacklisting input for uses like ad blocking and whatnot.

One simple mechanism would be for the browser to deny the cookie request if the requested domain's cookie DNS record contains more than, say, three affiliated domains (subject to the browser developer's discretion and user settings; see the sketch below). The proliferation of that sort of config would discourage domains from being overly promiscuous with their tracking cookie access.

Plus, existing cookie control mechanisms don't go away.
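For concreteness, a rough sketch of that browser-side check against the invented `v=cookie1` record format above, using Cloudflare's public DNS-over-HTTPS JSON endpoint as a stand-in for whatever resolver a real browser would use:

    const MAX_AFFILIATED_DOMAINS = 3; // browser default; user-configurable

    // May `requestingDomain` read `cookieDomain`'s cookies?
    async function thirdPartyCookieAllowed(
      cookieDomain: string,
      requestingDomain: string,
    ): Promise<boolean> {
      const url =
        `https://cloudflare-dns.com/dns-query` +
        `?name=_cookie-access.${cookieDomain}&type=TXT`;
      const res = await fetch(url, {
        headers: { accept: "application/dns-json" },
      });
      if (!res.ok) return false; // query failed: deny

      const data = await res.json();
      const record: string | undefined = (data.Answer ?? [])
        .map((a: { data: string }) => a.data.replaceAll('"', ""))
        .find((txt: string) => txt.startsWith("v=cookie1"));
      if (record === undefined) return false; // no record published: deny

      const allowed = record
        .split(/\s+/)
        .filter((field) => field.startsWith("allow="))
        .map((field) => field.slice("allow=".length));

      // An overly promiscuous whitelist gets treated as tracking config.
      if (allowed.length > MAX_AFFILIATED_DOMAINS) return false;

      return allowed.includes(requestingDomain);
    }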

j16sdiz · 18h ago
Not a bad idea, TBH.

I just feel uncomfortable putting more data into DNS. DNS is not encrypted, and DNSSEC is easy to bypass (or breaks often enough that nobody wants to enforce it).

-- but those are not the W3C's problem.

kazinator · 18h ago
Yes; if someone hijacks example.com's main A record, that gets caught at the SSL level.

If someone hijacks example.com's cookie record, that won't be caught; they just write themselves permission to have their page access example.com's cookies.

The same info could just be hosted by example.com (at some /.well-known path or whatever). The web could generate a lot of hits against that.

The DNS records could be (optionally?) signed. You'd need the SSL key of the domain to check the signature.

pabs3 · 17h ago
When you say bypass, do you mean disable DNSSEC on your own computer? Or are there known vulnerabilities in DNSSEC cryptography or software?
tptacek · 5h ago
The stub resolver on your own computer doesn't actually speak DNSSEC. It speaks normal DNS to a recursing resolver, probably at your ISP or at Google, that itself does DNSSEC validation, and then just sets a bit in the response back to you that says "this is authentic".
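You can watch this with `dig`: query a signed zone through a validating resolver and the validation result comes back as the `ad` (authenticated data) flag in the header, which is all a stub-level client ever sees (sample output, trimmed; id and counts will vary):

    $ dig +dnssec www.isc.org A @1.1.1.1

    ;; ->>HEADER<<- opcode: QUERY, status: NOERROR, id: 23817
    ;; flags: qr rd ra ad; QUERY: 1, ANSWER: 2, AUTHORITY: 0, ADDITIONAL: 1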
kazinator · 1h ago
Glibc supposedly supports DNSSEC, but does anyone use it:

https://sourceware.org/glibc/wiki/DNSSEC

tptacek · 1h ago
That page appears to be mostly about how to trust a real recursive cache from a glibc program.
tptacek · 18h ago
DNSSEC isn't encrypted either.
chii · 16h ago
I don't think DNS should be overloaded to carry a security mechanism.
kazinator · 14h ago
It's already used in a similar way for SPF records, in the context of e-mail.

Using an SPF record, a domain indicates the hosts that are allowed to deliver mail on its behalf (meaning, using an envelope sender address from that domain).
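For example, a typical SPF record (real syntax; the names and addresses are placeholders):

    ; only these hosts may send mail claiming to be from example.com
    example.com.  IN  TXT  "v=spf1 ip4:192.0.2.0/24 include:_spf.example.net -all"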

Animats · 19h ago
I haven't allowed third party cookies in a decade. No problem.
kstrauser · 18h ago
I had a little trouble when Safari rolled out ITP a while back. SSO providers scrambled to figure out how to fix federated logins, and because it affected every iPhone, they managed to do it with a quickness. I haven't had a single problem since.
ordu · 13h ago
How about third-party JS? The site doesn't render properly without third-party JS from www.w3.org.
badmonster · 19h ago
Third-party cookies have done more harm than good, and it's time to fully remove them from the web platform. Their acknowledgment that replacements must not just be privacy-washed clones of the old model is refreshing: purpose-built alternatives need to prove they don't recreate the same surveillance infrastructure.
dankwizard · 19h ago
Using a custom-built interception layer, I decouple session tokens from identifiable browser states, rotating my signature footprint every few requests via controlled entropy injection. “No more third-party cookies” sounds like a big shift, but it’s functionally irrelevant if your presence is already undetectable.
aligundogdu · 16h ago
This is actually a somewhat double-edged wish, because the alternative just increases the investment in fingerprinting needed to recognise us across all browsers.
AdmiralAsshat · 20h ago
Fine. All that will happen is we'll see more sites switching to requiring a login to do anything on their website, so that they can track you with first-party cookies, and sell your information that way. Nothing will meaningfully change.

The only distinction is that I can do a decent job of blocking third-party cookies today with my existing solutions like uBlock Origin, but I will probably have a much more difficult time getting around login/paywalls.

recursive · 20h ago
First party cookies can't build a profile on you across multiple origins.
jmb99 · 19h ago
They absolutely can. They have, at minimum, your account information and your IP address. Maybe you use a burner email address and/or phone number, and maybe a VPN, but chances are you’re not cycling your VPN IP constantly so there’s going to be some overlap there. And if you do cycle your IP, 99%+ of users probably aren’t clearing session cookies when doing so, which means you’re now tracked across IP/VPN sessions. Same deal if you ever connect without a VPN - that IP is tracked too. There’s tons of ways to fingerprint without third party cookies, they just make it easier (and also easier to opt out of if they exist, just disable third party cookies; if no one has third party cookies, sites are going to start relying on more intrusive tracking methods).

You can also easily redirect from your site to some third-party tracking site that redirects back to your logged-in page - and fail the login if the user is blocking the tracking domain. The user then has to choose between enabling tracking (by not blocking the tracking domain) and not seeing your website at all. Yes, the site might lose viewers, but if those viewers weren't making the site any money, that might be a valid trade-off if there's no alternative.

Not saying I agree with any of this, btw, I hate ads and tracking with a passion - I run various DNS blocking solutions, have ad blockers everywhere possible, etc. Just stating what I believe these sort of sites would and can do.

hiccuphippo · 20h ago
All they need to do is redirect you through a central hub after login.
quectophoton · 8h ago
On first visit:

* "Please wait while we verify that you're not a bot, for which we'll need to associate a unique identifier with your browsing session." (logged in or not)

* The validation needs to do a quick redirection to an external centralized service, because if they can already identify that you're not a bot, you save CPU cycles, and you care a lot about carbon footprint after all.

* Redirect back to the original website, passing the "proof of not-a-bot" somewhere in the URL. This is just a string.

* The website absolutely needs to load the external script `https://proof-validation.example.com/that-unique-string.js` for totally legit purposes obviously related to detecting bot behavior, "somehow".
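For concreteness, here's a toy sketch of that bounce as two Express handlers; every domain, path, and parameter is hypothetical:

    import express from "express";
    import { randomUUID } from "node:crypto";

    // --- proof-validation.example.com: the central "bot check" hub ---
    const hub = express();
    hub.get("/check", (req, res) => {
      const visitorId = randomUUID();  // the "unique identifier"
      res.cookie("vid", visitorId);    // first-party cookie on the hub's own domain
      // Send the visitor back with the "proof of not-a-bot" in the URL.
      res.redirect(`${String(req.query.return_to)}?proof=${visitorId}`);
    });

    // --- site.example: the original website ---
    const site = express();
    site.get("/", (req, res) => {
      if (!req.query.proof) {
        // First visit: bounce through the hub to get tagged.
        return res.redirect(
          "https://proof-validation.example.com/check?return_to=https://site.example/",
        );
      }
      // The "totally legit" script re-links this visit to the central id.
      res.send(
        `<script src="https://proof-validation.example.com/${req.query.proof}.js"></script>`,
      );
    });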

Half-joking because I don't think this would fly. Or maybe it would, since it's currently trendy to have that PoW on first visit, and users are already used to multiple quick redirections[1] (I don't think they even pay attention to what happens in the URL bar).

But I'm sure we'd get some creative workarounds anyway.

[1]: Easy example: A post on Xitter (original domain) -> Shortened link (different domain) -> Final domain (another different domain). If the person who posted the original link also used a link shortener for tracking clicks, then that's one more redirection.

ear7h · 20h ago
Can't you just work around all of this by proxying to the third party site(s) with a subdomain?
crummy · 20h ago
I think you're right. I imagine if third party cookies were ever banned, we'd quickly see googleads.whatever.com become a common sight.
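That trick already has a name, CNAME cloaking, and it takes exactly one DNS record to make the tracker look first-party (hypothetical names):

    ; the "first-party" subdomain is just an alias for the ad server
    googleads.whatever.com.  IN  CNAME  collector.tracker-cdn.example.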
g-b-r · 19h ago
There's no need for a login to track you with "first-party cookies"; looking at the IP is perfectly adequate, at most adding some fingerprinting if you really want.

The only problem is that the tracking companies then have to place more trust in the first party to give them real data.

But they're actually doing it; see confection.io, for example.

noduerme · 15h ago
I block almost all 3rd party cookies, but at this point isn't it kind of nice to just have your google login follow you around, so you don't constantly have to login on other sites? Sure, it sucks for privacy, which is why your google account should never be tied to your phone number or your actual identity, but it's super convenient. Oh wait. It's tied to your real identity? Go back to square one and start a fake identity with all the root info. Buy a burner with a prepaid card, use it to set up a yahoo mail account, use that to set up a mail server you pay for in bitcoin, use that to verify a gmail account, and never let down your VPN. You're going to be tracked; the right move isn't to waste time worrying about that, it's to be someone invisible and untethered in the real world.
samyar · 7h ago
what about iframes?
anothernewdude · 17h ago
uMatrix blocks those by default. Blocking third-party cookies very rarely breaks anything. I can only think of one instance in the past five years, and that wasn't really a third-party cookie, but one website using two different domains.
jeroenhd · 14h ago
You don't even need uMatrix for that. Every major browser has a toggle for it in its settings.
nurettin · 18h ago
Sounds like a diversion. Websites can use local storage and fingerprinting to do anything they want at this point.
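E.g., a first-party identifier that survives any cookie-only controls; a minimal sketch using the real `localStorage` and `crypto.randomUUID()` APIs:

    // Persist a per-site identifier outside the cookie jar.
    let id = localStorage.getItem("uid");
    if (id === null) {
      id = crypto.randomUUID();
      localStorage.setItem("uid", id);
    }
    // `id` can now be attached to requests just like a tracking cookie would be.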
Svoka · 20h ago
So, the web ad market is being monopolized by the platforms. Google and Facebook make overwhelming revenue from their own websites.

Now, down with the rest.

candiddevmike · 20h ago
Facebook pixel works just fine without third party cookies.
lofaszvanitt · 10h ago
Has anyone noticed the pattern where, with some pulled-out-of-my-arse explanation, these standards groups and Google suddenly remove features that would be useful to people, because they've decided those features are no longer OK? Like HTTP referers now only showing the domain, not the full URL, because [insert complete BS explanation]. And now third-party cookies too...
nolroz · 20h ago
Here we go again!