I have an idea: another way of preventing being tracked is to massively spam trash into the data layer object, pushing thousands of dollars' worth of fake purchase events, randomly generated user details, and other such events. Perhaps by doing this your real data will be hard to filter out. A side effect is that the data becomes unreliable overall, which helps less privacy-aware people in the process.
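A rough sketch of what that could look like, assuming the site exposes a GTM-style `dataLayer` array; the event shape, field names, and volumes here are invented for illustration:

```javascript
// Sketch: flood the GTM dataLayer with plausible-looking junk events.
// A real site's event schema would differ; this is illustrative only.
const dataLayer = (typeof window !== "undefined" && window.dataLayer) || [];

function randomId(len) {
  return Math.random().toString(36).slice(2, 2 + len);
}

function fakePurchase() {
  return {
    event: "purchase",
    transaction_id: randomId(12),
    value: Math.round(Math.random() * 200000) / 100, // $0 - $2000
    currency: "USD",
    user_email: randomId(8) + "@example.com", // randomly generated "user"
  };
}

// Push a burst of garbage events for any listening trackers to ingest.
for (let i = 0; i < 50; i++) {
  dataLayer.push(fakePurchase());
}
```

Whether this actually poisons anything depends on how the site's trackers consume the dataLayer; well-built pipelines may discard events that don't match known schemas.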
chamomeal · 10h ago
Now there’s a fun idea!! I wonder how difficult it would be to spoof events.
Edit: looks like this might exist already: https://addons.mozilla.org/en-US/firefox/addon/adnauseam/
Since installing it on Firefox on this computer (18 months ago or so), Ad Nauseam has clicked ~$38,000 worth of ads that I never saw.
Between this and "track me not" I've been fighting back against ads and against connecting my "profile" with any habits since 2016 or so. I should also note I have pihole and my own DNS server upstream, so that's thirty-eight grand in ad clicks that got through blacklists.
https://www.trackmenot.io/faq
[Preface: I hate ads, I love uBlock origin, I use pihole, I'm a proponent of ad blockers]
I manage a Google Ads account with a $500,000 budget. That budget is spent on a mix of display ads, google search, and youtube ads.
Even if I knew that 10% of our budget was wasted on bot clicks, there's nothing I could do as an advertiser. We can't stop advertising... we want to grow our business, and advertising is how you get your name out there. We also can't stop using Google Ads - where else would we go?
$38,000 in clicks boosts Google's revenue by $38k (Google ain't complaining). The only entity you're hurting are the advertisers using Google. Advertisers might see their campaigns performing less well, but that's not going to stop them from advertising. If anything, they'll increase budgets to counteract the fake bot clicks.
I really don't understand what Ad Nauseam is trying to achieve. It honestly seems like it benefits Google more than it hurts them. It directly hurts advertisers, but not enough that it would stop anyone from advertising.
Google has a system for refunding advertisers for invalid clicks. The $500k account that I manage gets refunded about $50/month in invalid clicks. I'm guessing if bot clicks started making a real dent in advertiser performance, Google would counter that by improving their bot detection so they can refund advertisers in higher volumes. If there's ever an advertiser-led boycott of Google Ads, Google would almost certainly respond by refunding advertisers for bot clicks at much higher rates.
TeMPOraL · 7h ago
> I really don't understand what Ad Nauseam is trying to achieve. It honestly seems like it benefits Google more than it hurts them.
Google is part of the problem, but they're neither the only ones nor the best to target through bottom-up approaches.
> It directly hurts advertisers, but not enough that it would stop anyone from advertising.
You know the saying about XML - if it doesn't solve the problem, you are not using enough of it.
> there's nothing I can do as an advertiser. We can't stop advertising...
We know. The whole thing is a cancer[0], a runaway feedback loop. No single enlightened advertiser can do anything about it unilaterally. Which is why the pressure needs to go up until ~everyone wants change.
--
[0] - https://jacek.zlydach.pl/blog/2019-07-31-ads-as-cancer.html
You know, I'm not too worried that I'm making the lives of people who spy on me harder and wasting their money.
You don't have to buy privacy-violating ads. You don't have to buy targeted ads.
freeone3000 · 8h ago
Hopefully it puts my browsers on a bot blocklist, which then invalidates the tracking profile and eliminates targeted advertising entirely.
michaelt · 7h ago
The problem with being on google's bot blocklist is you'll suddenly discover that recaptcha is used in a heck of a lot of places.
thatguy0900 · 5h ago
My assumption with something as hostile as Ad Nauseam is that you're running the risk of Google profile bans.
ddtaylor · 5h ago
> I'm guessing if bot clicks started making a real dent in advertiser performance, Google would counter that by improving their bot detection so they can refund advertisers in higher volumes.
They already have methods to detect a lot. Like you said yourself, customers have no alternative, so why would they refund money they don't have to?
sneak · 7h ago
> I hate ads
> The only entity you're hurting are the advertisers using Google.
That’s fine. Advertising is cancer. Reducing advertisers’ ROI is good too.
You don’t hate ads if you’re spending $500k on them. You just hate receiving ads, which makes you hypocritical.
mystified5016 · 8h ago
The point is to poison your ad tracking profile so that advertisers can't figure out who you are and what you'll buy.
No matter how secure your browser setup is, Google is tracking you. By filling their trackers with garbage, there's less that can personally identify you as an individual
mediumsmart · 2h ago
Apple bought the patent to do just that 13 years ago … The Mac Observer article about it is now gone - here is the archive record:
https://web.archive.org/web/20200601034723/https://www.macob...
Carter invented it and got paid so they can bury it. Must be good tech.
aziaziazi · 7h ago
> It honestly seems like it benefits Google more than it hurts them. It directly hurts advertisers, but not enough that it would stop anyone from advertising.
GP fights against ads, not Google. And not being able to win 100% of the gain shouldn’t restrain someone from taking action if they consider the win share worth the pain.
> $38,000 in clicks boosts Google's revenue by $38k
You should include costs here, and if (big if) a substantial part of the clicks comes from bots and gets refunded, the associated cost comes on top of the bill. In the end the whole business is impacted. I agree $50 on $500k is pennies, though.
> I hate ads […] I manage a Google Ads account
[no cynicism here, I genuinely wonder] how do you manage your conscience, mood, and daily motivation? Do you see a dichotomy in what you wrote, and if so, how did you arrive at that situation? Any future plans?
I’m asking as you kind of introduce the subject but if you’re not willing to give more details that’s totally fine.
jorvi · 7h ago
> want to grow our business and advertising is how you get your name out there
Or.. you know.. offering a quality product?
econ · 3h ago
Tiny traffic, but everyone is buying things. High praise in the reviews, not a single organic link.
behringer · 4h ago
This is great. I seek out competitors to the companies that advertise so I can get the product without rewarding advertisers.
Manscaped? Nah, generic women's razors. PCBWay? Nope, JLCPCB.
Screw your ads. Find a better way.
Wowfunhappy · 8h ago
I would worry about being labeled a bot and denied access to websites at all.
https://adnauseam.io/
Chrome banned it from their add-on store but it can still be installed manually.
dylan604 · 7h ago
I’d imagine that by this point in time, they are able to filter this specific type of noise out of the dataset. They have been tracking everyone for so long that I doubt there’s anyone they don’t know about, whether directly or via shadow profiles. These randomly generated users would just not match up to anything and would be fine to just drop.
aerzen · 10h ago
Am I dumb, or does this article fail to explain what the tag manager actually does? And not just with a loaded word such as "surveillance" or "spying", but an actual technical explanation of what they are selling it for and why it is bad.
mlinsey · 8h ago
Google Tag Manager is a single place for you to drop in and manage all the tracking snippets you might want to add to your site. When I've worked on B2C sites that run a lot of paid advertising campaigns, the marketing team would frequently ask me to add one tracking pixel or another, usually when we were testing a new ad channel. Want to start running ads on Snapchat? Gotta add the Snapchat tracker to your site to know when users convert. Now doing TikTok? That's another snippet. Sometimes there would be additional business logic for which pages to fire or not fire on, and this would change more often. Sometimes it was so they could use a different analytics tool.
While these were almost always very easy tickets to do, they were just one more interruption for us and a blocker for the stakeholders, who liked to have an extremely rapid iteration cycle themselves.
GTM was a way to make this self-service, instead of the eng team having to keep this updated, and also it was clear to everyone what all the different trackers were.
simonw · 1h ago
The self-service thing is such a nightmare. There are two things that you almost certainly cannot trust your marketing team with:
1. Understanding the security implications of code they add via tag manager. How good are they at auditing the third parties that they introduce to make sure they have rock-solid security? Even worse, do they understand that they need to be very careful not to add JavaScript code that someone emailed to them with a message that says "Important! The CEO says add this code right now!".
2. Understanding the performance overhead of new code. Did they just drop in a tag that loads a full 1MB of JavaScript code before the page becomes responsive? Can they figure that out themselves? Are they positioned to make good decisions on trade-offs between analytics and site performance?
captn3m0 · 34m ago
You effectively delegate code review on an XSS path to your marketing team. I refused to do that anywhere users could be logged in.
bravesoul2 · 41m ago
Yep it's vibe coding before vibe coding existed. Paste in the script. No code review. No staging. No roll-out. Just straight in prod. And it can break stuff.
simonsarris · 6h ago
The chief reason is that websites pay for advertising and want to know if the advertising is working and Google tag manager is the way to do that, for Google Ads.
This is not unreasonable! People spend a lot of money on ads and would like to find out if and when they work. But people act like it's an unspeakable, nebulous crime, when this is probably the most common case by miles.
bravesoul2 · 39m ago
Why should an advertiser have a right to know if their ads work, regardless of privacy considerations? The EU brought out a freaking legal framework around this. I can't take seriously how you've oversimplified it.
throwaway65449 · 6h ago
If running spyware on people's browsers just to see if your ads are working is "not unreasonable", what is?
arcfour · 4h ago
Try responding in good faith on a non-throwaway account.
reaperducer · 1h ago
This is not unreasonable! People spend a lot of money on ads and would like to find out if and when they work.
Companies were doing this for hundreds of years before Google even existed. You can learn if your ads work without invasive tracking.
a2800276 · 9h ago
I was tasked with auditing third-party scripts at a client a couple of years ago. The marketing people were unable to explain wtf tag manager does concretely without resorting to "it tracks campaign engagement" mumbo jumbo, but were adamant that they can't live without it.
sitharus · 6h ago
XSS-as-a-service. It lets people drop in random JavaScript to be injected on to the page without any oversight.
It’s used by marketing people to add the 1001 trackers they love to use.
sandspar · 8h ago
Google Tag Manager lets you add tracking stuff to your website without needing to touch the code every time, so you can track things like link clicks, PDF downloads, or people adding stuff to their cart.
It doesn't track things by itself. It just links your data to other tools like Google Analytics or Facebook Pixel to do the tracking.
This kind of data lets businesses do stuff like send coupon emails to people who left something in their cart.
There are lots of other uses. Basically, any time you want to add code or track behavior without dealing with a developer.
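For a sense of what the site side of this looks like: the page pushes a structured event into the dataLayer, and GTM (if present) relays it to whatever tools are wired up. The field names below follow Google's recommended GA4 ecommerce schema; the product itself is invented. The `globalThis` shim is only so the browser-style snippet runs outside a browser:

```javascript
// Shim so this browser-style snippet also runs under Node.
const window = globalThis;

// The site signals an "add to cart" event; GTM would forward it to
// Google Analytics, Facebook Pixel, etc., per the container's config.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  event: "add_to_cart",
  ecommerce: {
    currency: "USD",
    value: 19.99,
    items: [{ item_id: "SKU_12345", item_name: "Example Widget", quantity: 1 }],
  },
});
```

GTM itself just watches this array; which tools receive the event is decided entirely in the container configuration, not in the page code.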
fguerraz · 10h ago
Maybe you’re being misled by the cryptic name. It’s got nothing to do with managing tags, it’s a behaviour tracker and fingerprint machine.
9dev · 9h ago
I mean technically you can use it to manage HTML tags to inject into a site.
snowwrestler · 9h ago
This is in fact what it is primarily used for.
slow_typist · 9h ago
Well I can inject HTML tags (or elements) with native JavaScript. Or manage them. Why would I want a bloated third party piece of software doing that?
connicpu · 8h ago
So that your sales and marketing team can add the third-party tracker for a new ad campaign service without bothering the engineering team.
bravesoul2 · 33m ago
They can also add features! Yes have fun!
SquareWheel · 8h ago
Since you're asking, you could use it to tie together triggers and actions to embed code in specific situations (eg. based on the URL or page state). It has automatic versioning. There's a preview feature for testing code changes before deploying, and a permission system for sharing view/edit access with others.
xiande04 · 9h ago
There's a section in the article titled, "WHAT DOES GOOGLE TAG MANAGER DO?":
> Whilst Google would love the general public to believe that Tag Manager covers a wide range of general purpose duties, it's almost exclusively used for one thing: surveillance.
munchler · 9h ago
That’s a single word, not much of an actual explanation.
Finnucane · 8h ago
the "general public" probably has no idea that Tag Manager is a thing that exists.
paradox460 · 2h ago
Years ago, I worked on a site where we constantly had requests from the non-technical side of the company to make the site load faster. We were perplexed in engineering: the site loaded and was ready for us in a fraction of a second.
Eventually we realized that every dev ran uBO, and tried loading the site without it. It took about 5 seconds. Marketing and other parts of the company had loaded so much crap into GTM that it just bogged everything down.
gleenn · 11h ago
I'm all for blocking surveillance but how tiring is it to block JavaScript as suggested and then watch the majority of the internet not work?
pluc · 11h ago
It really isn't. I've been blocking all JavaScript for years now, selectively allowing what is essential for sites to run or using a private session to allow more/investigate/discover. Most sites work fine without their 30 JS sources, just allowing what is hosted on their own domain. It takes a little effort, but it's a fair price to pay to have a sane Internet.
The thing is - with everything - it's never easy to have strong principles. If it were, everyone would do it.
roywiggins · 10h ago
It's certainly not that bad if you have uMatrix to do it with, but I haven't found a reasonable way to do it on mobile. uMatrix does work on Firefox Mobile but the UI is only semi functional.
1vuio0pswjnm7 · 8h ago
uMatrix is fully-functional on Nightly.
Using Firefox Add-Ons on a "smartphone" sucks because one has to access every Add-On interface via an Extensions menu.
In that sense _all_ Add-Ons are only semi-functional.
I use multiple layers: uMatrix + NetGuard + Nebulo "DNS Rules", at the least. Thus I have at least three opportunities where I can block lookups for and requests to Google domains.
DavideNL · 4h ago
Doesn’t uBlock Origin in advanced mode do the exact same thing as uMatrix?
pmontra · 1h ago
Maybe, but the UX is so terrible that I never figured out how to use uBO to replace uMatrix. I always use both: uBO for ads and DOM elements filtering and uMatrix for JavaScript, frames, cookies, anything in the columns of its UI.
Basically uMatrix is so easy to use that anybody can use it. The equivalent uBO section is so complicated that I feel I need a master's degree in the subject.
Having tried both, IMHO they do not do exactly the same thing. One is pattern-based, the other is host-based. As such, one can use them together, simultaneously.
No comments yet
bornfreddy · 9h ago
Not quite the same (I love uMatrix UI), but advanced mode in uBO is similar. It lacks filtering by data type (css, js, images, fonts,...) per domain, but it does resolve domains to their primary domain, revealing where they are hosted. A huge kudos to gorhill for both of these!
baobun · 10h ago
NoScript + uBO is all right.
pluc · 9h ago
Yup, that's what I use as well. With whatever the name is of the extension that makes allowing cookies a whitelist thing too, and PrivacyBadger/Decentraleyes.
Also, deleting everything when Firefox closes. It's a little annoying to re-login to everything every day, but again, they are banking on this inconvenience to fuck you over and I refuse to let them win. It becomes part of the routine easily enough.
dylan604 · 7h ago
That’s my default as well. Self-hosted/1st-party scripts can load, but 3rd-party scripts are blocked. The vast majority of sites work this way. If a site doesn’t work because it must have a 3rd-party script to work, I tend to just close the tab. I really don’t feel like it has caused me to miss anything. There are usually 8 other sites with the same data, presented in a slightly less hostile way.
palata · 7h ago
Do you selectively enable JavaScript for the whole site, or is there a way with uBO to only enable subparts of it?
It has pretty advanced features but also basic ones that allow you to block scripts by source
1vuio0pswjnm7 · 8h ago
Impossible to know because when I disable Javascript "the majority of the internet" works fine. As does a majority of the web.
I read HN and every site submitted to HN using TCP clients and a text-only browser, that has no Javascript engine, to convert HTML to text.
The keyword is "read". Javascript is not necessary for requesting or reading documents. Web developers may use it but that doesn't mean it is necessary for sending HTTP requests or reading HTML or JSON.
If the web user is trying to do something else other than requesting and reading, then perhaps it might not "work".
heavyset_go · 10h ago
Whitelisting JS has worked on my end for a while.
I won't browse the Internet on my phone without it, everything loads instantly and any site that actually matters was whitelisted years ago.
qualeed · 7h ago
Echoing others, I've used NoScript for years and at this point it is practically unnoticeable.
Many sites work without (some, like random news & blogs, work better). When a site doesn't work, I make a choice between temporarily or permanently allowing it depending on how often I visit the site. It takes maybe 5 seconds and I typically only need to spend that 5 seconds once. As a reward, I enjoy a much better web experience.
sureglymop · 11h ago
It's easier than I thought. I just use uBlock Origin with everything blocked by default and then allow selectively.
michaelt · 7h ago
It depends.
If you're spending 99% of your time on your favourite websites that you've already tuned the blocking on? Barely a problem.
On the other hand if your job involves going to lots of different vendors' websites - you'll find it pretty burdensome, because you might end up fiddling with the per-site settings 15+ times per day.
dylan604 · 7h ago
If I’m at work using a work provided computer, I don’t bother with the blocking. They are not tracking me as I do not do anything as me. I’m just some corporate stooge employee that has no similarity to me personally.
My personal devices block everything I can get away with
kevin_thibedeau · 10h ago
StackOverflow switched over from spying with ajax.googleapis.com to GTM in the past year or so. All for some pointless out-of-date jQuery code they could self-host. I wonder how much they're being paid to let Google collect user stats from their site.
goopypoop · 10h ago
People who want you to run their scripts aren't really your friends
anothernewdude · 10h ago
The sites that don't work are usually the worst websites around - you end up not missing much. And if it's a store or whatever, you can unblock all js when you actually want to buy.
Rapzid · 11h ago
About as tiring as hearing about it all the time. Thank god it's a fringe topic these days but this article snuck it in. Probably the constant use of the word "surveillance" was an early tell haha.
tempodox · 1h ago
> Meanwhile, Google Tag Manager is regularly popping up on Government sites. This means not only that governments can study you in more depth - but also that Google gets to follow you into much more private spaces.
The corruption of the system knows no bounds.
v5v3 · 5h ago
I use:
VPN so constantly changing ip.
Tor browser for everyday browsing (has NoScript preinstalled), so onion on top of the VPN provides a double VPN. Regularly closed down so history is cleared.
Safari in private mode and lockdown mode for when Tor won't work (Tor IP blocked / HD video that is too slow to stream over Tor). Safari isolation in private mode is excellent: you can use two tabs with, say, emails, and neither will know the other is logged in.
Safari non private for sites I want available and in sync across devices.
Firefox in permanent private mode with uBlock Origin for when Safari lockdown mode causes issues. (Bizarrely, Firefox containers don't work in private mode, so no isolation across tabs.)
Chromium for logged into Google stuff.
Chrome for web development.
Plus opt out for everything possible inc targeted ads.
I rarely see ads of anything I would want to buy, and VPN blocks most of it at its DNS.
Beyond that, anything else would be too much effort for me.
The advertising companies, I'm sure, know I am not susceptible to impulse buying from ads; I research and seek value for money, so I'm not really their target.
culi · 3h ago
> Tor browser for everyday browsing
Do you just... log back in to Hacker News every day?
I downloaded the Mullvad browser (basically Tor without the onion protocol part) but having no way to save passwords ended up making it unusable for me
sheiyei · 22m ago
What platform do you use that doesn't allow for password managers? A browser's built-in password manager is not ideal for security, apparently (I would like to know how generally true this is; of course saving them with Google or Microsoft is as good an idea as it sounds).
adamiscool8 · 9h ago
I don't think this article makes a good case for why you should.
>The more of us who incapacitate Google's analytics products and their support mechanism, the better. Not just for the good of each individual person implementing the blocks - but in a wider sense, because if enough people block Google Analytics 4, it will go the same way as Universal Google Analytics. These products rely on gaining access to the majority of Web users. If too many people block them, they become useless and have to be withdrawn.
OK - but then also in the wider sense: if site owners can't easily assess the performance of their site relative to user behavior to make improvements, the overall UX of the web declines. Should we go back to static pages, mining Urchin extracts, and guessing what people care about?
card_zero · 9h ago
But I like it better when they have to guess. If it's something we care about enough, we'll let them know.
throw123xz · 8h ago
Analytics can have good uses, but these days it's mostly used to improve things for the operator (more sales, conversions, etc) and what's best for the website isn't always the best for the user. And so I block all that.
bredren · 9h ago
Belt and suspenders approach is to attach analytics to the most important events on the server side and combine with the session.
If the frontend automatic js is blocked, it doesn’t matter.
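A minimal sketch of that server-side idea; the helper names and in-memory store here are hypothetical (a real setup would persist events to a database keyed by session):

```javascript
// Server-side event log: conversions are recorded where they actually
// happen (the order handler), so client-side blocking never sees them.
const events = [];

function recordEvent(sessionId, name, props = {}) {
  events.push({ ts: Date.now(), sessionId, name, ...props });
}

// Example: inside a purchase endpoint, after the order is persisted,
// the analytics event is attached to the same session.
function handlePurchase(sessionId, order) {
  // ...save the order to the database here...
  recordEvent(sessionId, "purchase", { value: order.total, currency: "USD" });
}

handlePurchase("sess-abc123", { total: 42.5 });
```

The trade-off is that server-side events only cover actions that hit the server; purely client-side behavior (scrolls, hovers) is invisible to this approach.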
slow_typist · 9h ago
Effective and accessible UX design is a solved problem. It’s a matter of education of front end developers, not of A/B testing your users to death.
add-sub-mul-div · 9h ago
If the analytics brought us to this, of what use are the analytics?
fvgvkujdfbllo · 11h ago
> surveillanceware
I thought the term was spyware.
Surveillanceware almost sounds like something necessary to prevent bad stuff. Is this corporate rebranding to make spyware sound less bad?
Eggs-n-Jakey · 10h ago
I don't know; the memetics of "surveillanceware" or "spyware" mostly lead me to the belief that everything is weaponized to drain your money through ads/marketing instead of the direct approach of stealing my money.
schiffern · 6h ago
>Use uBlock Origin with JavaScript disabled, as described above, but also with ALL third-party content hard-blocked. To achieve the latter, you need to add the rule ||.^$third-party to the My Filters pane.
This is just a worse way to implement uBO's "Hard Mode" (plus JS blocked). Hard Mode has the advantage that you can easily whitelist sites individually and set a hotkey to switch to lesser blocking modes.
Blocking Google Tag Manager script injection seems to have few side effects.
Blocking third party cookies also seems to have few side effects.
Turning off Javascript breaks too much.
You can then enable just enough JS to make sites work, slowly building a list of just what is necessary. It can also block fonts, webgl, prefetch, ping and all those other supercookie-enabling techniques.
The same with traditional cookies. I use Cookie AutoDelete to remove _all_ cookies as soon as I close the tab. I can then whitelist the ones I notice impact on authentication.
Also, you should disable the JavaScript JIT, so the scripts that eventually load are less able to exploit potential vulnerabilities that could expose your data.
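On the Hard Mode point above: as far as I know, uBO's "Hard Mode" amounts to two dynamic filtering rules in the "My rules" pane (rather than static filters in "My filters"), which you can then relax per site:

```
* * 3p block
* * 3p-frame block
```

A per-site exception then looks like `example.com * 3p noop`, which re-allows third-party requests on that one site without touching the global rules.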
ayaros · 7h ago
Is there a good way to collect basic analytics if you have a site you're hosting on GitHub pages? In such cases I'd rather not rely on Google Analytics if I don't have to.
I figured... just wanted to see which ones people on HN think are worth looking at.
monista · 8h ago
If you block Google Tag Manager, you probably also want to block Yandex Metrics and Cloudflare Insights.
reddalo · 7h ago
I think it's hard to block Cloudflare Insights because most of the data is collected server-side.
rurban · 11h ago
Just add the domain to your /etc/hosts as 0.0.0.0
Doing that for years
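For GTM specifically, the entries typically look like this (hostnames worth verifying against your own traffic; server-side/first-party GTM setups, discussed elsewhere in the thread, will still get through):

```
# /etc/hosts: null-route common Google tracking hosts
0.0.0.0 www.googletagmanager.com
0.0.0.0 googletagmanager.com
0.0.0.0 www.google-analytics.com
0.0.0.0 google-analytics.com
```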
future10se · 10h ago
As mentioned on the blog post:
> Used as supplied, Google Tag Manager can be blocked by third-party content-blocker extensions. uBlock Origin blocks GTM by default, and some browsers with native content-blocking based on uBO - such as Brave - will block it too.
> Some preds, however, full-on will not take no for an answer, and they use a workaround to circumvent these blocking mechanisms. What they do is transfer Google Tag Manager and its connected analytics to the server side of the Web connection. This trick turns a third-party resource into a first-party resource. Tag Manager itself becomes unblockable. But running GTM on the server does not lay the site admin a golden egg...
By serving the Google Analytics JS from the site's own domain, this makes it harder to block using only DNS. (e.g. Pi-Hole, hosts file, etc.)
One might think "yeah but the google js still has to talk to google domains", but apparently, Google lets you do "server-side" tagging now (e.g. running a google tag manager docker container). This means more (sub)domains to track and block. That said, how many site operators choose to go this far, I don't know.
Slightly related: I've also recently been noticing some sites loading ads pseudo-dynamically from "content-loader" subdomains usually used to serve images. It's obnoxious because blocking that subdomain at the DNS level usually breaks the site.
My current strategy is to fully block the domain if that's the sort of tactic they're willing to use.
I am going to use this for sure, but it is a little ironic.
jpgreens · 5h ago
What if we could resolve every domain to 0.0.0.0 by default at the start. When visiting a website manually through the browser's URL bar it would automatically be whitelisted. Clicking links would also whitelist the domain of the link only. Sure you'd have to occasionally allow some 3rd party domains as well. Guess it would be cumbersome at first but after a while it would be pretty stable and wouldn't require much extra attention.
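The resolver half of this is sketchable today with dnsmasq; the automatic whitelist-on-navigation part would need a browser extension feeding the list, so the config below is a hand-maintained approximation:

```
# dnsmasq: answer 0.0.0.0 for every domain by default
address=/#/0.0.0.0
# selectively forward whitelisted domains to a real upstream resolver
server=/news.ycombinator.com/1.1.1.1
server=/example.com/1.1.1.1
```

dnsmasq prefers the most specific match, so the `server=` lines override the catch-all `address=` rule for the whitelisted domains.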
Try pihole (self-hosted) or nextdns if you want something that stays up to date.
lerp-io · 6h ago
ugh... if you think the internet should be a "static webpage" I got bad news for you, bud.
drcongo · 11h ago
Google Tag Manager and the whole consent management platform certification business is nothing more than a shakedown. It's racketeering.
hinkley · 3h ago
We had a disgusting number of tags on some of our customer pages, and a few dozen of them started to have effects on page load, especially if you were still on HTTP 1.1.
aleppopepper · 7h ago
That's hilarious. Do you really think Google should be privacy-respecting?
Edit: looks like this might exist already: https://addons.mozilla.org/en-US/firefox/addon/adnauseam/
Between this and "track me not" i've been fighting back against ads and connecting my "profile" with any habits since 2016 or so. I should also note i have pihole and my own DNS server upstream, so that's thiry-eight grand in ad clicks that got through blacklists.
https://www.trackmenot.io/faq
I manage a Google Ads account with a $500,000 budget. That budget is spent on a mix of display ads, google search, and youtube ads.
If I knew that 10% of our budget was wasted on bot clicks, there's nothing I can do as an advertiser. We can't stop advertising... we want to grow our business and advertising is how you get your name out there. We also can't stop using Google Ads - where else would we go?
$38,000 in clicks boosts Google's revenue by $38k (Google ain't complaining). The only entity you're hurting are the advertisers using Google. Advertisers might see their campaigns performing less well, but that's not going to stop them from advertising. If anything, they'll increase budgets to counteract the fake bot clicks.
I really don't understand what Ad Nauseam is trying to achieve. It honestly seems like it benefits Google more than it hurts them. It directly hurts advertisers, but not enough that it would stop anyone from advertising.
Google has a system for refunding advertisers for invalid clicks. The $500k account that I manage gets refunded about $50/month in invalid clicks. I'm guessing if bot clicks started making a real dent in advertiser performance, Google would counter that by improving their bot detection so they can refund advertisers in higher volumes. If there's ever an advertiser-led boycott of Google Ads, Google would almost certainly respond by refunding advertisers for bot clicks at much higher rates.
Google is part of the problem, but they're neither the only ones nor best to target through bottom-up approaches.
> It directly hurts advertisers, but not enough that it would stop anyone from advertising.
You know the saying about XML - if it doesn't solve the problem, you are not using enough of it.
> there's nothing I can do as an advertiser. We can't stop advertising...
We know. The whole thing is a cancer[0], a runaway negative feedback loop. No single enlightened advertiser can do anything about it unilaterally. Which is why the pressure needs to go up until ~everyone wants change.
--
[0] - https://jacek.zlydach.pl/blog/2019-07-31-ads-as-cancer.html
You don't have to buy privacy violating ads. You don't have to buy targetted ads
They already have methods to detect a lot. Like you said yourself, customers have no alternative, so why would they refund money they don't have to?
> The only entity you're hurting are the advertisers using Google.
That’s fine. Advertising is cancer. Reducing advertisers’ ROI is good too.
You don’t hate ads if you’re spending $500k on them. You just hate receiving ads, which makes you hypocritical.
No matter how secure your browser setup is, Google is tracking you. By filling their trackers with garbage, there's less that can personally identify you as an individual
https://web.archive.org/web/20200601034723/https://www.macob...
Carter invented it and got paid so they can bury it. Must be good tech.
GP fights agains ads, not Google. And not being able to win 100% of the gain shouldn’t restrain someone from taking action it they consider the win share worth the pain.
> $38,000 in clicks boosts Google's revenue by $38k
You should include costs here, and if (big if) a substantial part of the clicks comes from bots and get refunded, the associated cost comes on top of the bill. At the end the whole business is impacted. I agree 50/50k is a penny through.
> I hate ads […] I manage a Google Ads account
[no cynicism here, I genuinely wonder] how do you manage your conscience, mood and daily motivation? Do you see a dichotomy in what you wrote, and if so, how did you arrive at that situation? Any future plans?
I’m asking as you kind of introduce the subject but if you’re not willing to give more details that’s totally fine.
Or.. you know.. offering a quality product?
Manscaped? Nah, generic women's razors. PCBWay? Nope, JLCPCB.
Screw your ads. Find a better way.
https://adnauseam.io/
Google banned it from the Chrome Web Store, but it can still be installed manually.
While these were almost always very easy tickets to do, they were just one more interruption for us and a blocker for the stakeholders, who liked to have an extremely rapid iteration cycle themselves.
GTM was a way to make this self-service, instead of the eng team having to keep this updated, and also it was clear to everyone what all the different trackers were.
1. Understanding the security implications of code they add via tag manager. How good are they at auditing the third parties that they introduce to make sure they have rock-solid security? Even worse, do they understand that they need to be very careful not to add JavaScript code that someone emailed to them with a message that says "Important! The CEO says add this code right now!".
2. Understanding the performance overhead of new code. Did they just drop in a tag that loads a full 1MB of JavaScript code before the page becomes responsive? Can they figure that out themselves? Are they positioned to make good decisions on trade-offs with respect to analytics compared to site performance?
This is not unreasonable! People spend a lot of money on ads and would like to find out if and when they work. Yet people act like it's an unspeakable, nebulous crime, when this is probably the most common use case by miles.
Companies were doing this for hundreds of years before Google even existed. You can learn if your ads work without invasive tracking.
It’s used by marketing people to add the 1001 trackers they love to use.
It doesn't track things by itself. It just links your data to other tools like Google Analytics or Facebook Pixel to do the tracking.
This kind of data lets businesses do stuff like send coupon emails to people who left something in their cart.
There are lots of other uses. Basically, any time you want to add code or track behavior without dealing with a developer.
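The abandoned-cart example above boils down to a single dataLayer push that GTM forwards to whichever tags are configured (GA, Facebook Pixel, an email tool). The event name and fields below are illustrative — GTM imposes no fixed schema beyond the `event` key:

```javascript
// GTM itself just reads events off this array; the tags configured in the
// container (GA4, Facebook Pixel, etc.) decide what to do with them.
const dataLayer = globalThis.dataLayer ?? (globalThis.dataLayer = []);

// Fired when a logged-in user leaves with items still in the cart.
// "abandoned_cart" and its fields are illustrative, not a GTM standard.
dataLayer.push({
  event: "abandoned_cart",
  user_email_hash: "5d41402abc4b2a76b9719d911017c592", // hashed, never raw PII
  cart_value: 79.98,
  currency: "USD",
  items: [{ item_id: "SKU-1234", quantity: 2 }],
});
```

A marketing team can then trigger a coupon-email tag off that event entirely inside the GTM UI, which is exactly the "no developer needed" appeal.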
> Whilst Google would love the general public to believe that Tag Manager covers a wide range of general purpose duties, it's almost exclusively used for one thing: surveillance.
Eventually we realized that every dev ran uBO, and tried loading the site without it. It took about 5 seconds to load. Marketing and other parts of the company had loaded so much crap into GTM that it just bogged everything down.
The thing is - with everything - it's never easy to have strong principles. If it were, everyone would do it.
Using Firefox Add-Ons on a "smartphone" sucks because one has to access every Add-On interface via an Extensions menu.
In that sense _all_ Add-Ons are only semi-functional.
I use multiple layers: uMatrix + NetGuard + Nebulo "DNS Rules", at the least. Thus I have at least three opportunities where I can block lookups for and requests to Google domains.
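At the DNS layer, the blocking boils down to hosts-style rules like these (the same idea NetGuard and Nebulo rules implement). The domains listed are the well-known Google tracking endpoints; any given site may load others:

```
# hosts-style entries: resolve Google tracking domains to nowhere
0.0.0.0 www.googletagmanager.com
0.0.0.0 www.google-analytics.com
0.0.0.0 googleadservices.com
```

uMatrix then catches anything that slips past DNS at the request level.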
Basically uMatrix is so easy to use that anybody can use it. The equivalent uBO section is so complicated that I feel I need a master's degree in the subject.
https://github.com/gorhill/uBlock/wiki/Advanced-settings
Having tried both, IMHO they do not do exactly the same thing. One is pattern-based, the other is host-based. As such, one can use them together, simultaneously.
Also, deleting everything when Firefox closes. It's a little annoying to re-login to everything every day, but again, they are banking on this inconvenience to fuck you over and I refuse to let them win. It becomes part of the routine easily enough.
https://noscript.net/
It has pretty advanced features but also basic ones that allow you to block scripts by source
I read HN and every site submitted to HN using TCP clients and a text-only browser (no JavaScript engine) to convert HTML to text.
The keyword is "read". Javascript is not necessary for requesting or reading documents. Web developers may use it but that doesn't mean it is necessary for sending HTTP requests or reading HTML or JSON.
If the web user is trying to do something else other than requesting and reading, then perhaps it might not "work".
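A toy version of that HTML-to-text step, assuming the document has already been fetched over TCP. This is a rough sketch of the idea, not a substitute for a real text-mode browser (which also handles entities, tables, layout, and so on):

```javascript
// Minimal HTML-to-text pass: drop script/style blocks, strip remaining tags,
// decode a few common entities, collapse whitespace. No JS engine involved.
function htmlToText(html) {
  return html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, "") // remove code, keep content
    .replace(/<[^>]+>/g, " ")                       // strip remaining tags
    .replace(/&amp;/g, "&")
    .replace(/&lt;/g, "<")
    .replace(/&gt;/g, ">")
    .replace(/&#x27;|&#39;/g, "'")
    .replace(/\s+/g, " ")
    .trim();
}
```

Point being: requesting and reading documents needs none of the machinery that trackers depend on.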
I won't browse the Internet on my phone without it, everything loads instantly and any site that actually matters was whitelisted years ago.
Many sites work without (some, like random news & blogs, work better). When a site doesn't work, I make a choice between temporarily or permanently allowing it depending on how often I visit the site. It takes maybe 5 seconds and I typically only need to spend that 5 seconds once. As a reward, I enjoy a much better web experience.
If you're spending 99% of your time on your favourite websites that you've already tuned the blocking on? Barely a problem.
On the other hand if your job involves going to lots of different vendors' websites - you'll find it pretty burdensome, because you might end up fiddling with the per-site settings 15+ times per day.
My personal devices block everything I can get away with
The corruption of the system knows no bounds.
VPN, so a constantly changing IP.
Tor Browser for everyday browsing (it has NoScript preinstalled), so onion routing provides a double VPN of sorts. Regularly closed down, so history is cleared.
Safari in private mode and Lockdown Mode for when Tor won't work (Tor IP blocked, or HD video too slow to stream over Tor). Safari's isolation in private mode is excellent: you can use two tabs with, say, two email accounts, and neither will know the other is logged in.
Safari non private for sites I want available and in sync across devices.
Firefox in permanent private mode with uBlock Origin for when Safari's Lockdown Mode causes issues. (Bizarrely, Firefox containers don't work in private mode, so no isolation across tabs.)
Chromium for logged into Google stuff.
Chrome for web development.
Plus opting out of everything possible, including targeted ads.
I rarely see ads of anything I would want to buy, and VPN blocks most of it at its DNS.
Beyond that, anything else would be too much effort for me.
The advertising companies surely know I'm not susceptible to impulse buying from ads; I research and look for value for money, so I'm not really their target.
Do you just... log back in to Hacker News every day?
I downloaded the Mullvad browser (basically Tor without the onion protocol part) but having no way to save passwords ended up making it unusable for me
>The more of us who incapacitate Google's analytics products and their support mechanism, the better. Not just for the good of each individual person implementing the blocks - but in a wider sense, because if enough people block Google Analytics 4, it will go the same way as Universal Google Analytics. These products rely on gaining access to the majority of Web users. If too many people block them, they become useless and have to be withdrawn.
OK - but then also in the wider sense, if site owners can't easily assess the performance of their site relative to user behavior to make improvements, now the overall UX of the web declines. Should we go back to static pages and mining Urchin extracts, and guessing what people care about?
If the frontend automatic js is blocked, it doesn’t matter.
I thought the term was spyware.
Surveillanceware almost sounds like something necessary to prevent bad stuff. Is this corporate rebranding to make spyware sound less bad?
https://github.com/gorhill/uBlock/wiki/Blocking-mode
https://github.com/gorhill/uBlock/wiki/Blocking-mode:-hard-m...
https://noscript.net
You can then enable just enough JS to make sites work, slowly building a list of just what is necessary. It can also block fonts, webgl, prefetch, ping and all those other supercookie-enabling techniques.
The same with traditional cookies. I use Cookie AutoDelete to remove _all_ cookies as soon as I close the tab. I can then whitelist the ones I notice impact on authentication.
Also, you should disable the JavaScript JIT, so the scripts that do eventually load have less ability to exploit potential vulnerabilities that could expose your data.
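In Firefox this can be done through about:config. These pref names exist in current releases, but Mozilla has renamed JIT-related prefs before, so treat the list as a starting point rather than a guarantee:

```
// about:config / prefs.js entries to disable the JS JITs in Firefox
user_pref("javascript.options.ion", false);          // IonMonkey optimizing JIT
user_pref("javascript.options.baselinejit", false);  // baseline JIT
user_pref("javascript.options.asmjs", false);
user_pref("javascript.options.wasm", false);         // optional: wasm is also JIT-compiled
```

Expect a noticeable slowdown on script-heavy sites; the interpreter still runs everything, just without generating native code.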
Been doing that for years.
> Used as supplied, Google Tag Manager can be blocked by third-party content-blocker extensions. uBlock Origin blocks GTM by default, and some browsers with native content-blocking based on uBO - such as Brave - will block it too.
> Some preds, however, full-on will not take no for an answer, and they use a workaround to circumvent these blocking mechanisms. What they do is transfer Google Tag Manager and its connected analytics to the server side of the Web connection. This trick turns a third-party resource into a first-party resource. Tag Manager itself becomes unblockable. But running GTM on the server does not lay the site admin a golden egg...
By serving the Google Analytics JS from the site's own domain, this makes it harder to block using only DNS. (e.g. Pi-Hole, hosts file, etc.)
One might think "yeah but the google js still has to talk to google domains", but apparently, Google lets you do "server-side" tagging now (e.g. running a google tag manager docker container). This means more (sub)domains to track and block. That said, how many site operators choose to go this far, I don't know.
https://developers.google.com/tag-platform/tag-manager/serve...
My current strategy is to fully block the domain if that's the sort of tactic they're willing to use.
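When GTM is proxied through the site's own domain, path-based filters can still catch the common cases. These are illustrative uBlock Origin filters and an assumption on my part: they only match setups that keep the default script names, and a determined site can rename the paths too:

```
! Illustrative uBO filters for first-party-proxied GTM/GA.
! Proxied paths vary per site; these catch setups keeping default names.
/gtm.js$script,1p
/gtag/js$script,1p
```

The `1p` option restricts the match to first-party requests, so the usual third-party googletagmanager.com rules stay unaffected.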
> <script async src="https://www.googletagmanager.com/gtag/js?xxxxxxx"></script>
I am going to use this for sure, but it is a little ironic.
This GitHub repo seems way more up-to-date: https://github.com/StevenBlack/hosts