What amazes me is that this wasn't the original plan. What product manager thinks "the best thing for our customers is to delete their data!"?
> We understand these links are embedded in countless documents, videos, posts and more, and we appreciate the input received.
How did they think the links were being used?
borg16 · 5h ago
I read in an earlier HN thread on this: "this is a classic example of a data-driven product decision", aka we can reduce costs by $x if we just stop serving goo.gl links, instead of actually wondering how this would impact customers.
Also helps that they are in a culture which does not mind killing services on a whim.
Aurornis · 4h ago
The Google URL shortener stopped accepting new links around 2018. It has been deprecated for a long time.
I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.
They also might have wanted to use the domain for something else.
cogman10 · 3h ago
How much of a burden could this really be?
The nature of something like this is that the cost to run it naturally goes down over time. Old links get clicked less so the hardware costs would be basically nothing.
As for the actual software security, it's a URL shortener. They could rewrite the entire thing in almost no time with just a single dev. Especially since it's strictly hosting static links at this point.
It probably took them more time and money to find inactive links than it'd take to keep the entire thing running for a couple of years.
simonw · 2h ago
"How much of a burden could this really be?"
My understanding from conversations I've seen about Google Reader is that the problem with Google is that every few years they have a new wave of infrastructure, which necessitates upgrading a bunch of things about all of their products.
I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.
If a product has an active team maintaining it they can handle the upgrade. If a product has no team assigned there's nobody to do that work.
chrisjj · 1h ago
> I guess that might be things like some new version of BigTable or whatever coming along, so you need to migrate everything from the previous versions.
Arrival of new does not necessitate migration.
Only departure of old does.
mbac32768 · 24m ago
They deprecate internal infrastructure stuff zealously and tell teams they need to be off of such and such by this date.
But it's worse than that because they'll bring up whole new datacenters without ever bringing the deprecated service up, and they also retire datacenters with some regularity. So if you run a service that depends on deprecated services you could quickly find yourself in a situation where you have to migrate to maintain N+2 redundancy but there's hardly any datacenter with capacity available in the deprecated service you depend on.
Also, how many man-years of engineering do you want to spend on keeping goo.gl running? If you were an engineer, would you want to be assigned this project? What are you going to put in your perf packet? "Spent 6 months of my time and also bothered engineers in other teams to keep this service that makes us no money running"?
No comments yet
lokar · 12m ago
A lot of Google infra services are built around the understanding that clients will be re-built to pick up library changes pretty often, and that you can make breaking API changes from time to time (with lots of notice).
pavel_lishin · 49m ago
But if you don't retire the old, then you're endlessly supporting systems, forever. At some point, it does become cheaper to migrate everything to the new.
stouset · 1h ago
And you could assign somebody to do that work, but who wants to be employed as the maintainer of a dead product? It’s a career dead-end.
davidcbc · 29m ago
> How much of a burden could this really be?
You know how Google deprecating stuff externally is a (deserved) meme? Things get deprecated internally even more frequently and someone has to migrate to the new thing. It's a huge pain in the ass to keep up with for teams that are fully funded. If something doesn't have a team dedicated to it eventually someone will decide it's no longer worth that burden and shut it down instead.
londons_explore · 3h ago
I think the concern is someone might scan all the inactive links and find that some of them link to secret URLs, leak design details about how things are built, link to documents shared with 'anyone with the link' permission, etc.
cogman10 · 2h ago
> I think the concern is someone might scan all the inactive links
How? Barring a database leak I don't see a way for someone to simply scan all the links. Putting something like Cloudflare in front of the shortener with a rate limit would prevent brute-force scanning. I assume Google semi-competently made the shortener (using a random number generator), which would make it pretty hard to find links in the first place.
Removing inactive links also doesn't solve this problem. You can still have active links to secret docs.
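For a rough sense of why blind scanning doesn't work against random codes, here's a back-of-the-envelope sketch; the code length and rate limit are assumptions, not goo.gl's actual parameters:

    # Time to exhaustively enumerate a randomized short-code keyspace
    # from behind a per-client rate limit.
    alphabet_size = 62                       # a-z, A-Z, 0-9
    code_length = 6                          # assumed; goo.gl codes were roughly this long
    keyspace = alphabet_size ** code_length  # ~5.7e10 possible codes
    requests_per_second = 100                # assumed rate limit
    years = keyspace / requests_per_second / (60 * 60 * 24 * 365)
    print(f"{keyspace:.2e} codes, ~{years:.0f} years to scan")  # ~18 years for one client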
rdtsc · 1h ago
> I doubt it was a cost-driven decision on the basis of running the servers. My guess would be that it was a security and maintenance burden that nobody wanted.
Yeah, I can't imagine it being a huge cost saver? But I'm guessing the people who developed it have long since moved on, and it stopped being a cool project. And depending on the culture inside Google, it just doesn't pay career-wise to maintain someone else's project.
rany_ · 1h ago
I really doubt it was about security/maintenance burdens. Under the hood, goo.gl just uses Firebase Dynamic Links which is still supported by Google.
Edit: nevermind, I had no idea Dynamic Links is deprecated and will be shutting down.
quesera · 1h ago
Firebase Dynamic Links is shutting down at the end of August 2025.
rany_ · 29m ago
I had no idea. It's too late to delete my comment now.
It's a really ridiculous decision though. There's not a lot that goes into a link redirection service.
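For a sense of how little goes into one, here's a toy redirector in a few lines (the mapping and code are invented, and this is obviously nothing like how Google actually runs theirs):

    # Toy URL redirector: look up a short code and issue an HTTP redirect.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    LINKS = {                      # in reality this would be a database
        "abc123": "https://example.com/some/long/path",
    }

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            target = LINKS.get(self.path.lstrip("/"))
            if target:
                self.send_response(301)
                self.send_header("Location", target)
            else:
                self.send_response(404)
            self.end_headers()

    HTTPServer(("", 8080), Redirector).serve_forever()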
mort96 · 1h ago
Documents from 2018 haven't decayed or somehow become irrelevant.
EGreg · 2h ago
How much does it really cost Google to answer some quick HTTP requests and redirect, vs. all their YouTube videos etc.?
resize2996 · 4h ago
"security and maintenance burden" == "cost" == "cost-driven decision"
aspenmayer · 3h ago
Capital inputs are one part of the equation. The human cost of mental and contextual overhead cannot be reduced to dollars and cents.
mixdup · 2h ago
Sure it can. It takes X people Y hours a day/week/month to perform tasks related to this service, including planning and digging up the context behind it. Those X people make Z dollars per year. It's an extremely simple math equation.
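For illustration, a toy version of that math (every number here is hypothetical):

    # Hypothetical carrying cost of keeping a neglected service alive.
    people = 2                   # engineers who occasionally touch it (assumed)
    hours_per_week = 3           # migrations, pages, re-learning context (assumed)
    loaded_rate = 150            # fully loaded cost in $/hour (assumed)
    annual_cost = people * hours_per_week * 52 * loaded_rate
    print(f"${annual_cost:,.0f} per year")   # $46,800/year with these numbers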
observationist · 28m ago
At this point, anyone depending on Google for anything deserves to get burned.
I don't know how much more clearly they could tell their users that Google has absolutely no respect for users without drone shipping boxes of excrement.
thevillagechief · 3h ago
One of the complaints about Google is that it's difficult to launch products due to bureaucracy. I'm starting to think that's not a bad thing. If they'd done a careful analysis of the cost of jumping on the URL-shortener bandwagon, we wouldn't be here. Maybe it's not a bad thing that they move slower now.
jerlam · 3h ago
Goo.gl didn't have customers, it had users. Customers pay, either with money or their personal data, now or in the future. Goo.gl did not make any money or have a plan to do so in the future.
CydeWeys · 3h ago
One wonders why they don't, instead of shutting down, display a 15s interstitial unskippable YouTube-style ad prior to redirecting.
That way they'd make money, they could fund the service instead of having to shut it down, and there wouldn't be any linkrot.
gloxkiqcza · 2h ago
This is such an evil idea.
xp84 · 2h ago
Why is it evil? Assume that a free URL shortener is a good thing and that shutting one down is a bad thing, and note that every link shortener has costs (not just the servers -- constant moderation needs, as scammers and worse use them) and no revenue. The only possible outcome is for them all to eventually shut down, causing unrecoverable linkrot.
Given those options, an ad seems like a trivial annoyance to anyone who very much needs a very old link to work. Anyone who still has the ability to update their pages can always update their links.
sincerely · 1h ago
This is how every URL shortener on the internet used to work.
somat · 49m ago
I always figured most of the real value of these URL hashing services was as a marketing tracking metric. That is, sort of equivalent to the "share with" widgets sites provide, which conveniently also dump tons of analytics to the services.
I will be honest, I was never in an environment that would benefit from link shortening, so I don't really know if any end users actually wanted them (my guess: mainly Twitter), and I always viewed these hashed links with extreme suspicion.
franga2000 · 3h ago
The monetary value of the goodwill and mindshare generated by such a free service is hard to calculate, but definitely significant. I wouldn't be surprised if it was more than it costs to run.
Imustaskforhelp · 4h ago
If companies can spend billions on AI and get nothing in return, and be okay with that by way of giving out free stuff (okay, I'll admit not completely free since you are the product, but still free),
then they should also be okay with keeping the goo.gl links, honestly.
Sounds kinda bad for goodwill, but this is literally Google; the one thing Google is notorious for is killing their products.
citizenpaul · 4h ago
This is basically modern SV business. This old data is costing us about a million a year to hold onto. KILL IT NOW WITH FIRE.
Hey, let's also dump 100 Billion dollars into this AI thing without any business plan or ideas to back it up this year. HOW FAST CAN YOU ACCEPT MY CHECK!
manquer · 2h ago
Hard to imagine costs were ever a factor.
For a company running GCP and giving away things like Colab TPUs for free, the cost of running a URL service would be a trivial rounding error at best.
no_wizard · 3h ago
Arguably, this is them collecting the wrong types of data to inform decisions, if that isn't represented in the data.
j45 · 3h ago
All while data and visibility are part of the business.
Like other things spun down, there must not be value in the links.
troupo · 2h ago
> How did they think the links were being used?
Can't dig this document up right now, but in their Chrome dev process they say something along these lines: "even if a feature is used by 0.01% of users, at scale that's a lot of users. Don't remove it until you've made sure the impact is negligible".
At Google scale I'm surprised [1] this is not applied everywhere.
[1] Well, not that surprised
cnst · 2h ago
Yup, 0.01% of users at scale is indeed a lot of users.
This is exactly why many big companies like Amazon, Google and Mozilla still support TLSv1.0, for example, whereas all the fancy websites would return an error unless you're using TLSv1.3, as if their lives depended on it.
In fact, I just checked a few seconds ago with `lynx`, and Google Search even still works on plain old HTTP without the "S", too — no TLS required whatsoever to start with.
Most people are very surprised by this revelation, and many don't even believe it, because it's difficult to reproduce this with a normal desktop browser, apart from lynx.
But this also shows just how out of touch Walmart's digital presence really is, because somehow they deem themselves important enough to mandate TLSv1.2 and the very latest browsers, unlike all the major ecommerce heavyweights, and deny service to anyone who doesn't have the latest device with all the latest updates installed, breaking even slightly outdated browsers even if they do support TLSv1.2.
rs186 · 2h ago
I guess the number of people who use Chrome to access files via FTP must be below 0.01% then.
So bizarre. Embedded links, docs, social posts, stuff that could be years and years old, and they're expecting traffic to them recently? Why do they seem to think their link shortener is only being used for, like, someone's social profile linktree or something? Some marketing person's bizarre view of how the web is being used.
cellover · 2h ago
tail -f access.log maybe?
neilv · 4h ago
The "Actively used" criterion screws over that critical old document you found, in which someone trusted it was safe to use a Google link.
Not knowing all the details motivating this surprising decision, from the outside, I'd expect this to be an easy "Don't Be Evil" call:
"If we don't want to make new links, we can stop taking them (with advance warning, for any automation clients). But we mustn't throw away this information that was entrusted to us, and must keep it organized/accessible. We're Google. We can do it. Oddly, maybe even with less effort than shutting it down would take."
inetknght · 4h ago
> someone trusted it was safe to use a Google link.
That someone made a poor decision to rely on anything made by Google.
progval · 3h ago
Hindsight is 20/20. Google was considered by geeks to be a very reliable company at some point.
wolrah · 3h ago
Using a link shortener for any kind of long-term link, no matter who hosts it, has never been a good idea. They're for ephemeral links shared over limited mediums like SMS, or for cases where a human has to manually copy the link from the medium to the browsing device, like a TV ad. If you put one in a document intended for digital consumption you've already screwed up.
CydeWeys · 3h ago
Link shorteners are old enough that, by now, more of the URLs they pointed to have likely rotted away than the shorteners themselves.
Go look at a decade+ old webpage. So many of the links to specific resources (as in, not just a link to a domain name with no path) simply don't work anymore.
babypuncher · 2h ago
I think it would be easy for these services to audit their link database and cull any that have had dead endpoints for more than 12 months.
That would come off far less user hostile than this move while still achieving the goal of trimming truly unnecessary bloat from their database. It also doesn't require you to keep track of how often a link is followed, which incurs its own small cost.
xp84 · 2h ago
> cull any that have had dead endpoints
That actually seems just as bad to me, since the URL often has enough data to figure out what was being pointed to even if the exact URL format of a site has changed or even if a site has gone offline. It might be like:
kmart dot com / product.aspx?SKU=12345678&search_term=Staplers or /products/swingline-red-stapler-1235467890
Those URLs would now be dead and kmart itself will soon be fully dead but someone can still understand what was being linked to.
Even if the URL is 404, it's still possibly useful information for someone looking at some old resource.
neilv · 3h ago
Yeah, when Google was founded, people acted like they were the normal smart, benevolent, forward-thinking Internet techies (it was a type), and they got a lot of support and good hires because of that.
Then, even as that was eroding, they were still seen as reliable, IIRC.
The killedbygoogle reputation was more recent. And still I think isn't common knowledge among non-techies.
And even today, if you ask a techie which companies have certain reliability capabilities, Google would be at the top of some lists (e.g., keeping certain sites running under massive demand, and securing data against attackers).
forty · 2h ago
"Don't Be Evil" has been deprecated for a while
userbinator · 3h ago
"Those who control the past control the present. Those who control the present control the future."
Look at what happened to their search results over the years and you'll understand.
It may help prevent linkjacking. If an old URL no longer works, but the goo.gl link is still available, it's possible that someone could take over the URL and use it for malicious purposes. Consider a scenario like this:
1. Years ago, Acme Corp sets up an FAQ page and creates a goo.gl link to the FAQ.
2. Acme goes out of business. They take the website down, but the goo.gl link is still accessible on some old third-party content, like social media posts.
3. Eventually, the domain registration lapses, and a bad actor takes over the domain.
4. Someone stumbles across a goo.gl link in a reddit thread from a decade ago and clicks it. Instead of going to Acme, they now go to a malicious site full of malware.
With the new policy, if enough time has passed without anyone clicking on the link, then Google will deactivate it, and the user in step 4 would now get a 404 from Google instead.
dundarious · 2h ago
In this little story, what's the difference if the direct ACME URL was used? What does the goo.gl indirection have to do with anything?
xp84 · 2h ago
Goo.gl was a terrible idea in the first place because it lends Google's apparent legitimacy (in the eyes of the average "noob") to unmoderated content that could be malicious. That's probably why they at least stopped allowing new ones to be made. By allowing old ones, they can't rule out the Google brand being used to scam and phish.
e.g. Imagine SMS or email saying "We've received your request to delete your Google account effective (insert 1 hour's time). To cancel your request, just click here and log into your account: https://goo.gl/ASDFjkl
This was a very popular strategy for phishing and it's still possible if you can find old links that go to hosts that are NXDOMAIN and unregistered, of which there are no doubt millions.
NewJazz · 1h ago
Yeah I'm pretty sure this is the main reason google is shutting the service down. They don't want their brand tainted by phishing attempts.
mystifyingpoi · 4h ago
It creates a good entry in the promo package for that Google manager. "Successfully conducted cost saving measure, cutting down the spend on the link shortener service by 70%". Of course, hoping that no one will check the actual numbers.
maven29 · 5h ago
A warning shot to guard against an AT&T Bell-style forced divestiture?
imchillyb · 4h ago
I believe this is the simplest and most succinct answer, given the current antitrust climate among the courts and prosecutors.
42lux · 5h ago
Increasing database ops.
18172828286177 · 5h ago
[flagged]
zarzavat · 4h ago
Do PMs at Google have so much power that they can shut down a product used by billions of people?
afavour · 4h ago
They’re not shutting down a product, they’re removing old links.
I’m not defending it, just that I can absolutely imagine Google PMs making a chart of “$ saved vs clicks” and everyone slapping each other on the back and saying good job well done.
deelowe · 4h ago
They can write the proposals to do so and if it gets picked up by a VP and approved, then they can cite that on their promo.
OutOfHere · 4h ago
The product was shut down a long time ago. They're now deleting inactive data of users.
Retr0id · 5h ago
Presumably, saving disk space on some google servers.
dietr1ch · 5h ago
More than disk space I think they care about having short links, higher cache hit rates and saving RAM on their fleet.
smaudet · 4h ago
I find even this incredibly stingy... Back of the envelope:
10 * 4 * 3000000000 / (1024^3)
10 four-byte characters times 3 billion links, divided by the bytes in 1 GiB...
Roughly 111 GB of RAM.
Which is like nothing to a search giant.
To put that into perspective, my Desktop Computer's max Mobo memory is 128 GB, so saying it has to do with RAM is like saying they needed to shut off a couple servers...and save like maybe a thousand dollars.
This reeks of something else, if not just sheer ineptitude...
dietr1ch · 4h ago
> Roughly 111 GB of RAM. Which is like nothing to a search giant.
You are forgetting job replication. A global service can easily have 100s of jobs on 10-20 datacenters.
Saving 111TiB of RAM can probably pay your salary forever. I think I paid mine with fewer savings while there. During covid there was enough of a RAM shortage that there was a call to prefer trading CPU to save RAM, with changes to the rule-of-thumb resource costs.
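Rough arithmetic behind those figures, using the 3-billion-link and 10-character guesses from upthread (the replica count is purely a guess):

    # Back-of-the-envelope memory for a replicated in-RAM link table.
    links = 3_000_000_000            # ~3 billion short links (upthread guess)
    bytes_per_link = 10 * 4          # 10 four-byte characters, per upthread
    per_copy_gib = links * bytes_per_link / 2**30
    replicas = 1000                  # assumed: hundreds of tasks across many datacenters
    fleet_tib = per_copy_gib * replicas / 1024
    print(f"{per_copy_gib:.0f} GiB per copy, ~{fleet_tib:.0f} TiB across the fleet")
    # Note: this counts only the short codes, not the target URLs themselves.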
nomel · 4h ago
> A global service can easily have 100s of jobs on 10-20 datacenters.
There's obviously something in between maintaining the current latency with 20 datacenters, increasing the latency a bit by reducing hosting to a couple hundred dollars' worth of servers, and setting the latency to infinity, which was the original plan.
dietr1ch · 3h ago
I'm guessing that they ran out of leeway with small tweaks and found that breaking inactive links was probably a better way out. We don't know the hit rates of what they call inactive nor the real cost it takes to keep them around.
A service like this is probably in maintenance mode too, so simplifying it to use fewer resources probably makes sense, and I bet the PMs are happy about shorter links, since at some point you are better off not using a link shortener and instead just using a QR code, for fear of inconvenience and typos.
Retr0id · 5h ago
If they really are only purging the inactive ones, this shouldn't impact cache hit rate much.
I don't understand. For you to see the message, you have to click on the link. Your clicking on the link must mean that the link is active, since it is getting clicks. So why is the link being deactivated for being inactive?
skybrian · 4h ago
> showed no activity in late 2024
Apparently they measured it once by running a map-reduce or equivalent.
I don’t see why they couldn’t measure it again. Maybe they don’t want it to be gamed, but why?
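A toy version of the kind of one-off activity count being described (the log format, field position, and file names are all invented):

    # Count hits per short code from an access log, then list codes with none.
    from collections import Counter

    hits = Counter()
    with open("access.log") as f:            # hypothetical export of late-2024 traffic
        for line in f:
            code = line.split()[0]           # assume the short code is the first field
            hits[code] += 1

    all_codes = {"abc123", "xYz789"}         # in reality, from the link database
    inactive = all_codes - set(hits)
    print(f"{len(inactive)} links saw no clicks in the window")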
poyu · 4h ago
I interpreted "inactive" as meaning the link that the shortener points to is no longer responding.
OutOfHere · 4h ago
No. Inactive means that the short URL hasn't been accessed in a while.
lathiat · 5h ago
If I had to guess, it possibly has something to do with crawlers/bots/etc. triggering the activity detection, and running some kind of more advanced logic to try to ensure it's really being used. Light captcha style.
But just a guess.
xp84 · 2h ago
I am pretty sure the terrible idea of putting the Google brand on something that can so easily be used for phishing is the reason they deprecated it in the first place. They should have used something without obvious branding.
alpb · 51m ago
This whole thing costs Google essentially nothing to run. They could be nice citizens and continue to provide this service for free, but they chose not to.
mixdup · 2h ago
I'm sure there's some level of security implication, but maybe they could also archive the database of redirects with Archive.org, or just release it.
quink · 4h ago
And for an encore, I guess they'll start tearing out random pages in the books I didn't happen to read last August?
jjice · 4h ago
I would've imagined that the good will (or more likely, the lack of bad will) from _not_ doing this would've been worth the cost, considering I can't imagine this has high costs to run.
yandie · 4h ago
They probably saved the equivalent of an engineer's salary!!
ChrisArchitect · 4h ago
Noticed recently on some Google properties that have Share buttons that they're generating https://share.google links now instead of goo.gl.
Is that the same shortening platform running it?
And also, does this have something to do with the .gl TLD? Greenland? A redirect to share.google would be fine.
ewoodrich · 1h ago
The key difference is share.google, as you mentioned, is for Google controlled properties whereas goo.gl allowed shortening any arbitrary user provided URL. Which opened up a giant can of worms with Google implicitly lending its brand credibility to any URL used by a scammer, phisher or attacker.
charlesabarnes · 1h ago
You can generate share.google links on chrome for any arbitrary url.
ewoodrich · 52m ago
How? I just tried each of the Share options for this thread in the desktop Share menu, and they all used the full URL. Including the QR code which I verified by saving as a PNG and scanning it outside of any Google app. I also haven't found any Share option in the iOS app either that doesn't use the full URL. But harder to test on mobile given the various permutations of sharing between random apps.
alliao · 1h ago
Oh Google, please get your mojo back. This is correct.
TZubiri · 2h ago
Next step, deprecate those ridiculous forms.gle links that just train users to ignore domain names.
hk1337 · 2h ago
Yet another Google product put on the chopping block. If products were people, they'd have a lot of blood on their hands.
lofaszvanitt · 59m ago
Jesus, do not rely on Google for anything.
AlienRobot · 5h ago
I'll never use a URL shortener again.
saurik · 4h ago
The same reason you did it in the first place -- despite a ton of people who saw the future saying you shouldn't -- is the reason why the next generation of people will do it despite you trying to warn them.
SoftTalker · 5h ago
Any form of URL is at best a point in time reference.
Shortened or not, they change, disappear, get redirected, all the time. There was once an idea that a URL was (or should be) a permanent reference, but to the extent that was ever true it's long in the past.
The closest thing we might have to that is an Internet Archive link.
Otherwise, don't cite URLs. Cite authors, titles, keywords, and dates, and maybe a search engine will turn up the document, if it exists at all.
toomuchtodo · 3h ago
Cite Wayback Machine links, as you mention.
Jabrov · 5h ago
Has there ever been one that survived for a really long time?
reddalo · 5h ago
Three random examples that come to my mind:
- Tinyurl.com, launched in 2002, currently 23 years old
- Urly.it, launched in 2009, currently 16 years old
- Bitly.com, also launched in 2009
So yes, some services survived a long time.
I think I might be doing a self-plug here, so pardon me, but I am pretty sure that I can create something like a link shortener which can last essentially permanently. It has to do with crypto (I don't adore it as an investment, I must make that absolutely clear).
But basically I have created nanotimestamps, which can embed some data in the Nano blockchain, and that data could theoretically be a link.
Now the problem is that the link would at least either be a transaction ID, which is big, or some sort of seed passphrase...
So no, it's not as easy as some passphrase, but I am pretty sure that Nano isn't going to dissolve; last time I checked it has 60 nodes, anyone can host a node, and did I mention all of this is completely free? (I mean, there are no gas fees in Nano, which is why I picked it.)
I am not associated with the Nano team, and it would actually sort of put their system under strain if we actually used it this way, but I mean, their system allows for it... so why not cheat the system?
Tldr: I am pretty sure that I can build a decentralized link shortener which can survive a really long time, but the trade-off is that the shortened link might actually become longer than the original link. I can still think of a way to actually shorten it, though.
Like, I just thought that Nano has a way to catalogue transactions in time, so it's theoretically possible to index transactions by time, and then basically it's just the nth transaction, where n could be something like 1000232,
and so test.org/1000232 could lead to something like a YouTube rickroll. It could theoretically be possible. If literally anybody is interested, I can create a basic prototype, since I am just so proud, really, that I created some decent "innovation" in a space that I am not even familiar with (I ain't no crypto wizard).
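If the short link really is just "the nth transaction", the visible slug only has to encode an integer index; a small sketch of that part (not tied to Nano or any particular chain):

    # Encode/decode a transaction index as a short base-62 slug.
    ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"

    def encode(n: int) -> str:
        if n == 0:
            return ALPHABET[0]
        out = []
        while n:
            n, r = divmod(n, 62)
            out.append(ALPHABET[r])
        return "".join(reversed(out))

    def decode(s: str) -> int:
        n = 0
        for ch in s:
            n = n * 62 + ALPHABET.index(ch)
        return n

    print(encode(1000232))   # "4ccM" -> e.g. test.org/4ccM
    print(decode("4ccM"))    # 1000232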
tqi · 1h ago
1) I think this means every link is essentially public? Probably not ideal.
2) You don't actually want things to be permanent - users will inevitably shorten stuff they didn't mean to / don't want shortened, so there needs to be a way to scrub them.
ameliaquining · 4h ago
You can't address the risk that whoever owns the domain will stop renewing it, or otherwise stop making the web gateway available. Best-case scenario is that it becomes possible to find out what URL a shortened link used to point to, for as long as the underlying blockchain lasts, but if a regular user clicks on a link after the web gateway shuts down then they'll get an error message or end up on a domain squatting site, neither of which will provide any information about how to get where they want to go.
OutOfHere · 1h ago
These days one can register a domain for ten years, and have it auto-renew with prefunded payments that are already sitting in the account. This is what I did for the URL shortener I am developing.
The same would have to be done for the node running the service, and it too has been prefunded with a sitting balance.
Granted, there still exist failure modes, and so the bus factor needs to be more than one, but the above setup can in all probability easily ride out a few decades with the original person forgetting about it. In principle, a prefunded LLM with access to appropriate tooling and a headless browser can even be put in charge to address common administrative concerns.
Imustaskforhelp · 3h ago
I mean, yes, the web gateway can shut down, but honestly, with goo.gl at least, if things go down then there is no way of recovering.
With the system I am presenting, I think it can be possible to have a website like redirect.com/<some-gibberish>, and even if redirect.com goes down, then yes, that link would stop working, but what redirect.com is doing under the hood can be done by anybody. So, that being said,
it can be possible for someone to archive the redirect.com main site, which might give instructions pointing to a recent list on GitHub or some other place with the top currently working web gateways.
And so anybody can go to archive.org, see what was meant, and try it; or maybe we can have some sort of slug like redirect.com/block/<random-gibberish>, and then people can understand "block" to mean this is just a gateway (a better, more niche word would help).
But still, at the end of the day there is some way of using that shortened link forever, thus being permanent in some sense.
Like, imagine that someone uses a goo.gl link for some extremely important document and then somehow it becomes inaccessible for whatever reason and now... it's just gone?
I think that a way to recover that could really help. But honestly, I am all in for feedback, and since there are zero fees,
I would most likely completely open-source it. I'm not involved in this crypto project either, and I most likely will earn nothing, like, ever, even if I do make this, but I just hope that I could help make the internet a little less like a graveyard of dead links.
OutOfHere · 4h ago
It's not useful if the resulting URL is too long. It defeats the purpose of a URL shortener. The source URL can just be used then.
Imustaskforhelp · 3h ago
Yes, I did address that part, but honestly I can use the time when it was sent into the blockchain / the transaction ID, which is generally really short, as I said in the comment. I will hack together a prototype tomorrow.
OutOfHere · 1h ago
It is the long URL that also needs to be stored, not just the short URL.
If you want to use blockchain for this, I advise properly using a dedicated new blockchain, not spamming the Nano network.
wizzwizz4 · 4h ago
> which can last essentially permanent
Data stored in a blockchain isn't any more permanent than data stored in a well-seeded SQLite torrent: it's got the same failure modes (including "yes, technically there are a thousand copies… somewhere; but we're unlikely to get hold of one any time in the next 3 years").
But yes, you have correctly used the primitives to construct a system. (It's hardly your fault people undersell the leakiness of the abstraction.)
Imustaskforhelp · 3h ago
Honestly, I agree with your point so wholeheartedly.
I was really into p2p technologies like iroh etc., and at a really fundamental level you are still trusting that someone won't just suddenly leave, so things can still very well go down... even in crypto.
But I think, compared to an SQLite torrent, the difference with crypto might be that since there's people's real money involved (for worse or for better), data stored in the blockchain is much more likely to stay around permanently... and like I said, I can use those 60 nodes for absolutely free due to zero gas fees, compared to an SQLite torrent.
rsync · 3h ago
"I'll never use a URL shortener again."
I don't know if anyone should use a URL shortener or not ... but if you do ...
"Oh By"[1] will be around in thirty years.
Links will not be "purged". Users won't be tracked. Ads won't be served.
[1] https://0x.co
Says who? These assertions mean nothing and guarantee nothing. How can you (or I) know that?
OutOfHere · 4h ago
The more correct generalization would be to never trust a Google product again with your data.
Fwiw, I wrote and hosted my own URL shortener, also embeddable in applications.
HPsquared · 5h ago
Finally a use for blockchain?
Imustaskforhelp · 4h ago
Oh boy... I think I found the man I can yap to about the idea I got while scrolling through HN: a link shortener on a blockchain with 0 gas fees.
Here is the comment, since I don't want to spam the same comment twice. Have a nice day.
If you make it read-only, maybe. If anyone can generate a link, wait for your hosting provider to shout at you and ask why there is so much spam/illegal content on your domain. Then you realize you can't actually manage a service like this.
As we already have a PostgreSQL database server, the cost of running this is extremely low, and we aren't concerned about GDPR (etc.) issues with using a third-party site.
Imustaskforhelp · 4h ago
dub.sh comes to my mind
VWWHFSfQ · 5h ago
I use pinboard.in. Also pay the $20/yr for archiving if the links rot