I'm against link rot and I hate how Google doesn't maintain old projects. But this is one shutdown I 100% agree with.
Having an official Google domain that anyone can hijack is dangerous, given that many people's main internet identity is GMail (aka their Google account). I know anyone can create an offshoot (goooogle.org, etc), but Google was using goo.gl too.
It was easy to redirect a goo.gl to a Google login page (which is on a real Google domain), and trick people into authorizing access to their account.
I consider myself savvy, and I got a pretty convincing one recently. The email looked legit, and the link was a goo.gl link that ultimately landed me on a legitimate Google login page. It didn't trick me, but it did take me a few minutes to figure out how it wasn't legit.
NOTE: This article is kinda misleading. They already stopped letting people add new links in 2019. And now, they're only removing "inactive" links, AKA links that had no activity since 2024. If you visit a link right now, it will be kept. Here's more info: https://blog.google/technology/developers/googl-link-shorten...
jszymborski · 19h ago
I think it's pretty easy to make an argument against URL shorteners, but I think it's a bit harder to defend killing existing short links. Stop new links from being minted, keep up a "Report Abuse" page, maybe even scan the existing DB for Google Login look-alikes. The upkeep is as much or less than responsibly running a URL short link site in the first place.
Instead, they're just disappearing _all_* goo.gl short links. The overwhelming majority of which are benign links made by users who were promised a super stable URL shortening service backed by the Google brand.
*edit: Not all, but nearly.
striking · 19h ago
Not all.
> All other [active] goo.gl links will be preserved and will continue to function as normal. To check if your link will be retained, visit the link today. If your link redirects you without a message, it will continue to work.
This is true. An enormous number of links are going to die, however.
Aurornis · 19h ago
> Stop new links from being minted
They already did this in 2019.
The service has been deprecated for a very long time.
> Instead, they're just disappearing _all_ goo.gl short links
This is false. They are sunsetting inactive links.
jszymborski · 19h ago
I know it's been deprecated. I'm just saying that they could have stopped it there.
It is true that it is not _all_ links, apologies. "Inactive" here is defined as "not visited in 2024" which is a crazy small envelope. I wouldn't be surprised if nearly all links were deleted.
seba_dos1 · 11h ago
> They are sunsetting inactive links.
...which is even harder to justify than removing them all.
troupo · 15h ago
They were going to shut down the service entirely until a public outcry.
procaryote · 19h ago
I guess there's a tiny chance they'll learn to A: not trust google to keep anything alive, or B: not use link shorteners?
Guillaume86 · 19h ago
Most of the time it's not the people who created the link who are going be annoyed with the broken link...
Aurornis · 19h ago
I fully agree. Official looking redirect URLs are a dream come true for scams and phishing attacks.
The goo.gl link shortener hasn’t accepted new links for many years. Over 99% of the links had no recent activity. The play was to scrape the web for old goo.gl links that went to expired domains, register the domain, and then you have a goo.gl URL that you can send wherever you want, indefinitely.
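A rough sketch of how one could scan for these dangling targets — the DNS check is a crude stand-in for a real WHOIS/RDAP registration lookup, and the scan loop itself is only outlined in comments:

```python
import socket
from urllib.parse import urlparse

def target_host(location):
    """Pull the hostname out of a redirect's Location header value."""
    return urlparse(location).hostname or ""

def looks_expired(host):
    """Crude proxy for 'domain is up for grabs': the target no longer
    resolves. A real check would query WHOIS/RDAP for registration status."""
    try:
        socket.gethostbyname(host)
        return False
    except socket.gaierror:
        return True

# Scan loop (sketch): request each short URL with redirects disabled,
# read its Location header, and flag hosts where looks_expired() is True.
```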
Nearly all of the angry blog posts, Tweets, and HN comments missed this and jumped to the conclusion that it was purely a cost cutting measure, but official-looking open redirect URLs are a big deal in the security space.
jszymborski · 19h ago
> Over 99% of the links had no recent activity.
"Recent" is defined within the last year. If the Wayback Machine adopted this logic, it would be useless.
The security concerns were largely addressed by not accepting new links. This was a cost cutting measure, plain and simple. I think we all agree that a goo.gl shortener was a terrible idea to begin with, and my blog post even shows evidence that folks knew this was a bad idea at launch.
da_chicken · 19h ago
Yeah, and I question how much cost they're saving. Just how much storage do you need for a URI redirect? How much are you spending to have a record that isn't being used?
It would make sense if they were pruning links whose TARGETS were no longer responding. But all the unused links are costing essentially nothing. Essentially all the cost was spent already.
ruds · 16h ago
The security concerns were not addressed by not accepting new links. As the post you replied to said,
> The play was to scrape the web for old goo.gl links that went to expired domains, register the domain, and then you have a goo.gl URL that you can send wherever you want, indefinitely.
pinum · 19h ago
Instead of shutting down completely, why not this:
For goo.gl links that were created by google, continue redirecting them as normal.
For others, show a warning page explaining to the user that the link wasn't created (or vouched for) by google. If they press an "agree" button, still don't show a clickable link, but instead show it as plain text to be copied.
gkoberger · 18h ago
Yeah, this is sorta what's already happening. They stopped new links years ago, and are only removing "inactive" links now.
akersten · 19h ago
> Having an official Google domain that anyone can hijack is dangerous,
This makes me wonder if they're retiring sites.google.com any time soon?
jsploit · 7h ago
> If you visit a link right now, it will be kept.
No, only those that were deemed "active" in 2024 will be kept.
amelius · 19h ago
Can't they just show a warning screen containing a button "I accept the risk"?
ants_everywhere · 19h ago
I've never quite understood URL shorteners. They seem like a way of opting into link rot and sending tracking data to third parties who aren't necessary for the client-server connection.
Is there a major benefit I'm missing? I could kind of see them if you have a character limit, want to hide the URL, or have to type a URL manually. But manual typing is rare, and even microblogging services are expanding character limits. Hiding the URL seems slightly sketchy, but you can achieve it without a shorter URL so maybe that's not a real benefit.
Anyway, I'm actually curious about this because people seem to love them.
(and this is aside from all the very valid issues and concerns people have with Google shutting down a widely used service).
pythonaut_16 · 19h ago
QR codes are the main case I've looked at where the shorter link makes a difference in the complexity (and therefore scannability) of the generated code.
But otherwise I think it's about traffic and tracking.
First, you can track visitors to the short link, perhaps in a more centralized place if your org has multiple systems or if you're linking to sites you don't otherwise control. Second, it's used when tacking on tons of query parameters that would otherwise make the URL too long even with expanded character limits. And a short link simply looks better in your post.
Imagine seeing "link.short/abcd1234" vs "linksrus.corporate/blog/2025/08/12/the-title-of-our-post.html?utm=something&some-other-tracking-param=abcdefghijklmnopqrstuvwxyz&even-more-data-in-the-url&some-hashed-value=QXNoIG5hemcgZHVyYmF0dWzDu2ssCiAgIGFzaCBuYXpnIGdpbWJhdHVsLApBc2ggbmF6ZyB0aHJha2F0dWzDu2sKICAgYWdoIGJ1cnp1bS1pc2hpIGtyaW1wYXR1bAoKT25lIHJpbmcgdG8gcnVsZSB0aGVtIGFsbCwKICAgb25lIHJpbmcgdG8gZmluZCB0aGVtLApPbmUgcmluZyB0byBicmluZyB0aGVtIGFsbAogICBhbmQgaW4gdGhlIGRhcmtuZXNzIGJpbmQgdGhlbS4KCg"
I think more important than URL shorteners is URL structure. It should work like a directory, where you can intuitively guess what subdirectories there are.
This NYT link is actually pretty good. year/month/day/category/title is pretty nice. The query parameter ruins it, but that's kind of unavoidable if you want a reproducible link with extra data. Usually you can omit query parameters and the URL should still work.
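Stripping the tracking junk before sharing is mechanical; a sketch (the param-name list is just a few common examples, not exhaustive):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Common tracking-parameter prefixes/names (illustrative, not complete)
TRACKING_PREFIXES = ("utm_", "fbclid", "gclid")

def strip_tracking(url):
    """Drop query parameters that match known tracking prefixes."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES)]
    return urlunparse(parts._replace(query=urlencode(kept)))
```

Usually the page still loads fine with only the meaningful params left.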
kmoser · 15h ago
Not saying you're wrong, just wondering if it was an option to take a picture of the screen using your phone. In my experience, that's how students tend to capture such info when it isn't provided to them electronically.
bigstrat2003 · 17h ago
"Rare" doesn't mean "non-existent". Some people (such as yourself) might indeed need to type links in manually on a regular basis, but for most people it is quite rare.
cxr · 17h ago
It's Conway's law again. Orgs could mint and maintain uptime for short/memorable permalinks hosted on their own domains, but the Web has been captured by a professional class with predilections and goals all their own and which are rarely aligned with the org and the people/purpose it serves.
o11c · 14h ago
Historically there were a lot of CMSes with very small character limits. I remember a war between my high school teachers trying to make their pages work, and the IT department trying to ban all URL shorteners categorically (which admittedly I never understood, given that the DNS interceptor should be able to catch the redirected domain).
xmprt · 17h ago
I never used them for shortening the URL but it helped to give a human understandable name to a link. For example, if I'm linking to a Google Doc, then I can either share the base64(?) link that Google autogenerates, or share short.link/insert-name-here.
benwills · 15h ago
URL shorteners significantly increased in popularity as Twitter did. There was originally a 140 character limit that became quite squeezed when adding most URLs, especially to blog posts where the title is part of the URL.
Later, adding things like analytics and tracking (eg: not just in social media, but also in email campaigns) became another reason to use them, especially for those less tech inclined.
dkiebd · 18h ago
It used to be that characters in urls would count for the character limit in tweets. So people would use shorteners to be able to fit more stuff in a tweet.
mariotacke · 19h ago
One use-case I found somewhat reasonable is to shorten to embed the URL in a smaller QR code, especially ones with lots of parameters.
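The effect on QR density is easy to quantify. A sketch using byte-mode capacities at error-correction level M, transcribed from the standard's capacity tables (my transcription — verify against the spec before relying on it):

```python
# Byte-mode capacity (in bytes) at EC level M for QR versions 1-8
CAPACITY_M = {1: 14, 2: 26, 3: 42, 4: 62, 5: 84, 6: 106, 7: 122, 8: 152}

def min_qr_version(url):
    """Smallest QR version (from the table above) that fits the URL."""
    n = len(url.encode("utf-8"))
    for version in sorted(CAPACITY_M):
        if n <= CAPACITY_M[version]:
            return version
    return None  # needs a version beyond this table

# A ~20-char short link fits version 2 (25x25 modules); a ~150-char
# tracking-laden URL needs version 8 (49x49) — far denser to scan.
```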
skwirl · 19h ago
They are useful for putting URLs in print materials like books. Useful for sharing very long links in IRC and some other text based chat apps (many google maps links would span multiple IRC lines if not shortened, for example). They are good for making more easily scannable QR codes.
jtbayly · 19h ago
The reasons I’ve used them are related to easy tracking of stats. Which (short)links I created to this page are producing traffic?
Of course, this is possible in other ways, but note that I said, “easy”. ;)
unethical_ban · 19h ago
For an internal organization filled with non-technical users (most), link shorteners are very useful. Each shortlink has an owner, it can have delegates with edit capability, full history is shown to the entire organization.
Yes, it is still a little chaotic, and no, it should not be a complete substitute for a well-indexed corporate directory. But it is a really useful shortcut generator for the people that are good at memorizing them, and it costs little to operate internally. Every place that I've worked at that didn't have one, I wish had one.
pythonaut_16 · 19h ago
Go links are a specific subset of short links with a slightly different purpose than services like goo.gl.
And more organizations should adopt them because they are great.
black_puppydog · 19h ago
it makes QR codes smaller. /s
hollowonepl · 20h ago
Well, to answer the post title question directly: far away from Google services for years. It was obvious to me 10+ years ago that the only sustainable business products for them are search, YT, and marketing tracking/performance tools. Nothing else is stable nor secure, and there were examples of it before that shortener existed. G+ exposed how vulnerable Google is regarding add-on services, and now they can't even keep up with the archives (Usenet). URL shortener… please, that by definition looks like vaporware regardless of the vendor, anyway.
taftster · 19h ago
I don't know that URL shorteners are/were vaporware, per se. They delivered what they promised. They just didn't have any business model behind it that could keep it sustained. It was the classic, "we have built up traffic, now let's figure out how to monetize" model.
rickdeckard · 19h ago
I wonder if anyone at Google considered just contributing the DB and the domain to the Internet Archive.
After stripping any past statistical data from each entry, it shouldn't be that much data per URL...
Aurornis · 19h ago
It’s a security issue to have a goo.gl domain that redirects to arbitrary pages. Attackers can find goo.gl links that go to expired domains, register them, and then they have a goo.gl link to use in phishing attacks.
Giving the domain to a 3rd party is not going to happen.
rickdeckard · 15h ago
So it was already a security issue for goo.gl to exist in first place?
The point is whether Google considered any other options than keep operating it or burning everything to the ground. Google could also keep the domain and let users reach an intermediary landing page of the Internet Archive first.
ants_everywhere · 19h ago
Google won't do this IMO because that's all user data. A user submitted the pairing between the key and the value. If they released an entire database of user data they would run afoul of data privacy regulations.
They could possibly provide a GCP service where you make an authenticated request to look up the value of a given goo.gl key. That would mitigate phishing concerns, eliminate the pressure of running a productionized legacy service, and allow them to use quotas etc. to tamp down on abuse. But that also would be covered by the regulatory laws and I don't know what they say about such a thing.
sneak · 19h ago
URL targets are secrets as they are expected to be non-public.
It would be a serious breach of trust for them to publish the database. It likely includes links to non-public YT video URLs, for example.
skwirl · 19h ago
The database is being reverse engineered and published anyways, as per the article.
therealdrag0 · 4h ago
I think Archive is just rehydrating shortened links in webpages that have been archived. I doubt they're discovering previously unknown URLs.
pimlottc · 20h ago
This is a great summary + call to action. The ArchiveTeamWarrior tool from the Internet Archive is a pretty sweet piece of kit, it is fun and satisfying to watch it crunch away at URLs!
progbits · 15h ago
Nice to see it speeding up.
Two weeks ago [1] they were doing 37k a minute with ETA just barely before end of the month. Now it's ~55k a minute, and ETA of just 5 more days.
Very easy to set up as well - at least provided you already have a VM manager set up
encom · 19h ago
Isn't there just some simple Python script buried below piles of vm/docker nonsense I can run?
pimlottc · 17h ago
There are a couple options for other containers (Podman, Orbstack) on their wiki [0]. But you're free to dig into the Dockerfile [1] if you really want to try to figure out how to run it directly.
It seems like Google doesn't want a single machine visiting so many links in such a short time. I wonder what the "bad response" could be?
Google asks for a login, sleeping 20 minutes.
Server returned bad response. Sleeping 8 seconds.
635=302 https://images.google.com.pk/imgres?imgurl=http://www.pakiboutique.com/wp-content/gallery/zainab-chottani-pret-collection-2017/zainab-chottani-luxury-pret-2017-7.jpg&imgrefurl=http://www.pakiboutique.com/zainab-chottani-pret-collection-2017&docid=0gx_lzfAk8YInM&tbnid=qKdQuVyBbOHJfM:&w=948&h=1280&source=sh/x/im
Google asks for a login, sleeping 20 minutes.
Server returned bad response. Sleeping 11 seconds.
636=302 https://images.google.com.pk/imgres?imgurl=http://www.pakiboutique.com/wp-content/gallery/zainab-chottani-pret-collection-2017/zainab-chottani-luxury-pret-2017-7.jpg&imgrefurl=http://www.pakiboutique.com/zainab-chottani-pret-collection-2017&docid=0gx_lzfAk8YInM&tbnid=qKdQuVyBbOHJfM:&w=948&h=1280&source=sh/x/im
Google asks for a login, sleeping 20 minutes.
Server returned bad response. Sleeping 29 seconds.
637=302 https://images.google.com.pk/imgres?imgurl=http://www.pakiboutique.com/wp-content/gallery/zainab-chottani-pret-collection-2017/zainab-chottani-luxury-pret-2017-7.jpg&imgrefurl=http://www.pakiboutique.com/zainab-chottani-pret-collection-2017&docid=0gx_lzfAk8YInM&tbnid=qKdQuVyBbOHJfM:&w=948&h=1280&source=sh/x/im
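The varying "Sleeping N seconds" lines in the log above look like randomized exponential backoff; a generic sketch (the base and cap here are guesses, not the Warrior's actual parameters):

```python
import random

def backoff_sleep(attempt, base=4.0, cap=60.0):
    """Return a randomized delay that doubles with each retry, capped.
    The jitter (uniform over [0, delay]) spreads out retries so many
    workers don't hammer the server in lockstep."""
    delay = min(cap, base * (2 ** attempt))
    return random.uniform(0, delay)

# Usage sketch: time.sleep(backoff_sleep(attempt)) between retries.
```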
moomin · 18h ago
A small thing that drives me up the wall:
* Microsoft has code that references a URL shortener
* The URL shortener is in-house and points to a Microsoft property.
* However, the documentation team don’t keep this stuff up to date, resulting in you getting error messages containing a broken link.
You would have thought it would be more than possible for Microsoft to keep their own house in order.
giancarlostoro · 20h ago
Considering a URL shortener is basically a KV data store, why? It's an insanely low amount of code to maintain a URL shortener.
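Exactly — a frozen shortener is little more than a dict behind a 301. A toy sketch (data and names made up):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# The entire "database": short code -> destination URL (frozen since 2019)
LINKS = {"abc123": "https://example.com/some/long/path"}

def resolve(code):
    """The whole service is a key-value lookup."""
    return LINKS.get(code)

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        target = resolve(self.path.lstrip("/"))
        if target is None:
            self.send_error(404)
            return
        self.send_response(301)  # permanent redirect to the stored URL
        self.send_header("Location", target)
        self.end_headers()

# HTTPServer(("", 8080), Redirector).serve_forever()
```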
kentonv · 19h ago
Yes, particularly when considering they stopped accepting new links six years ago. It's a read-only data set and the total storage backing it is probably trivial by Google standards.
But they actually backtracked and said they'll keep the "active" links working.
Why even spend the effort to remove the "inactive" links? They must feel they represent some sort of liability?
Or would it have been too embarrassing to just cancel the whole turn-down plan?
olejorgenb · 19h ago
I'm not sure if this is the definition of "active" they have used, but:
> Over time, these existing URLs saw less and less traffic as the years went on - in fact more than 99% of them had no activity in the last month.
I'm sure it's correct that the vast majority of links would never be used again, but to gauge which links those are, I'd say you should measure over at LEAST a year.
rsync · 19h ago
I run a public URL shortener and you are correct:
it is relatively simple and inexpensive to operate and there is no reason it can’t continue operating into the indefinite future.
I disagree with the op on one major point… They suggest that the proper way for a service like this to operate is with a public database of links…
However, when I think of all of the different ways one might use a shortener, it seems obvious that many links would necessarily be sensitive and/or private.
Therefore, I consider myself to have a dual mandate: privacy of the links created, and a duty, that I have assumed, of running the service for decades.
jszymborski · 19h ago
I think that's fair. 301works.org[0], for example, requires its members to upload their DB to Internet Archive, but IA keeps it private. They then ask for the technical control of the domain to be transferred 301works.org in the event of the site's closure so they can continue serving the traffic.
I do think, however, that a service can keep their DB public _iff_ users know that's the case before creating links. Oftentimes, folks are just shortening public links, but it is true that people sometimes share "private" links (e.g., Google Doc private links, etc...) which shouldn't be enumerated.
Running any public facing service as a company with a brand and security policies is a non-zero amount of work
Also, don't use link shorteners... and don't build link shorteners as it just enables people to use them.
karel-3d · 18h ago
It's not technically hard, but abuse is a problem with any user-submitted data and always will be
giancarlostoro · 18h ago
Sure, but as others have noted, they don't accept new links since... 6 years ago. So in the grand scheme of things... It's really pointless, and breaks obscure portions of the web where these links might still be.
pimlottc · 19h ago
Because maintaining it will not get anyone a promotion
morkalork · 19h ago
I remember there was a blog by someone who implemented their own URL shortener, and it turned into a huge amount of work fighting spam, scams, and malicious actors abusing the system, to the point that their domain was being threatened with blacklisting. But then again, if anyone has the transferable skills to handle that, it would be Google, since they're already battling the same bad people on all their other services.
kentonv · 19h ago
But those problems all went away six years ago when Google stopped accepting new links.
Maxion · 19h ago
URLs rot, domains expire. So I don't think the problem went away completely.
jsploit · 19h ago
It's still being abused by people registering expired domains.
morkalork · 19h ago
Exactly, there is no guarantee that what a shortened url pointed to 10 years ago is the same as today, or even the same owner or administrator.
kentonv · 17h ago
OK but the same vulnerability exists if short URLs weren't used -- in that case the link source would link directly to the now-hijacked domain. So why does Google have to care about this?
Is it because they're worried that the domain name goo.gl in the link implies a Google endorsement? Seems like they should have thought of that before launching the service in the first place?
Still, the frequency of actual abuse must be low and going down over time (due to the data set being read-only since 2019 and actual traffic to these links surely decreasing as time goes on)...
webstrand · 19h ago
So if the issue is `goo.gl` is useful for phishing, why not shut it down and leave a readonly copy on some other domain? Sure, it still breaks every link, but people who are in the know can use the alternate domain by manually translating it.
perdomon · 18h ago
What is this container doing exactly? Sequentially visiting every possible goo.gl link, seeing if it points somewhere, and saving that full URL if so?
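Roughly, yes (assuming brute-force enumeration rather than working only from scraped links — an assumption on my part). The candidate space for alphanumeric codes looks like:

```python
import itertools
import string

# goo.gl codes are case-sensitive alphanumerics: 62 possible characters
ALPHABET = string.ascii_letters + string.digits

def all_codes(length):
    """Yield every candidate short code of the given length."""
    for combo in itertools.product(ALPHABET, repeat=length):
        yield "".join(combo)

# 62**6 is roughly 5.7e10 six-character codes, which is why the work
# is sharded across a distributed swarm of Warrior containers.
```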
8bitsrule · 8h ago
"their super stable and secure URL shortener is getting nuked from orbit in just over a year."
IIRC, that's about how long Google+ lasted, after they locked my access to the blog I'd been writing for four years ... until I gave them my real name.
Moral of story: With Google, it's always something.
nfriedly · 17h ago
I let the ArchiveTeam Warrior docker container run in the background on my file server, configured to contribute to whichever project ArchiveTeam picks. When I first set it up, that was archiving Telegram channels, but I took a peek yesterday and it was backing up goo.gl links. Apparently I've uploaded 94 GB worth of links already.
kazinator · 15h ago
Google applications are still handing out goo.gl links.
If you go to Google Maps right now, drop a pin, and right click to Share This Location, you get a goo.gl link.
It always seemed strange to me that people would use a link shortener (that they didn't own!) for anything other than Twitter, or the small use-case of manually typing a URL into a different computer that you couldn't sync in some other way. In what way was this not the obvious outcome?
fudged71 · 15h ago
I just sent my Manus agent to run this across 20 instances. Interesting to see how many instances failed installation.
Anyone know why these would return “ Google asks for a login, sleeping 20 minutes.”?
amelius · 19h ago
Step 1: make everyone use your url shortener
Step 2: shut it down
Step 3: make every link redirect to an advertisement
Step 4: profit
No comments yet
tedchs · 17h ago
My question is, why can't Google just provide a dump to archive.org themselves? Having volunteer middlemen doing the work seems like an artificial crisis.
brokensegue · 19h ago
people definitely should donate resources to archive team but as someone who has been participating in this project it's my understanding that all of the URLs at risk have already been backed up
swills · 19h ago
Might I suggest that instead of linking to the article at theatlantic.com you instead link to the archive.org archive of it? I couldn't visit it due to Cloudflare thinking IPv6 is the devil and then still couldn't read it due to the paywall. Here's the archive.org link that sorta works for me, until their JavaScript removes the majority of the page content shortly after loading:
Google is not some opaque corporate entity (I mean, yes it IS, but...), it is made up of individuals, many of whom read this site.
Google should offer to send (the public content from) their DB directly to The Internet Archive. It results in LESS overhead for Google than this attempt to scrape it, and results in better information.
ndr · 20h ago
They could offer the domain too to be honest.
eamag · 19h ago
The domain is still used in google maps "share" links, for example
https://maps.app.goo.gl/<id>
cxr · 17h ago
maps.app.goo.gl and
maps.google.com are exactly the same number of characters.
Those links smack of the same mindset behind "Refactor to rewrite procedure A in terms of procedure B. 7 lines deleted; 17 lines added."
GuinansEyebrows · 19h ago
subdomain delegation isn't too hard to manage :)
procaryote · 19h ago
security issue
GuinansEyebrows · 19h ago
sure, everything's a security issue when you don't pay attention to it :)
mcherm · 19h ago
They certainly could, but all I'm asking for here are things that SAVE money for Google. (They don't have to pay to support the scraping.)
anoncow · 19h ago
I don't think the reason is phishing. The reason is they will save money and who cares about people who are using it.
rkagerer · 15h ago
Not sure why anyone would trust Google to this given their track record...
hk1337 · 19h ago
It probably should never have been for public use, make it for official Google links only.
jmyeet · 20h ago
This is unbelievable. We're all used to Google abandonware by now, but the resources required for storage of shortened URLs and any dev maintenance must be so minuscule as to be a rounding error to a rounding error on any budget anywhere in the company.
What can possibly be the motivation for this? I see no rationale for it in the linked post.
Google already has severe trust issues with anything they launch. They hurt themselves because people expect whatever is being launched to be abandoned so why invest in it? Reputation matters.
rickdeckard · 19h ago
If I remember correctly, one of the purposes for goo.gl was for Google to get metrics on how such direct-links spread "out of band" and how often they were opened (as they would direct people to pages without Google acting as intermediary).
I guess Chrome metrics and URL-suffixes now answer this better (and so they probably failed to justify the internal budget to keep it)...
tracker1 · 17h ago
They simply don't care about anything that isn't making billions a year. Domains was profitable and IMO one of the best domain experiences online, and they still sold it off.
Google could easily afford to keep existing short links working indefinitely and simply not allow new entries... it's a simple KV lookup and redirect. Even with billions of links, hardly any overhead.
nvch · 19h ago
Maybe it's for the better. They are very clear and consistent in their intentions, leaving no room for doubt.
alexvitkov · 19h ago
Conspiracy: They're shutting it down so competitors' AI and search crawlers can't visit the links, but theirs can, since archiving efforts aside only they have the DB.
Given that it doesn't make money, it has zero value alive and at least epsilon value dead. Plus they don't need the links to work to collect metrics.
p3rls · 15h ago
At the same time, Google also promotes backlinks to the point where domains are bought up for $10k+ sometimes just to give them a redirect to a gambling site.
Mousewheel through last year's auctions on https://x.com/namemaxicom and you can see how broken the entire system is from top to bottom.
opentokix · 17h ago
URL shorteners should not have existed in the first place.
It's mostly for print usage anyway.. and with all the TLDs out there now, you can definitely still get some short options. I literally have one strictly for (reverse)dns for my server(s). Another for my mail server, separate from the domains the server hosts.
Also, hosting your own means you don't have to worry about being associated with externalized spam links. Of course, if you are a spammer, then it works even better for users that will block you.
Having an official Google domain that anyone can hijack is dangerous, given that many people's main internet identity is GMail (aka their Google account). I know anyone can create an offshoot (goooogle.org, etc), but Google was using goo.gl too.
It was easy to redirect a goo.gl to a Google login page (which is on a real Google domain), and trick people into authorizing access to their account.
I consider myself savvy, and I got a pretty convincing one recently. The email looked legit, and the link was a goo.gl link that ultimately landed me on a legitimate Google login page. It didn't trick me, but it did take me a few minutes to figure out how it wasn't legit.
NOTE: This article is kinda misleading. They already stopped letting people add new links in 2019. And now, they're only removing "inactive" links, AKA links that had no activity since 2024. If you visit a link right now, it will be kept. Here's more info: https://blog.google/technology/developers/googl-link-shorten...
Instead, they're just disappearing _all_* goo.gl short links. The overwhelming majority of which are benign links made by users who were promised a super stable URL link shortening service backed by the Google brand.
*edit: Not all, but nearly.
> All other [active] goo.gl links will be preserved and will continue to function as normal. To check if your link will be retained, visit the link today. If your link redirects you without a message, it will continue to work.
https://developers.googleblog.com/en/google-url-shortener-li...
They already did this in 2019.
The service has been deprecated for a very long time.
> Instead, they're just disappearing _all_ goo.gl short links
This is false. They are sunsetting inactive links.
It is true that it is not _all_ links, apologies. "Inactive" here is defined as "not visited in 2024" which is a crazy small envelope. I wouldn't be surprised if nearly all links were deleted.
...which is even harder to justify than removing them all.
The goo.gl link shortener hasn’t accepted new links for many years. Over 99% of the links had no recent activity. The play was to scrape the web for old goo.gl links that went to expired domains, register the domain, and then you have a goo.gl URL that you can send wherever you want, indefinitely.
Nearly all of the angry blog posts, Tweets, and HN comments missed this and jumped to the conclusion that it was purely a cost cutting measure, but link official-looking open redirect URLs are a big deal in the security space.
"Recent" is defined within the last year. If the Wayback Machine adopted this logic, it would be useless.
The security concerns were largely addressed by not accepting new links. This was a cost cutting measure, plain and simple. I think we all agree that a goo.gl shortener was a terrible idea to begin with, and my blog post even shows evidence that folks knew this was a bad idea at launch.
It would make sense if they were pruning links whose TARGETS were no longer responding. But all the unused links are costing essentially nothing. Essentially all the cost was spent already.
> The play was to scrape the web for old goo.gl links that went to expired domains, register the domain, and then you have a goo.gl URL that you can send wherever you want, indefinitely.
For goo.gl links that were created by google, continue redirecting them as normal. For others, show a warning page explaining to the user that the link wasn't created (or vouched for) by google. If they press an "agree" button, still don't show a clickable link, but instead show it as plain text to be copied.
This makes me wonder if they're retiring sites.google.com any time soon?
No, only those that were deemed "active" in 2024 will be kept.
Is there a major benefit I'm missing? I could kind of see them if you have a character limit, want to hide the URL, or have to type a URL manually. But manual typing is rare, and even microblogging services are expanding character limits. Hiding the URL seems slightly sketchy, but you can achieve it without a shorter URL so maybe that's not a real benefit.
Anyway, I'm actually curious about this because people seem to love them.
(and this is aside from all the very valid issues and concerns people have with Google shutting down a widely used service).
But otherwise I think it's about traffic and tracking.
First in that you can track visitors to the short link, perhaps in a more centralized place if your org has multiple systems or if you're linking to sites you don't otherwise control. And secondly it's used when tacking on tons of query parameters that would otherwise make the URL too long even with expanded character limits. And, aesthetically, a short link just looks better in your post.
Imagine seeing "link.short/abcd1234" vs "linksrus.corporate/blog/2025/08/12/the-title-of-our-post.html?utm=something&some-other-tracking-param=abcdefghijklmnopqrstuvwxyz&even-more-data-in-the-url&some-hashed-value=QXNoIG5hemcgZHVyYmF0dWzDu2ssCiAgIGFzaCBuYXpnIGdpbWJhdHVsLApBc2ggbmF6ZyB0aHJha2F0dWzDu2sKICAgYWdoIGJ1cnp1bS1pc2hpIGtyaW1wYXR1bAoKT25lIHJpbmcgdG8gcnVsZSB0aGVtIGFsbCwKICAgb25lIHJpbmcgdG8gZmluZCB0aGVtLApPbmUgcmluZyB0byBicmluZyB0aGVtIGFsbAogICBhbmQgaW4gdGhlIGRhcmtuZXNzIGJpbmQgdGhlbS4KCg"
It's absolutely not rare. I can't tell you how many times I sat in the back of a classroom, squinting at a whiteboard, trying to hand-type a URL that looked like this: https://www.nytimes.com/2025/08/12/technology/silicon-chips-...
This NYT link is actually pretty good. year/month/day/category/title is pretty nice. The query parameter ruins it, but that's kind of unavoidable if you want a reproducible link with extra data. Usually you can omit query parameters and the URL should still work.
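Stripping the query string is trivial with the standard library; a sketch, assuming the path alone still identifies the article (which it usually does for this URL style):

```python
# Drop the query string and fragment from a URL, keeping only the
# scheme, host, and path.
from urllib.parse import urlsplit, urlunsplit

def strip_query(url):
    parts = urlsplit(url)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))

print(strip_query("https://example.com/2025/08/12/post.html?utm=x&ref=y"))
# → https://example.com/2025/08/12/post.html
```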
Later, adding things like analytics and tracking (eg: not just in social media, but also in email campaigns) became another reason to use them, especially for those less tech inclined.
Of course, this is possible in other ways, but note that I said, “easy”. ;)
go/holidays = holiday schedule
go/itrequest
go/ithelpdesk
go/corpcalendar
go/fsckyourself = self-service filesystem healthchecks
---
yes, it is still a little chaotic, and no, it should not be a complete substitute for a well-indexed corporate directory. But it is a really useful shortcut generator for the people that are good at memorizing them, and it costs little to operate internally. Every place I've worked that didn't have one, I wished it did.
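The "costs little to operate" claim holds up: an internal go/ service is little more than a lookup table in front of a redirect. A minimal sketch using only Python's standard library; the link table and intranet URLs are made up for illustration, and a real deployment would back this with a small editable database:

```python
# Minimal internal go/ link service: a dict lookup plus a 302 redirect.
from http.server import BaseHTTPRequestHandler, HTTPServer

GO_LINKS = {
    "/holidays": "https://intranet.example.com/hr/holiday-schedule",
    "/itrequest": "https://intranet.example.com/it/new-request",
    "/corpcalendar": "https://intranet.example.com/calendar",
}

def resolve(path):
    """Return the redirect target for a go/ path, or None if unknown."""
    return GO_LINKS.get(path)

class GoLinkHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = resolve(self.path)
        if target is None:
            self.send_error(404, "unknown go link")
            return
        self.send_response(302)              # temporary redirect
        self.send_header("Location", target)
        self.end_headers()

# To serve on the intranet:
#   HTTPServer(("0.0.0.0", 80), GoLinkHandler).serve_forever()
```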
And more organizations should adopt them because they are great.
After stripping any past statistical data from each entry, it shouldn't be that much data per URL...
Giving the domain to a 3rd party is not going to happen.
The point is whether Google considered any other options than keep operating it or burning everything to the ground. Google could also keep the domain and let users reach an intermediary landing page of the Internet Archive first.
They could possibly provide a GCP service where you make an authenticated request to look up the value of a given goo.gl key. That would mitigate phishing concerns, eliminate the pressure of running a productionized legacy service, and allow them to use quotas etc. to tamp down on abuse. But that would also be covered by the regulatory laws, and I don't know what they say about such a thing.
It would be a serious breach of trust for them to publish the database. It likely includes links to non-public YT video URLs, for example.
Two weeks ago [1] they were doing 37k a minute with ETA just barely before end of the month. Now it's ~55k a minute, and ETA of just 5 more days.
[1] https://news.ycombinator.com/item?id=44688478
0: https://wiki.archiveteam.org/index.php/ArchiveTeam_Warrior#A...
1: https://github.com/ArchiveTeam/warrior-dockerfile/blob/maste...
Google shifts goo.gl policy: Inactive links deactivated, active links preserved - https://news.ycombinator.com/item?id=44759918 - Aug 2025 (189 comments)
Google's shortened goo.gl links will stop working next month - https://news.ycombinator.com/item?id=44683481 - July 2025 (222 comments)
Google URL Shortener links will no longer be available - https://news.ycombinator.com/item?id=40998549 - July 2024 (49 comments)
Ask HN: Google is sunsetting goo.gl on 3/30. What will be your URL shortener? - https://news.ycombinator.com/item?id=19385433 - March 2019 (14 comments)
Tell HN: Goo.gl (Google link Shortener) is shutting down - https://news.ycombinator.com/item?id=16902752 - April 2018 (45 comments)
Google is shutting down its goo.gl URL shortening service - https://news.ycombinator.com/item?id=16722817 - March 2018 (56 comments)
Transitioning Google URL Shortener to Firebase Dynamic Links - https://news.ycombinator.com/item?id=16719272 - March 2018 (53 comments)
* Microsoft has code that references a URL shortener
* The URL shortener is in-house and points to a Microsoft property.
* However, the documentation team don’t keep this stuff up to date, resulting in you getting error messages containing a broken link.
You would have thought it would be more than possible for Microsoft to keep their own house in order.
But they actually backtracked and said they'll keep the "active" links working.
Why even spend the effort to remove the "inactive" links? They must feel they represent some sort of liability?
Or would it have been too embarrassing to just cancel the whole turn-down plan?
> Over time, these existing URLs saw less and less traffic as the years went on - in fact more than 99% of them had no activity in the last month.
I'm sure it's correct that the vast majority of links would never be used again, but to gauge which links this is I'd say you should measure at LEAST a year.
it is relatively simple and inexpensive to operate and there is no reason it can’t continue operating into the indefinite future.
I disagree with the op on one major point… They suggest that the proper way for a service like this to operate is with a public database of links…
However, when I think of all of the different ways one might use a shortener, it seems obvious that many links would necessarily be sensitive and/or private.
Therefore, I consider myself to have a dual mandate: privacy of the links created, and a duty, that I have assumed, of running the service for decades.
I do think, however, that a service can keep its DB public _iff_ users know that's the case before creating links. Oftentimes, folks are just shortening public links, but it is true that people sometimes share "private" links (e.g., Google Doc private links, etc...) which shouldn't be enumerated.
[0] https://archive.org/details/301works-faq
Also, don't use link shorteners... and don't build link shorteners as it just enables people to use them.
Is it because they're worried that the domain name goo.gl in the link implies a Google endorsement? Seems like they should have thought of that before launching the service in the first place?
Still, the frequency of actual abuse must be low and going down over time (due to the data set being read-only since 2019 and actual traffic to these links surely decreasing as time goes on)...
IIRC, that's about how long Google+ lasted, after they locked my access to the blog I'd been writing for four years ... until I gave them my real name.
*Moral of story*: With Google, it's always something.
If you go to Google Maps right now, drop a pin, and right-click to Share This Location you get a goo.gl link:
https://maps.app.goo.gl/mPMvmv3dJDsLGvtF9
Anyone know why these would return “ Google asks for a login, sleeping 20 minutes.”?
Step 2: shut it down
Step 3: make every link redirect to an advertisement
Step 4: profit
https://web.archive.org/web/20250527052607/https://www.theat...
Google is not some opaque corporate entity (I mean, yes it IS, but...), it is made up of individuals, many of whom read this site.
Google should offer to send (the public content from) their DB directly to The Internet Archive. It results in LESS overhead for Google than this attempt to scrape it, and results in better information.
https://maps.app.goo.gl/<id>
Those links smack of the same mindset behind "Refactor to rewrite procedure A in terms of procedure B. 7 lines deleted; 17 lines added."
What can possibly be the motivation for this? I see no rationale for it in the linked post.
Google already has severe trust issues with anything they launch. They hurt themselves because people expect whatever is being launched to be abandoned so why invest in it? Reputation matters.
I guess Chrome metrics and URL-suffixes now answer this better (and so they probably failed to justify the internal budget to keep it)...
Google could easily afford to keep existing short links working indefinitely and simply not allow new entries... it's a simple KV lookup and redirect. Even with billions of links, hardly any overhead.
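A back-of-envelope estimate supports the "hardly any overhead" claim. The link count and average URL length below are assumptions for illustration, not published figures:

```python
# Rough storage estimate for keeping every goo.gl link alive as a
# static key-value table (assumed figures, not Google's actual numbers).
NUM_LINKS = 10_000_000_000      # assume ~10 billion short links
KEY_BYTES = 8                   # short code like "mPMvmv3d"
AVG_URL_BYTES = 120             # assumed average target URL length

total_bytes = NUM_LINKS * (KEY_BYTES + AVG_URL_BYTES)
print(f"{total_bytes / 1e12:.1f} TB")   # → 1.3 TB, before any compression
```

On the order of a terabyte of read-only data: a serving burden that's tiny by Google's standards.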
Given that it doesn't make money, it has zero value alive and at least epsilon value dead. Plus they don't need the links to work to collect metrics.
Mousewheel through last year's auctions on https://x.com/namemaxicom and you can see how broken the entire system is from top to bottom.
It's mostly for print usage anyway.. and with all the TLDs out there now, you can definitely still get some short options. I literally have one strictly for (reverse)dns for my server(s). Another for my mail server, separate from the domains the server hosts.
Also, hosting your own means you don't have to worry about being associated with externalized spam links. Of course, if you are a spammer, then it works even better for users that will block you.