Anubis saved our websites from a DDoS attack

233 points by DoctorOW | 147 comments | 5/1/2025, 10:34:15 PM | fabulous.systems

Comments (147)

mrweasel · 1h ago
Sadly it's hard to tell whether this is an actual DDoS attack or scrapers descending on the site. It all looks very similar.

The search engines always seemed happy to announce that they are in fact GoogleBot/BingBot/Yahoo/whatever, and frequently provided you with their expected IP ranges. The modern companies, mostly AI companies, seem to be more interested in flying under the radar, and have less respect for the internet infrastructure as a whole. So we're now at a point where I can't tell if it's an ill-willed DDoS attack or just shitty AI startup number 7 reloading training data.

jeroenhd · 6m ago
> The modern companies, mostly AI companies, seem to be more interested in flying under the radar, and have less respect for the internet infrastructure as a whole

I think that makes a lot of sense. Google's goal is (or perhaps used to be) providing a network of links. The more they scrape you, the more visitors you may end up receiving, and the better your website performs (monetarily, or just in terms of providing information to the world).

With AI companies, the goal is to consume and replace. In their best case scenario, your website will never receive a visitor again. You won't get anything in return for providing content to AI companies. That means there's no reason for website administrators to permit the good ones, especially for people who use subscriptions or ads to support their website operating costs.

piokoch · 32m ago
Yes, search engines were not hiding, as the website owners' interest was involved here as well: without those search bots their sites would not be indexed and searchable on the Internet. So there was a kind of win-win situation, in most typical cases at least; publishers, for instance, complained about deep links and the like because their ad revenue was hurt.

AI scraping bots provide zero value for site owners.

CaptainFever · 1h ago
> To me, Anubis is not only a blocker for AI scrapers. Anubis is a DDoS protection.

Anubis is DDoS protection, just with updated marketing. These tools have existed forever, such as CloudFlare Challenges, or https://github.com/RuiSiang/PoW-Shield. Or HashCash.

I keep saying that Anubis really has nothing much to do with AI (e.g. some people might mistakenly think that it magically "blocks AI scrapers"; it only slows down abusive-rate visitors). It really only deals with DoS and DDoS.
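
For anyone who hasn't seen it, the whole hashcash-style mechanism fits in a few lines. Roughly (a Go-flavored sketch of the idea, not Anubis's actual code):

    import (
        "crypto/sha256"
        "encoding/hex"
        "strings"
    )

    // The server issues a random challenge plus a difficulty; the client
    // must find a nonce such that sha256(challenge + nonce) starts with
    // `difficulty` zero hex digits. Verifying costs one hash; finding the
    // nonce costs ~16^difficulty attempts on average.
    func verify(challenge, nonce string, difficulty int) bool {
        sum := sha256.Sum256([]byte(challenge + nonce))
        return strings.HasPrefix(hex.EncodeToString(sum[:]),
            strings.Repeat("0", difficulty))
    }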

I don't understand why people are using Anubis instead of all the other tools that already exist. Is it just marketing? Saying the right thing at the right time?

Imustaskforhelp · 53m ago
I agree with you that it is in fact DDoS protection, but still: the fact that it is open source and created by a really cool dev (she is awesome) means I don't really mind it gaining popularity. And they created it out of their own necessity, which is also really nice.

Anubis is getting real love out there and I think I am all for it. I personally host a lot of my stuff on Cloudflare due to it being free with Cloudflare Workers, but if I ever have a VPS, I am probably going to use Anubis as well.

alias_neo · 31m ago
I'm not sure why there are so many negative comments here. This looks nice, appears to work, is open source and MIT licensed. Why _wouldn't_ I use this?
fmajid · 15m ago
It also doesn't cede more market power to Cloudflare, which tends to block non-mainstream browsers, users with adblockers, Tor users, or those with cookies and JavaScript disabled.
amarcheschi · 45s ago
I don't know what I have done, but I'd say I get blocked by Cloudflare a few visits per week. It's not a huge deal, but it's very annoying.
JodieBenitez · 51m ago
> I don't understand why people are using Anubis instead of all the other tools that already exist. Is it just marketing? Saying the right thing at the right time?

Care to share existing solutions that can be self-hosted? (Genuine question: I like how Anubis works, I just want something with a more neutral look and feel.)

consp · 1h ago
Knowing something exists is half the challenge. Never used it, but maybe it's the ease of use/setup, or the license?
immibis · 1h ago
marketing plus a product that Just Does The Thing, it seems like. No bullshit.

btw it only works on AI scrapers because they're DDoSes.

chrisnight · 8h ago
> Solving the challenge–which is valid for one week once passed–

One thing that I've noticed recently with the Arch Wiki adding Anubis, is that this one week period doesn't magically fix user annoyances with Anubis. I use Temporary Containers for every tab, which means that I constantly get Anubis regenerating tokens, since the cookie gets deleted as soon as the tab is closed.

Perhaps this is my own problem, but given the state of tracking on the internet, I do not feel it is an extremely out-of-the-ordinary circumstance to avoid saving cookies.

jsheard · 8h ago
It could be worse; the main alternative is something like Cloudflare's death-by-a-thousand-CAPTCHAs when your browser settings or IP address put you on the wrong side of their bot detection heuristics. Anubis at least doesn't require any interaction to pass.

Unfortunately nobody has a good answer for how to deal with abusive users without catching well-behaved but deliberately anonymous users in the crossfire, so it's just about finding the least bad solution for them.

lousken · 8h ago
I hated everyone who enabled the Cloudflare validation thing on their website, because I was blocked for months (I got stuck on that captcha, which kept refusing my Firefox). Eventually they fixed it, but it was really annoying.
goku12 · 4h ago
The CF verification page still appears far too often in some geographic regions. It's such an irritant that I just close the tab and leave when I see it. It's so bad that seeing the Anubis page instead is actually a big relief! I consider the CF verification and its enablers a shameless attack on the open web - a solution nearly as bad as the problem it tries to solve.
_bin_ · 1h ago
Forget esoteric areas: I'm an average American guy who gets them while running from a residential IP or cell IP. It even happens semi-frequently on my iPhone, which is insane. I guess I must have "bot-like" behavior in my browsing, even from a cell.
throwaway562if1 · 2h ago
I am still unable to pass CF validation on my desktop (sent to infinite captcha loop hell). Nowadays I just don't bother with any website that uses it.
gruez · 5h ago
>It could be worse; the main alternative is something like Cloudflare's death-by-a-thousand-CAPTCHAs when your browser settings or IP address put you on the wrong side of their bot detection heuristics.

Cloudflare's checkbox challenge is probably the best of the challenge systems. Other security systems are far worse, requiring either something to be solved or a more annoying action (e.g. holding a button for 5 seconds).

Dylan16807 · 2h ago
Checking a box is fine when it lets you through.

The problem is when cloudflare doesn't let you through.

notpushkin · 2h ago
Yeah. A “drag this puzzle piece” captcha style is also relatively easy, but things like reCaptcha or hCaptcha are just infuriating.

For pure PoW (no fingerprinting), mCaptcha is a nice drop-in replacement you can self-host: https://mcaptcha.org/

trod1234 · 8h ago
> Unfortunately nobody has a good answer for how to deal with abusive users without catching well behaved but deliberately anonymous users in the crossfire...

Uhh, that's not right. There is a good answer, but no turnkey solution yet.

The answer is to make each request cost the requester a certain amount of something, with increased load from that person bringing increased cost to them.
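
Sketching it (purely illustrative; as I said, there's no turnkey solution, so the names here are made up):

    // Hypothetical: escalate the proof-of-work difficulty with the load a
    // client generates, so heavy users pay a superlinearly growing price.
    func difficultyFor(requestsLastMinute int) int {
        d := 4 // baseline: ~16^4 hash attempts per token
        for n := requestsLastMinute; n > 10; n /= 10 {
            d++ // every 10x of load demands one more zero digit, i.e. 16x more work
        }
        return d
    }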

halosghost · 8h ago
Note that this is actually one of the things Anubis does. That's what the proof-of-work system is; it just operates across the full load rather than being targeted at a specific user's load. But, to the GP's point, that's the best option while allowing anonymous users.

All the best,

-HG


tpxl · 1h ago
This makes discussions such as this have a negative ROI for an average commenter. Spamming scam and grift links still has a positive ROI, albeit a slightly smaller one.

I use a certain online forum which sometimes makes users wait 60 or 900 seconds before they can post. It has prevented me from making contributions multiple times.

immibis · 1h ago
I'm using one with a 5 in 14400 seconds timer right now. Ditto.
Spivak · 8h ago
I know that you mean a system that transfers money, but you are also describing Anubis, because PoW literally makes accessing the site cost more and scales that cost proportionally to the load.
trod1234 · 8h ago
> I know that you mean a system that transfers money ...

No, cost is used in the fullest abstract meaning of the word here.

Time cost, effort cost, monetary cost, work cost: so long as there is a functional limitation that prevents resource exhaustion, that is the point.

lelandbatey · 7h ago
If cost can be anything, does Anubis implement such a system then, by using proof-of-work as the cost function?
TiredOfLife · 1h ago
It's not a problem. You have configured your system to show up as a new visitor every time you visit a website, and you are getting the expected behaviour.
jillyboel · 7h ago
> One thing that I've noticed recently with the Arch Wiki adding Anubis

Is that why it now shows that annoying slow to load prompt before giving me the content I searched for?

esseph · 7h ago
Would you like to propose an alternative solution that meets their needs and on their budget?
goku12 · 3h ago
Anubis has a 'slow' and a 'fast' mode [1], with fast mode selected by default. It used to be so fast that I rarely got time to read anything on the page. I don't know why it's slower now - it could be that they're using the slower algorithm, or the algorithm itself may have become slower. Either way, it shouldn't be too hard to modify it with a different algorithm or make the required work a parameter. This of course has the disadvantage of making it easier for the scrapers to get through.

[1] https://anubis.techaro.lol/docs/admin/algorithm-selection

jeroenhd · 11s ago
The DIFFICULTY environment variable already allows for configuring how much work the challenge requires (each extra unit of difficulty requires one more leading zero in the hash, multiplying the expected work by 16).

The fast/slow selection still applies, but if you put up the difficulty, even the fast version will take some time.
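
Back-of-envelope, assuming the sha256 leading-zero scheme (a sketch, not documented behavior):

    import "math"

    // Expected hash attempts before a nonce passes, per difficulty value:
    // d=4 is ~65k attempts, d=5 is ~1M, d=6 is ~16.8M; each step is 16x.
    func expectedAttempts(d int) float64 {
        return math.Pow(16, float64(d))
    }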

jillyboel · 7h ago
A static cache for anyone not logged in, and only doing this check when you are authenticated, which gives access to editing pages?

edit: Because HN is throwing "you're posting too fast" errors again:

> That falls short of the "meets their needs" test. Authenticated users already have a check (i.e., the auth process). Anubis is to stop/limit bots from reading content.

Arch Wiki is a high value target for scraping so they'll just solve the anubis challenge once a week. It's not going to stop them.

lelanthran · 41m ago
> Arch Wiki is a high value target for scraping so they'll just solve the anubis challenge once a week.

ISTR that Anubis allows the site-owner to control the expiry on the check; if you're still getting hit by bots, turn the check to 5s with a lower "work" effort so that every request will take (say) 2s, and only last for 5s.

(Still might not help though, because that optimises for bots at the expense of humans - a human will only do maybe one actual request every 30 - 200 seconds, while a bot could do a lot in 5s).

pynappo · 4h ago
> Arch Wiki is a high value target for scraping so they'll just solve the anubis challenge once a week. It's not going to stop them.

The goal of Anubis isn't to stop them from scraping entirely, but rather to slow down aggressive scraping (e.g. sites with lots of pages being scraped every 6 hours[1]) so that the scraping doesn't impact the backend nearly as much

[1] https://pod.geraspora.de/posts/17342163, which was linked as an example in the original blog post describing the motivation for anubis[2]

[2]: https://xeiaso.net/blog/2025/anubis/

glenngillen · 5h ago
That falls short of the "meets their needs" test. Authenticated users already have a check (i.e., the auth process). Anubis is to stop/limit bots from reading content.
bscphil · 6h ago
It's even worse if you block cookies outright. Every time I hit a new Anubis site I scream in my head because it just spins endlessly and stupidly until you enable cookies, without even a warning. Absolutely terrible user experience; I wouldn't put any version of this in front of a corporate / professional site.
Dylan16807 · 4h ago
Blocking cookies completely is just asking for a worse method of tracking sessions. It's fine for a site to be aware of visits. As someone who argues that sites should work without javascript, blocking all cookies strikes me as doing things wrong.
bscphil · 1h ago
A huge proportion of sites (a) use cookies, (b) don't need cookies. You can easily use extensions to enable cookies for the sites that need them, while leaving others disabled. Obviously some sites are going to do shitty things to track you, but they'd probably be doing that anyway.

The issue I'm talking about is specifically how frustrating it is to hit yet another site that has switched to Anubis recently and having to enable cookies for it.

goku12 · 3h ago
I will take Anubis any day over its alternative - the cloudflare verification page. I just close the tab as soon as I see it.
jezek2 · 6h ago
If you want to browse the web without cookies (and without JS, in a usable manner) you may try FixProxy[1]. It has direct support for Anubis in the development version.

[1]: https://www.fixbrowser.org/blog/fixproxy

Spivak · 6h ago
Browsers that have cookies and/or JS disabled have been getting broken experiences for well over a decade, it's hard to take this criticism seriously when professional sites are the most likely to break in this situation.
forty · 50m ago
Anubis is nice, but could we have a PoW system integrated into the protocols (HTTP or TLS, I'm not sure) so we don't have to require JS?
tpool · 5h ago
It's so bad we're going to the old gods for help now. :)
Hamuko · 9m ago
I’d sic Yogg-Saron on these scrapers if I could.
gitroom · 3h ago
Kinda love how deep this gets into the whole social contract side of open source. Honestly, it's been a pain figuring out what feels right when folks mix legal rules and personal asks.
lytedev · 2h ago
Yeah I had no idea that some folks would get so passionate about making changes to a piece of FOSS based on a request on a certain footer-esque documentation page.

I think it's a great discussion though that gets to the heart of open source and software freedom, and how that can seem orthogonal to business needs depending on how you squint.

vachina · 44m ago
It's not Anubis that saved your website; literally any sort of CAPTCHA, or some dumb modal with a button to click through to the real contents, would've worked.

These crawlers are designed to work on 99% of hosts; if you tweak your site ever so slightly out of spec, these bots won't know what to do.

anonfordays · 1h ago
Looks similar to haproxy-protection: https://gitgud.io/fatchan/haproxy-protection/
Tiberium · 7h ago
From looking at some of the rules like https://github.com/TecharoHQ/anubis/blob/main/data/bots/head... it seems that Anubis explicitly punishes bots that are "honest" about their user agent - I might be missing something, but isn't this just pressuring anyone who does anything bot-related to just lie about their user agent?

A flat-out user-agent blacklist seems really weird; it's going to reward the companies that are more unethical in their scraping practices over the ones who report their user agent truthfully. From the repo it also seems like all the AI crawlers are set to DENY, which, again, would reward AI companies that don't disclose their identity in the user agent.

userbinator · 7h ago
The User-Agent header is basically useless at this point. It's trivial to set it to whatever you want, and all it does is help the browser incumbents.
Tiberium · 7h ago
You're right, that's why I'm questioning the reason Anubis implemented it this way. Lots of big AI companies are at least honest about their crawlers and have proper user agents (which Anubis outright blocks). So "unethical" companies who change the user-agent to something normal will have an advantage with the way Anubis is currently set up by default.

I'm aware that end users can modify the rules, but in reality most will just use the defaults.

xena · 6h ago
Shitty heuristics buy time to gather data and make better heuristics.
MillironX · 4h ago
Despite broadcasting their user agents properly, the AI companies ignore robots.txt and still waste my server resources. So yeah, the dishonest botnets will have an advantage, but I don't give swindlers a pass just because they rob me to my face. I'm okay with defaults that punish all bots.
goku12 · 3h ago
You can have a bot allow list. I think it's also being planned as a subscription service (not sure about this part).
wzdd · 1h ago
The point of Anubis is to make scraping unprofitable by forcing bots to solve a sha256-based proof-of-work captcha, so another point of view is that the explicit denylist is actually saving those bot authors time and/or money.
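
For scale, a compliant scraper has to burn a loop like this for every token it wants (a sketch, assuming a sha256 leading-zero scheme):

    import (
        "crypto/sha256"
        "encoding/hex"
        "strconv"
        "strings"
    )

    // Brute-force solver: the cost the PoW scheme imposes per token.
    func solve(challenge string, difficulty int) uint64 {
        prefix := strings.Repeat("0", difficulty)
        for nonce := uint64(0); ; nonce++ {
            sum := sha256.Sum256([]byte(challenge + strconv.FormatUint(nonce, 10)))
            if strings.HasPrefix(hex.EncodeToString(sum[:]), prefix) {
                return nonce
            }
        }
    }

Milliseconds for one human page load, but it adds up fast across millions of scraped pages.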
EugeneOZ · 1h ago
The point is to reduce the server load produced by bots.

Honest AI scrapers use the information to learn, which increases their value, and the owner of the scraped server has to pay for it, getting nothing back — there's nothing honest about it. Search engines give you visitors, AI spiders only take your money.

justusthane · 6h ago
I don’t really understand why this solved this particular problem. The post says:

> As an attacker with stupid bots, you’ll never get through. As an attacker with clever bots, you’ll end up exhausting your own resources.

But the attack was clearly from a botnet, so the attacker isn’t paying for the resources consumed. Why don’t the zombie machines just spend the extra couple seconds to solve the PoW (at which point, they would apparently be exempt for a week and would be able to continue the attack)? Is it just that these particular bots were too dumb?

maeln · 40m ago
Most DDoS bots don't bother running JS. A lot of botnets don't even really allow it, because the malware they run on the infected target only allows for basic stuff like simple HTTP requests. This is why they often do some reconnaissance to find pages that take a long time to load, and are therefore probably using a lot of I/O and/or CPU time on the target server. Then they just spam the request. Huge botnets don't even bother with all that; they just kill you with bandwidth.
judge2020 · 6h ago
Anubis is new, so there may not have been foresight to implement a solver to get around it. Also, I wouldn't be surprised if the botnet actor is using vended software rather than something they wrote themselves, where they could quickly implement a solver to continue their attack.
cbarrick · 1h ago
I think the explanation "you’ll end up exhausting your own resources" is wrong for this case. I think you are correct that the bots are simply too dumb.

The likely explanation is that the bots are just curling the expensive URLs without a proper JavaScript engine to solve the challenge.

E.g. if I hack a bunch of routers around the world to act as my botnet, I probably wouldn't have enough storage to install Chrome or Selenium. The lightweight solution is just to use curl/wget (which may be pre-installed) or netcat/telnet.

rubyn00bie · 4h ago
Sort of tangential, but I'm surprised folks are still using Apache all these years later. Is there a certain language that makes it better than Nginx? Or is it just the ease of configuration that still pulls people? I switched to Nginx I don't even know how many years ago and never looked back; just more or less wondering if I should.
mrweasel · 1h ago
Apache does everything, and it's fairly easy to configure. If there's something you want to do, Apache mostly knows how, or has a module for it.

If you run a fleet of servers, all doing different things, Apache is a good choice because all the various uses are going to be supported. It might not be the best choice in each individual case, but it is the one that works in all of them.

I don't know why some are so quick to write off Apache. Is it just because it's old? It's still something like the second most used webserver in the world.

anotherevan · 3h ago
Equally tangential, but I switched from Nginx to Caddy a few years ago and never looked back.
ahofmann · 2h ago
I've been using nginx for what feels like decades, and occasionally I miss the ability to use .htaccess files. It's a very nice way to configure stuff on a server.
felsqualle · 1h ago
I use it because it's the one I'm most familiar with. I've been using it for 15 years and counting. And since it does the job for me, I never had the urge to look into alternatives.
ranger_danger · 9h ago
Seems like rate-limiting expensive pages would be much easier and less invasive. Also caching...

And I would argue Anubis does nothing to stop real DDoS attacks that just indiscriminately blast sites with tens of gbps of traffic at once from many different IPs.

PaulDavisThe1st · 9h ago
In the last two months, ardour.org's instance of fail2ban has blocked more than 1.2M distinct IP addresses that were trawling our git repo using http instead of just fetching the goddam repository.

We shut down the website/http frontend to our git repo. There are still 20k distinct IP addresses per day hitting up a site that issues NOTHING but 404 errors.

felsqualle · 3h ago
Hi, author here.

Caching is already enabled, but this doesn’t work for the highly dynamic parts of the site like version history and looking for recent changes.

And yes, it doesn’t work for volumetric attacks with tens of gbps. At this point I don’t think it is a targeted attack, probably a crawler gone really wild. But for this pattern, it simply works.

Ocha · 9h ago
Rate limit according to what? It was 35k residential IPs. Rate limiting would end up keeping real users out.
linsomniac · 7h ago
Rate limit according to destination URL (the expensive ones), not source IP.

If you have expensive URLs that you can't serve more than, say, 3 of at a time, or 100 of per minute, NOT rate limiting them will end up keeping real users out simply because of the lack of resources.
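
Concretely, something like this (a Go sketch; renderVersionHistory is a stand-in for whatever your expensive page is):

    import "net/http"

    var slots = make(chan struct{}, 3) // at most 3 expensive renders at once

    func expensiveHandler(w http.ResponseWriter, r *http.Request) {
        select {
        case slots <- struct{}{}: // a slot is free: serve the request
            defer func() { <-slots }()
            renderVersionHistory(w, r) // the costly, uncacheable page
        default: // over capacity: shed load instead of melting down
            http.Error(w, "busy, try again shortly", http.StatusServiceUnavailable)
        }
    }

Real users mostly won't notice, and a flood of concurrent requests gets cheap 503s instead of exhausting the backend.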

danielheath · 7h ago
Right - but if you have, say, 1000 real user requests for those endpoints daily, and thirty million bot requests for those endpoints, the practical upshot of this approach is that none of the real users get to access that endpoint.
Groxx · 5h ago
Yeah, at that point you might as well just turn off the servers. It's even cheaper at cutting off requests, and it'll serve just as many legitimate users.
EugeneOZ · 1h ago
No, it's not equal. These URLs might not be critical for users — they can still browse other parts of the site. If rate limiting is implemented for, let’s say, 3% of URLs, then 97% of the website will still be usable during a DoS attack.
pluto_modadic · 7h ago
This feels like something /you can do on your servers/, while other folks with resource constraints (like time, budget, or the hardware they have) find Anubis valuable.
bastawhiz · 9h ago
Rate limiting does nothing when your adversary has hundreds or even thousands of IPs. It's trivial to pay for residential proxies.
supportengineer · 8h ago
Why aren't there any authorities going after this problem?
danielheath · 7h ago
Most of the "free" analytics tools for android/iOS are "funded" by running residential / "real user" proxies.

They wait until your phone is on wifi / battery, then make requests on behalf of whoever has paid the analytics firm for access to 'their' residential IP pool.

nicce · 6h ago
Do you happen to have a link to a blog post or something about this?
willhbr · 4h ago
nicce · 4h ago
Thanks.
marginalia_nu · 7h ago
These residential botnets are pretty difficult to shut down, and often operated out of countries with poor diplomatic relations with the west.
o11c · 7h ago
Because in a "free" nation, that means "free to run malware" not "free from malware".

By far most malware is legal and a portion of its income is used to fund election campaigns.

eikenberry · 8h ago
They could be doing it legally.
toast0 · 2h ago
> And I would argue Anubis does nothing to stop real DDoS attacks that just indiscriminately blast sites with tens of gbps of traffic at once from many different IPs.

Volumetric DDoS and application layer DDoS are both real, but volumetric DDoS doesn't have an opportunity for cute pictures. You really just need a big enough inbound connection and then typically drop inbound UDP and/or IP fragments and turn off http/3. If you're lucky, you can convince your upstream to filter out UDP for you, which gives you more effective bandwidth.

lousken · 8h ago
Yes, have everything static (if you can't, use caching), optimize images, rate limit anything you have to generate dynamically
herpdyderp · 9h ago
Can Anubis be restyled to be more... professional? I like the playfulness, but I know at least some of my clients will not.
samhclark · 9h ago
You can, but they ask that you contact them to set up a contract. It's addressed here on the site:

>Anubis is provided to the public for free in order to help advance the common good. In return, we ask (but not demand, these are words on the internet, not word of law) that you not remove the Anubis character from your deployment.

>If you want to run an unbranded or white-label version of Anubis, please contact Xe to arrange a contract.

https://anubis.techaro.lol/docs/funding

pentagrama · 7h ago
Thanks for the information. Just to confirm: with the stock deployment it is not possible to remove the character, but is there an option to set the interface language for users? Is Spanish supported?
xena · 2h ago
I think the project is now mature enough for i18n. I've been putting it off because adding it ossifies a lot of the design, but I think it's ready now.
otterley · 7h ago
This is a very innovative way to earn a living with open source! Make the free version sickeningly cutesy (no offense to the author intended), and charge for the professional-looking version. No change in functionality, just chrome.
xena · 7h ago
I am actually working on changing functionality for paid customers; it's just access to a bigger database of default rules and IP reputation tracking.
otterley · 7h ago
I wish you best of luck! You're a very talented developer and artist. I'd be thrilled to work with you someday.
xena · 4h ago
Thanks! I'll be sure to post through it either way. My failure condition is going back to work somewhere else, so worst case it'll be more likely to happen :)

Really though my dayjob kinda burns me out because I have to focus on AEO, which is SEO but for AI. I get by making and writing about cool things, but damn does it hurt having to write for machines instead of humans.


lytedev · 8h ago
My "workaround" for this MIT-licensed software that does not allow me a simple and common customization was to have my reverse proxy redirect requests to the images. https://git.lyte.dev/lytedev/nix/pulls/92/files

Hope this is useful to others!

willriches · 8h ago
If you're going to break the social contract, just do so. Jumping through hoops to complicate the matter doesn't solve anything.
lytedev · 8h ago
I did so, though I would hardly call using MIT FOSS for my personal projects a breach of the social contract of open source. This was easier than forking, building a docker image, etc. I'm guessing it will be much easier for others, too, since the recommended config has you dink around with reverse proxy configuration no matter what.
idle_zealot · 6h ago
You are breaking the social contract of the project, not the legal one. The MIT license is the legal contract. The additional social contract is established by the author asking (without legal teeth) that you not do exactly what you did by removing the branding.

Compare to a take-a-penny-leave-a-penny tray from an era past. You are legally allowed to scoop up all the pennies into a bag, and leave the store, then repeat at the neighboring store, and make a few bucks. You'd be an asshole, but not face legal trouble. You "followed the rules" to the letter. But guess what? If you publish an easy how-to guide with "one weird trick" for making some quick cash, and people start adopting your antisocial behavior and emptying out change trays, you've forced the issue and now either a) businesses will stop offering this convenience or b) the rules around it will be tightened and the utility will be degraded. In the concrete case of Anubis, the maintainers may decide to stop contributing their time to this useful software or place a non-FOSS license on it in an attempt to stop gain-maximizing sociopaths from exploiting their efforts.

xena · 4h ago
To be fair, I'm not angry, I just think they're a coward because the UN kept the anime mascot intact. https://policytoolbox.iiep.unesco.org/

I even it out by how I prioritize feature requests, bug reports, and the like :)

lytedev · 3h ago
I'm surprised to read this from you, somebody I and many others hold in high regard as accepting and knowledgeable, insulting someone's character because they didn't like some specific aspect of your work or opinions or chose to ignore an ask in this particular use case.

I didn't implement this out of fear or some lack of courage. In fact I had the original avatars up for quite a while. I simply wanted my own logo so visitors wouldn't be potentially confused. It seemed to fit the use case and there was no way to achieve what I wanted without reaching out. I didn't feel comfortable bugging you or anybody on account of my tiny little no-traffic git forge even though, yes, that is what you politely asked for (and did not demand).

I think if you do feel this strongly you might consider changing the software's license or the phrasing of the request in the documentation. Or perhaps making it very clear that no matter how small, you want to be reached out to for the whitelabel version.

I think the success story of Anubis has been awesome to read about and follow and seeing how things unfold further will be fun to watch and possibly even contribute to. I'm personally rooting for you and your project!

jezek2 · 7m ago
And the author is breaking a social contract by shoving stuff I don't want to see at me in excessive amounts (or contributing to that). Before, I wouldn't have minded seeing some anime here or there; it's quite cute to most people. But lately I see it in many more places, and more aggressively.

Some project even took it to the next level and displayed furry porn. I think anime and furry graphics are related, especially in some people's weird obsession with shoving them at unsuspecting people, but since it's "cute" it's passable. Well, unless it gets into porn territory.

On the other hand, I applaud the author for an interesting variation on making the free product slightly degraded so people are incentivized to donate money. The power of defaults and their misuse.

Personally I'm not a fan of enshittification of any kind, even a slight one, even when that stance is to my own detriment.

lytedev · 3h ago
You are correct in that I ignored a specific request, but you are ignoring the larger social contract of open source that is also at play. Releasing software under a certain license has a social component of its own that seems to be unaccounted for here.

Your analogy to me seems imprecise, as analogies tend to be when it comes to digital goods. I'm not taking pennies in any sense here, preventing the next person from making use of some public good.

You can make a similar argument for piracy or open source, and yet... here we all still are, and open source has won for the most part.

LPisGood · 9h ago
I’ve heard people say that before. They would love to use it if there wasn’t a playful animated character.

The code is open source, so I can’t imagine making a fork to remove that is a Herculean effort.

unsnap_biceps · 9h ago
When I last looked into it, they were planning a white-label service to customize the look and had been requesting that folks not fork and modify the images.

> Regardless, Xe did ask nicely to not change out the images shipped as a whitelabel service is planned in the future

https://github.com/TecharoHQ/anubis/pull/204#issuecomment-27...

xena · 8h ago
I've soft launched the commercial offering and I'm working on expanding the commercial features before I announce it more publicly. If you pay $50 a month on GitHub sponsors, you get access to BotStopper complete with custom CSS support. You'll also get access to the reputation database I'm working on named hivemind.
yjftsjthsd-h · 7h ago
> You'll also get access to the reputation database I'm working on named hivemind.

That feels uncomfortably close to returning to the privacy-and-CGNAT-hating embrace of cloudflare et al.

xena · 7h ago
My goal is to not have it outright block, but use the reputation database as a signal of when to throw a challenge.

However, you are allowed to believe what you want and I can't stop you from being wrong.

yjftsjthsd-h · 7h ago
> My goal is to not have it outright block, but use the reputation database as a signal of when to throw a challenge.

Oh, if it's just to make things potentially easier while leaving the baseline where it is then that's fine.

> However, you are allowed to believe what you want and I can't stop you from being wrong.

For instance, you appear to believe that I'm attacking you?

glenngillen · 5h ago
>> However, you are allowed to believe what you want and I can't stop you from being wrong.

>For instance, you appear to believe that I'm attacking you?

FWIW, that's not what I read. You made an assumption about implementation and the effects based on very little information. Xe simply said you can believe (i.e., make assumptions about) whatever you want. You then assumed (another one) that your comment was interpreted as an attack.

Maybe it was, maybe it wasn't. There's not enough context in here to know either way.

ketzo · 7h ago
> reputation database I'm working on named hivemind.

Anywhere I can read more about this? Sounds super interesting, and a cursory search didn’t show anything for it on your site.

Otherwise I’m sure I’ll hear about it soon anyway, at the rate Anubis is going!

xena · 7h ago
I'd be happy to talk about it if it existed; I'm still working out the details. But the basic idea is to take advantage of the fact that Anubis is a very popular project: from what I've seen in logs that server admins have submitted, the same IP blocks and the like hit instances of Anubis, so some kind of IP reputation thing would work for this.

I am also working on some noJS checks, but I want to test them with paid customers in order to let a thousand flowers bloom.

pabs3 · 1h ago
That sounds a bit like what crowdsec does for SSH.

https://github.com/crowdsecurity/crowdsec

ketzo · 6h ago
Cool. Good luck on both that and Anubis generally — seems like you’ve found something that’s both a meaningful benefit to the common good AND could maybe make a buncha money, or at least enough to pay for development, which is awesome.
xena · 5h ago
Thanks! There's a lot of really hard problems to solve and most of them hinge around trust. I usually default into solving trust by making things open, but security software needs a bit of cloak and dagger by necessity. I'll find a balance I'm sure, but it's an annoying thing to balance.
LPisGood · 8h ago
That's the beautiful thing about open source: they ask but do not demand.

Of course, if you use this service for your enterprise, the Right Thing To Do would be to support the excellent project financially, but this is by no means required.

If you want to use this project on your site and don’t like the logo, you are free to change it. If the site is personal and this project is not something you would spend money on, I don’t even think it is unethical to change the image.

altairprime · 8h ago
Seems pretty unethical to me. Exercising a liberty in direct contradiction to its creator’s wishes for personal gain with no recompense to them is about as crassly selfish and non-prosocial as it gets. Perhaps your ethics don’t include “being prosocial towards those whose work benefits you”? That’s the usual difference I encounter between my ethics and those who disagree that it’s crass — and I do respect such differing beliefs.

Note that I’m not faulting you for behaving this way, no insult or disparagement intended, etc.! Open source inherited this dissonance between giving it all away to anyone who asks for free, and giving nothing of yours back in return because prosocial is not an ethical standard, from its predecessor belief system. It remains unsolved decades later, in both open source and libertarianism, and I certainly don’t hold generic exploiters of the prosocial-imbalance defect accountable for the underlying flaw in both belief systems.

LPisGood · 7h ago
If the authors wanted to disallow people to be free (as in freedom) to change the source code for free (as in beer), then the authors had every chance to publish the source code under a more restrictive license.

I’m trying to imagine how this might be unethical. The only scenario I can think of is if the authors wanted the code to not be modified in certain ways, but felt based on more deeply held principles that the code should be made FOSS. But I struggle to see how both ideas could exist simultaneously - if you think code should be free then you think there is no ethical issue with people modifying it to fit their use.

altairprime · 7h ago
Yep, that’s the struggle in a nutshell!

If you believe in giving away code because that's open-source prosocial, then open-source adherents will claim that taking advantage of you is ethical, because if you didn't want to be exploited, you shouldn't have been open-source prosocial in the first place. And by treating "pay me if you get paid for my code" licenses as evil and shameful, exploiters pressure prosocial maintainers into adopting open source licenses, even though they'll then be exploited by people who don't care about being prosocial, eventually burning out the maintainer, who either quiet-quits or rage-quits.

Of course, if OSI signed off on “if you get rich from my source code you have to share some of that wealth back to me” as a permissible form of clause in open source licensing, that would of course break the maintainer burnout cycle — but I’m certainly not holding my breath.

blackoil · 5h ago
That only applies if the author wants to call the software "Open Source". You can license it under "SourceAvailableForSmallGuy" with no resistance.
imiric · 1h ago
> Seems pretty unethical to me. Exercising a liberty in direct contradiction to its creator’s wishes for personal gain with no recompense to them is about as crassly selfish and non-prosocial as it gets.

You're ignoring the possibility that users of the software might not agree with the author's wishes. There's nothing unethical about that.

A request to not change a part of the software is the same as a request to not use the software in specific industries, or for a specific purpose. There are many projects that latch on open source for brand recognition, but then "forbid" the software to be used in certain countries, by military agencies, etc. If the author wants to restrict how the software can be used, then it's not libre software.

altairprime · 29m ago
I disagree. Having the freedom to choose to ignore someone’s wishes does not necessarily make it ethical to exercise that freedom. Ethics are not as simple as “what is not prohibited is therefore ethical”.
sgc · 7h ago
You are presuming this is their primary concern. Releasing software with a permissive license is a pretty strong signal you are ok with people not doing exactly as you ask.
altairprime · 6h ago
It’s certainly a legal signal, insofar as once you have that signal, you have the ability to make a legally-sound decision on usage — but I don’t presume that it’s in any way an indication of how strongly the author is or isn’t invested in whatever license they chose. Unless accompanied by something written by the maintainer, the only certain statement is that the maintainer released with a metadata attribute set to a value; nothing more.

See also: “Npm should remove the default license from new packages” https://news.ycombinator.com/item?id=43864518

imiric · 1h ago
The purpose of a software license is to codify the rights the author grants to its users. The author can't claim to use a free software license, while also making separate demands about how the software can be used. These demands should either be part of the license, or removed altogether. This moral shaming for breaking a "social contract" is ridiculous. The software is either free or not. You can't have it both ways.
altairprime · 57m ago
“Don’t use this for evil” is a legal and valid software license. This is anathema to programmers and law-as-code adherents, but it’s perfectly acceptable to bring to a court of law in a licensing dispute. Different courts and different acts of accused evil will result in different judgments. It would be very difficult for a corporation to accept that license; it would be very simple for an individual to do so.

Such a license does not comply with your requirements; yet, it is also valid under case law, even if it is statistically unlikely to permit enforcement against most claimed evils. Each society has certain evils that are widely accepted by the courts, so it certainly isn’t a get out of all possible jails free card.

The purpose of a license is to inform of the rights available. The user is responsible for evaluating the license, or for trusting the judgment of a third party if they are uninterested in evaluating themselves.

If the author's entire license is “This is free software for free uses, please contact me for a paid license for paid uses”, then that is statistically likely to be court-enforceable against exploitation, so long as the terms offered are reasonable to the judge and any expert witnesses called. The Free Software Foundation does not have exclusive rights to the words “free software”. Adoption will be much reduced for someone who writes such a license, of course, and perhaps someone will exploit a loophole that a lengthier outsourced license would have covered. Neither of those outcomes is necessarily worth the time and effort to try to prevent, especially when use of any open source license guarantees the right of exploitation for unshared profit in plain language, versus the homegrown one, which does not.

(I am not your lawyer, this is not legal advice.)

imiric · 18m ago
This is not a legal matter, nor is it related to the FSF and any of the "open source" licenses. My argument is philosophical.

Using a license that allows the software to be distributed and modified, while placing restrictions or exemptions to those permissions outside of the license, at the very least sends mixed signals. My point is that if the author wants to make those restrictions, that's fine, but the license is the correct place for it. What's shitty from my moral perspective is using a commonly accepted free software license for marketing purposes, but then judging people for not following some arbitrary demands. If anything, _that_ is the unethical behavior.

lelanthran · 1h ago
> Seems pretty unethical to me.

I'm seeing this sentiment multiple times on this thread - "fine, it's legal, but it's still wrong!"

That's an extremely disrespectful take on someone adhering to a contract that both parties agreed to. You are using shaming language to pressure people into following your own belief system.

In this specific instance, the author could have chosen any damn license they wanted to. They didn't. They chose one to get the most adoption.

You appear to want both:

1. Widespread adoption

and

2. Restrict what others can do.

The MIT license is not compatible with #2 above. You can ask nicely, but if you don't get what you want you don't get to jump on a fucking high horse and religiously judge others using your own belief system.

Author should have used GPL (so any replaced images get upstreamed back and thus he has control) OR some other proprietary license that prevents modifications like changing the image.

A bunch of finger-pointers gabbing on forums about those "evil" people who stick to both the word and the spirit of the license are nothing more than the modern day equivalent of witch-hunters using "intent" to secure a prosecution.

Be better than that - don't join the mob in pointing out witches. We don't need more puritans.

altairprime · 1h ago
I do not agree with your position that two parties who enter into a contract are no longer subject to ethical judgment by others. Contract law does not invalidate ethics, no matter how appealing it is to opt out of them. As one of the asocial / decoupled people who has no social compulsion whatsoever, I voluntarily opt-in to preferring prosocial outcomes and typically deem anti-prosocial actions unethical even if our society currently accepts them.

For example, if an employee does something hostile towards society at their employer when they have the freedom to choose not to do so — and since employment is at will, they always have that freedom to choose — I will tend to judge their antisocial actions unethical, even if their contract allows it. (This doesn’t mean I will therefore judge the person as unethical! One instance does not a pattern make, etc.)

So, for me, ethical judgments are not opt-out under any circumstance, nor can they be abrogated by contract or employment or law. I hold this is a non-negotiable position, so I will withdraw here; you’re welcome to continue persuading others if you wish.

lelanthran · 1h ago
> Contract law does not invalidate ethics, no matter how appealing it is to opt out of ethics

I didn't claim it does. I am claiming that, since ethics is subjective and the contract is not, your subjecting others to your moral standard is no different from a mob subjecting an old woman to accusations of being a witch.

Now, you may not have a problem publicly judging others, but your actions are barely different from those of the Westboro Baptist Church.

IOW, sure, you are allowed to publicly condemn people who hold different moral beliefs to you, but the optics are not good for you.

pabs3 · 1h ago
The LGPL/GPL/AGPL family of licenses don't require upstreaming, only passing source code downstream to end users.

In this case upstreaming replaced images wouldn't be useful to the author anyway, they are going to keep the anime image.

lelanthran · 1h ago
> In this case upstreaming replaced images wouldn't be useful to the author anyway, they are going to keep the anime image.

In this case, it would be, because (presumably) the new images are the property of the user, and they would hardly want (for example) their company logo to be accidentally GPL'ed.

jillyboel · 7h ago
The license explicitly allows you to make such changes. They could have picked a different license, but didn't.

> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software

altairprime · 6h ago
> They could have picked a different license, but didn’t.

I disagree.

Licenses that prohibit exploitation of source code for personal reward are treated with hostility, shame, and boycotts — claiming that to restrict in any way the liberty of another person to exploit one’s work is unethical. Human beings are social creatures, and most human beings are not asocial with decoupled ethical systems like myself; so, given the social pressures in play, few human beings truly have the liberty to pick another license and endure the shame and vitriol that exercising that freedom earns from us.

natebc · 9h ago
It's also mentioned on the docs site: https://anubis.techaro.lol/docs/funding/
xena · 7h ago
Hit me up at https://xeiaso.net/contact, more than willing to talk :)
clvx · 6h ago
Just fork, change and move on. If you like it, contribute back or pay some sponsorship.
ranger_danger · 9h ago
yes it's open source

https://git.kernel.org/ changed theirs

natebc · 9h ago