LLM Botnet: Are companies using botnets to scrape content?

3 points by flyriver | 1 comment | 5/12/2025, 1:34:51 AM
I have a website with several million pages of articles generated by Llama, GPT, and Gemini. As you can imagine, there is a lot of scraping happening. Generally speaking, I allow the crawlers that respect robots.txt and identify themselves as bots to go wild; I figure the site might get more exposure if it ends up "in" the LLMs. Otherwise, I try to block them.

Over time, and especially recently, I have seen thousands of diverse IP addresses scraping the site, using random, varying user agents. I was originally blocking Brazilian /16s, since most of the traffic appeared to come from there, but over the past few weeks the IPs have been coming from everywhere. Each IP makes only a few requests, trying to stay under the radar. Right now, I have some scripts set up to block and log the IPs as they come in.
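
For anyone less familiar with this kind of setup, here is a minimal sketch of what a block-and-log script can look like. It is not the actual script: it assumes an nginx-style combined access log at /var/log/nginx/access.log, an ipset set named scrapers that an iptables rule already drops, and a placeholder looks_like_scraper() check, since the real detection heuristics are not described above.

  # Minimal sketch only: tail an nginx-style "combined" access log, decide
  # whether a client looks like a scraper, then log the IP and add it to an
  # ipset set named "scrapers" (assumed to exist, e.g. created beforehand
  # with: ipset create scrapers hash:ip).
  import os
  import re
  import subprocess
  import time

  LOG_PATH = "/var/log/nginx/access.log"  # assumed log location

  # Matches: ip - - [time] "request" status bytes "referer" "user-agent"
  LINE_RE = re.compile(
      r'^(\S+) \S+ \S+ \[[^\]]+\] "[^"]*" \S+ \S+ "[^"]*" "([^"]*)"'
  )

  # Crawlers chosen to be allowed; the list here is purely illustrative.
  ALLOWED_BOTS = re.compile(r"Googlebot|bingbot|GPTBot|ClaudeBot", re.I)

  blocked = set()


  def looks_like_scraper(ip: str, user_agent: str) -> bool:
      """Placeholder heuristic. The post does not describe the real rules,
      so this simply treats anything that is not an allowed, self-identified
      bot as a candidate; a real script would be far more selective."""
      return not ALLOWED_BOTS.search(user_agent)


  def block(ip: str) -> None:
      """Add the IP to the kernel-level blocklist once, and log it."""
      if ip in blocked:
          return
      subprocess.run(["ipset", "add", "scrapers", ip, "-exist"], check=False)
      blocked.add(ip)
      print(time.strftime("%Y-%m-%d %H:%M:%S"), "blocked", ip, flush=True)


  with open(LOG_PATH) as log:
      log.seek(0, os.SEEK_END)  # behave like `tail -f`: only new entries
      while True:
          line = log.readline()
          if not line:
              time.sleep(0.5)
              continue
          match = LINE_RE.match(line)
          if not match:
              continue
          ip, user_agent = match.groups()
          if looks_like_scraper(ip, user_agent):
              block(ip)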

I am blocking between 50 and 100 unique IP addresses per minute, and that is after I had already blocked the main Chinese LLM scrapers and several /16s. Few of the IPs belong to obvious providers. Many just seem to be home users. Many are from countries that do not have the money to build LLMs. There are even mobile carrier IPs.

None of the requests are particularly malicious. They are just downloading pages.

Am I missing something? Is there a new botnet scraping the web? A quick grep through my logs shows I have blocked 15,000 requests in the past 90 minutes, but only 1,300 of them are repeats from IPs already on my block list. Yesterday, I blocked 220,000 requests, and only 13,000 of them were repeats.
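
(For reference, that repeat-vs-new tally is roughly the following kind of count, assuming the block script writes one line per blocked request with the client IP as the first whitespace-separated field; the file name blocked.log is made up.)

  # Sketch: given a log with one line per blocked request (client IP first),
  # count total blocked requests and how many came from IPs seen earlier.
  import sys

  seen = set()
  total = repeats = 0

  with open(sys.argv[1]) as f:  # e.g. python3 count_repeats.py blocked.log
      for line in f:
          fields = line.split()
          if not fields:
              continue
          ip = fields[0]
          total += 1
          if ip in seen:
              repeats += 1
          seen.add(ip)

  print(f"{total} blocked requests, {repeats} repeats, {len(seen)} unique IPs")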

Comments (1)

Retr0id · 10h ago
They're usually residential proxies, enabled by "SDKs" shipped as a means of monetizing mobile apps. Basically a legalized(ish) botnet.

If you have AI-generated content, expect an AI-generated audience.