Cloudflare CEO: AI is killing the business model of the web

15 points by kordlessagain | 5/8/2025, 3:23:22 PM | searchengineland.com

Comments (7)

lwo32k · 3h ago
The business model of the web was killing itself long before AI showed up.

Read the UN report on the Attention Economy from 2 years ago - https://www.un.org/sites/un2.un.org/files/attention_economy_...

It says a simple thing - Content has vastly exceeded Eyeballs and Time available to consume it all.

So what happens when Supply exceeds Demand by a huge margin?

Attention Economy CEOs (basically the monopoly platforms) have handled this question by doing a fantastic job of convincing content creators: if your content is not getting eyeballs, either something is wrong with you, OR you've got to pay us more for reach and visibility / buy more ads / produce more engaging garbage. If it's engaging, we will take a cut. If it's not, pay us to get the algo to prop you up.

This is a parasitic model which is eating itself.

toomuchtodo · 5h ago
AI is (potentially) killing Cloudflare's business model. If the web transitions from pull (where users fetch content with browsers and find it with search engines) to push (where you have to get your content into LLMs if you want users to see it), that is not good for what Cloudflare offers.

Do I want to rely on search engines? Or do I want to live within the Anthropic or ChatGPT client with everything at my fingertips it has trained on (as well as tooling access via the MCP ecosystem)? Desktop->Browser->AI terminal is the rough story arc. Do I want an open web? We haven't had that for a long time; we've had Big Tech building moats and monopolies to siphon up all the value (most recently evident in the Google DOJ antitrust suit, their ad monopoly, potentially being forced to divest Chrome, their agreement for default search with Apple, and so on). Generative AI is a watershed moment where users can get some control back over how they consume and ETL the data they are interested in, and this is not great for incumbents.

lelanthran · 4h ago
WARNING: Some very speculative scenarios ahead.

> Or do I want to live within the Anthropic or ChatGPT client with everything at my fingertips it has trained on (as well as tooling access via the MCP ecosystem)?

If advancements in hardware continue at a fairly rapid pace, we'll all eventually be using large models locally. We won't be talking about MCP and requests to OpenAI, Anthropic, or Gemini; we'll be talking about which of the latest models to download.

Pricing-wise, right now it's probably not that expensive to set up a free LLM on a server in your house that everyone can use.
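
To give a sense of how little is involved once the weights are local, here's a minimal sketch. It assumes something like an Ollama server running on a spare machine; the LAN address and model name are placeholders for whatever you actually run.

  # Minimal sketch: query a local model served from a box on the home LAN.
  # Assumes an Ollama-style server at a made-up address (192.168.1.50) with
  # a model already pulled, e.g. "llama3"; adjust host/model to your setup.
  import json
  import urllib.request

  OLLAMA_URL = "http://192.168.1.50:11434/api/generate"

  def ask_local_model(prompt: str, model: str = "llama3") -> str:
      payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
      req = urllib.request.Request(
          OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
      )
      with urllib.request.urlopen(req) as resp:
          return json.loads(resp.read())["response"]

  print(ask_local_model("Explain MCP in one paragraph."))

Any laptop or phone on the same network can hit the same endpoint, so one modest box covers the whole household.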

The only "moat" that there is, is the model weights. And the only way to get a trained model is by slurping content.

So, sure, while right now everyone is going to Anthropic and ChatGPT for answers, pretty soon everyone will be going to whoever has the most current model. And that's where Cloudflare can make a killing, because they are literally serving so much content, they can train their own model on the content that is passing through without any need to run a bot.

aurareturn · 2h ago

> So, sure, while right now everyone is going to Anthropic and ChatGPT for answers, pretty soon everyone will be going to whoever has the most current model. And that's where Cloudflare can make a killing, because they are literally serving so much content, they can train their own model on the content that is passing through without any need to run a bot.

I don't understand this logic. It assumes that Cloudflare can train a model remotely as good as OpenAI's or Anthropic's, and I have doubts about that. You're also assuming that the model with the most up-to-date content is the best, but I disagree: ChatGPT can search the web now, so not having up-to-date training data is no longer a big deal.

I think where Cloudflare can make a killing is policing which AI agents can access which services, i.e. content providers use Cloudflare to block AI agents/bots, and AI agents/bots pay Cloudflare for easier access.
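
Purely as a toy illustration of that policing idea (this is not Cloudflare's actual bot-management product or API, and the paid-token scheme is made up), the logic at the edge could be as simple as:

  # Toy sketch: gate known AI crawlers by User-Agent, letting through only
  # agents that present a (hypothetical) paid-access token. The bot names are
  # commonly published crawler user-agents; the token scheme is invented.
  AI_BOT_AGENTS = {"GPTBot", "ClaudeBot", "CCBot", "PerplexityBot"}

  def allow_request(user_agent: str, access_token: str | None,
                    paid_tokens: set[str]) -> bool:
      """Return True if the request should be served."""
      is_ai_bot = any(bot.lower() in user_agent.lower() for bot in AI_BOT_AGENTS)
      if not is_ai_bot:
          return True                      # ordinary browsers pass through
      return access_token in paid_tokens   # bots need a negotiated token

  # A crawler without a token gets blocked; one with a paid token gets through.
  print(allow_request("Mozilla/5.0 (compatible; GPTBot/1.0)", None, {"tok-123"}))       # False
  print(allow_request("Mozilla/5.0 (compatible; GPTBot/1.0)", "tok-123", {"tok-123"}))  # True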

znpy · 2h ago
> And that's where Cloudflare can make a killing, because they are literally serving so much content, they can train their own model on the content that is passing through without any need to run a bot.

I think you're missing an important aspect: Cloudflare is in a unique position to be able to:

1. train a model on content that's actually getting visited by people (they are fairly good at cutting out bots)

2. collect the content while they serve requests, without even having to scrape the websites. It would be completely transparent (zero cost, actually a benefit) to website owners.
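
As a rough sketch of what point 2 amounts to (a plain WSGI middleware standing in for the edge here; none of this is Cloudflare's actual stack), the tee is almost trivial:

  # Toy illustration: a pass-through middleware that serves responses unchanged
  # while appending the HTML it serves to a local corpus file. The origin site
  # sees nothing different; the "training data" accumulates as a side effect.
  class CorpusTee:
      def __init__(self, app, corpus_path="corpus.txt"):
          self.app = app
          self.corpus_path = corpus_path

      def __call__(self, environ, start_response):
          captured = {}

          def capture_start(status, headers, exc_info=None):
              captured["headers"] = dict(headers)
              return start_response(status, headers, exc_info)

          chunks = []
          for chunk in self.app(environ, capture_start):
              chunks.append(chunk)
              yield chunk                          # serve the page unchanged

          content_type = captured.get("headers", {}).get("Content-Type", "")
          if "text/html" in content_type:          # keep only human-facing pages
              with open(self.corpus_path, "ab") as f:
                  f.write(b"".join(chunks) + b"\n")

Wrap any WSGI app in CorpusTee and the corpus grows with real traffic, which is exactly the "visited by actual people" signal from point 1.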

@jgrahamc, if you're reading this and you end up doing what I wrote, I want a cut of the profits (or a job offer) :P

meheleventyone · 5h ago
As soon as there is a new gatekeeper like Anthropic or OpenAI, there will be a new moat and monopoly. They're still running the same playbook.

toomuchtodo · 5h ago
Maybe! It was hard to build Google, but it is rapidly getting easier to train models and serve inference (DeepSeek). The learning rate and cost decline curve are direct inputs into democratization (imho). So perhaps we should hope the gold rush continues for now? That aggregate investment FOMO is our collective opportunity. Until then, keep the racks of storage and GPUs comin'.