Like robots.txt, it has the fatal flaw of being unenforceable.
OsmanDKitay · 32m ago
You're right...
The aura.json file itself is completely voluntary. A badly behaved agent can just ignore it.
But this is where the model differs from robots.txt. Aura isn't the fence; it's the official map to the gates. The real enforcement happens on the backend at the API endpoint, just like in any normal web app.
For example, the Aura manifest says the create_post capability needs auth. If an agent ignores that and POSTs to /api/posts without a valid cookie, our server's API will reject it with a 401. The manifest doesn't do the blocking, the backend does. It just tells cooperative agents the rules of the road ahead of time.
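To make that concrete, here's a minimal sketch of what that backend check might look like, assuming a Flask app and a cookie-based session (the /api/posts route comes from the example above; everything else, including the session store, is illustrative):

```python
# Sketch of the server-side enforcement described above, assuming Flask and a
# cookie-based session. The aura.json manifest only *advertises* that
# create_post requires auth; this route is what actually blocks the request.
from flask import Flask, abort, jsonify, request

app = Flask(__name__)

# Hypothetical in-memory session store; a real app would use a database.
SESSIONS = {"valid-session-token": {"user": "alice"}}

@app.post("/api/posts")
def create_post():
    session = SESSIONS.get(request.cookies.get("session", ""))
    if session is None:
        # An agent that ignored the manifest's auth requirement lands here.
        abort(401)
    body = (request.get_json(silent=True) or {}).get("body", "")
    return jsonify({"author": session["user"], "body": body}), 201
```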
So the real incentive for an agent to use Aura isn't about avoiding punishment; it's the huge upside in efficiency and reliability. Why scrape a page and guess at DOM elements when you can make a single, clean API call that you know will work? It saves the agent developer time, compute, and the headache of maintaining brittle scrapers.
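The agent side looks roughly like this (a sketch only: the /.well-known/aura.json path, the capabilities/endpoint field names, and the session cookie are all assumptions, since I haven't pinned down the schema here):

```python
# Sketch of the cooperative-agent path: read the manifest once, then call the
# advertised endpoint directly instead of scraping HTML and guessing at DOM
# selectors. Manifest location and field names are hypothetical.
import requests

SITE = "https://example.com"

manifest = requests.get(f"{SITE}/.well-known/aura.json").json()
cap = manifest["capabilities"]["create_post"]  # assumed manifest structure

resp = requests.request(
    cap.get("method", "POST"),
    SITE + cap["endpoint"],
    json={"body": "Hello from a well-behaved agent"},
    cookies={"session": "valid-session-token"},  # the auth the manifest said we'd need
)
resp.raise_for_status()  # a 401 here means we skipped the auth the manifest declared
print(resp.json())
```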
So:
robots.txt tells good bots what they shouldn't do.
aura.json tells them what they can do, and gives them the most efficient way to do it, all backed by the server's actual security logic.
JohnFen · 21m ago
The primary purpose of robots.txt isn't to deny access. That's just a sideline. The intended purpose is to do exactly what this Aura proposal does: to provide guidance to crawlers as to what parts of the site are valuable to crawl. That's why it's voluntary: its main reason for existing is to benefit the crawlers in the first place.
In that light, I guess your proposal makes a certain amount of sense. I don't think it addresses what a lot of web sites want, but that's not necessarily a bad thing. Own your niche.