This website is for humans

242 charles_f 134 8/13/2025, 3:19:53 PM localghost.dev ↗

Comments (134)

reactordev · 1h ago
I’m in love with the theme switcher. This is how a personal blog should be. Great content. Fun site to be on.

My issue is that crawlers aren’t respecting robots.txt, they are capable of operating captchas, human verification check boxes, and can extract all your content and information as a tree in a matter of minutes.

Throttling doesn’t help when you have to load a bunch of assets with your page. IP range blocking doesn’t work because they’re lambdas essentially. Their user-agent info looks like someone on Chrome trying to browse your site.

We can’t even render everything to a canvas to stop it.

The only remaining tactic is verification through authorization. Sad.

lrivers · 7m ago
Points off for lack of blink tag. Do better
pas · 44m ago
PoW might not work for long, but Anubis is very nice: https://anubis.techaro.lol/

That said ... putting part of your soul into machine format so you can put it on on the big shared machine using your personal machine and expecting that only other really truly quintessentially proper personal machines receive it and those soulless other machines don't ... is strange.

...

If people want a walled garden (and yeah, sure, I sometimes want one too) then let's do that! Since it must allow authors to set certain conditions, and require users to pay into the maintenance costs (to understand that they are not the product) it should be called OpenFreeBook just to match the current post-truth vibe.

workethics · 16m ago
> That said ... putting part of your soul into machine format so you can put it on on the big shared machine using your personal machine and expecting that only other really truly quintessentially proper personal machines receive it and those soulless other machines don't ... is strange.

That's a mischaracterization of most people want. When I put out a bowl of candy for Halloween I'm fine with EVERYONE taking some candy. But these companies are the equivalent of the asshole that dumps the whole bowl into their bag.

lblume · 3m ago
> these companies are the equivalent of the asshole that dumps the whole bowl into their bag

In most cases, they aren't? You can still access a website that is being crawled for the purpose of training LLMs. Sure, DOS exists, but seems to not be as much of a problem as to cause widespread outage of websites.

pyrale · 33m ago
I’m not sure that the issue is just a technical distinction between humans and bots.

Rather it’s about promoting a web serving human-human interactions, rather than one that exists only to be harvested, and where humans mostly speak to bots.

It is also about not wanting a future where the bot owners get extreme influence and power. Especially the ones with mid-century middle-europe political opinions.

reactordev · 17m ago
Security through obscurity is no security at all…
martin-t · 53m ago
This shouldn't be enforced through technology but the law.

LLM and other "genAI" (really "generative machine statistics") algorithms just take other people's work, mix it so that any individual training input is unrecognizable and resell it back to them. If there is any benefit to society from LLM and other A"I" algorithms, then most of the work _by orders of magnitude_ was done by the people whose data is being stolen and trained on.

If you train on copyrighted data, the model and its output should be copyrighted under the same license. It's plagiarism and it should be copyright infringement.

amelius · 1h ago
The theme switcher uses local storage as a kind of cookie (19 bytes for something that could fit in 1 byte). Kind of surprised they don't show the cookie banner.

Just a remark, nothing more.

athenot · 1h ago
You don't need the cookie banner for cookies that are just preferences and don't track users.
dotancohen · 1h ago
Which is why calling it the cookie banner is a diversion tactic by those who are against the privacy assurances of the GPDR. There is absolutely no problem with cookies. The problem is with the tracking.
root_axis · 36m ago
It's called a cookie banner because only people using cookies to track users need them. If you're using localstorage to track users, informed consent is still required, but nobody does that because cookies are superior for tracking purposes.
madeofpalk · 23m ago
> If you're using localstorage to track users [...] but nobody does

I promise you every adtech/surveillance js junk absolutely is dropping values into local storage you remember you.

root_axis · 13m ago
They are, but without cookies nearly all of the value disappears because there is no way to correlate sessions across domains. If commercesite.com and socialmediasite.com both host a tracking script from analytics.com that sets data in localstorage, there is no way to correlate a user visiting both sites with just the localstorage data alone - they need cookies to establish the connection between what appears to be two distinct users.
reactordev · 1h ago
Our problem is with tracking. Their problem is that other companies are tracking. So let’s stop the other companies from tracking since we can track directly from our browser. GDPR requires cookie banner to scare people into blocking cookies

There, now only our browser can track you and only our ads know your history…

We’ll get the other two to also play along, throw money at them if they refuse, I know our partner Fruit also has a solution in place that we could back-office deal to share data.

bigstrat2003 · 1h ago
You're assuming bad intent where there are multiple other explanations. I call it the cookie banner and I don't run a web site at all (so, I'm not trying to track users as you claim).
dotancohen · 53m ago
You call it the cookie banner because you've been hearing it regularly referred to as the cookie banner. It was the regularization of calling it the cookie banner that confuses people into thinking the issue is about cookies, and not about tracking.
bigstrat2003 · 42m ago
So, by your own admission, calling it the cookie banner is not only "a diversion tactic by those who are against the privacy assurances of the GPDR". My only point is that you were painting with an overly broad brush and saying someone is a bad actor if they call it the cookie banner, which is demonstrably not the case.
mhitza · 59m ago
Or for cookies that are required for the site to function.

On a company/product website you should still inform users about them for the sake of compliance, but it doesn't have to be an intrusive panel/popup.

ProZsolt · 1h ago
You don't have to show the cookie banner if you don't use third party cookies.

The problem with third party cookies that it can track you across multiple websites.

reactordev · 1h ago
Because she’s using local storage…?

If you don’t use cookies, you don’t need a banner. 5D chess move.

root_axis · 50m ago
There's no distinction between localstorage and cookies with respect to the law, what matters is how it is used. For something like user preferences (like the case with this blog) localstorage and cookies are both fine. If something in localstorage were used to track a user, then it would require consent.
roywashere · 53m ago
That is not how it works. The ‘cookie law’ is not about the cookies, it is about tracking. You can store data in cookies or in local storage just fine, for instance for a language switcher or a theme setting like here without the need for a cookie banner. But if you do it for ads and tracking, then this does require consent and thus a ‘cookie banner’. The storage medium is not a factor.
amelius · 1h ago
Sounds to me like a loophole in the law then. Which would be surprising too since not easy to overlook.
dkersten · 45m ago
The law is very clear, if you actually read it. It doesn't care what technology you use: cookies, localstorage, machine fingerprints, something else. It doesn't care. It cares about collecting, storing, tracking, and sharing user data.

You can use cookies, or local storage, or anything you like when its not being used to track the user (eg for settings), without asking for consent.

alternatex · 54m ago
LocalStorage is per host though. You can't track people using LocalStorage, right?
reactordev · 41m ago
LocalStorage is per client, per host. You generally can't track people using LocalStorage without some server or database on the other side to synchronize the different client hosts.

GDPR rules are around personal preference tracking, tracking, not site settings (though it's grey whether a theme preference is a personal one or a site one).

root_axis · 4m ago
> though it's grey whether a theme preference is a personal one or a site one

In this case it's not grey since the information stored can't possibly be used to identify particular users or sessions.

reactordev · 1h ago
It’s not a loophole. localStorage is just that, local. Nothing is shared. No thing is “tracked” beyond your site preferences for reading on that machine.

I say it’s a perfect application of how to keep session data without keeping session data on the server, which is where GDPR fails. It assumes cookies. It assumes a server. It assumes that you give a crap about the contents of said cookie data.

In this case, no. Blast it away, the site still works fine (albeit with the default theme). This. Is. Perfect.

dkersten · 26m ago
> which is where GDPR fails. It assumes cookies.

It does not assume anything. GDPR is technology agnostic. GDPR only talks about consent for data being processed, where 'processing' is defined as:

    ‘processing’ means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction;
(From Article 4.2)

The only place cookies are mentioned is as one example, in recital 30:

    Natural persons may be associated with online identifiers provided by their devices, applications, tools and protocols, such as internet protocol addresses, cookie identifiers or other identifiers such as radio frequency identification tags. This may leave traces which, in particular when combined with unique identifiers and other information received by the servers, may be used to create profiles of the natural persons and identify them.
reactordev · 18m ago
>GDPR only talks about consent for personal data being processed

Emphasis, mine. You are correct. For personal data. This is not personal data. It’s a site preference that isn’t personal other than you like dark mode or not.

0x073 · 50m ago
Gdpr don't assumes cookies, if you misuse local storage you also need confirmation.
reactordev · 40m ago
only if you are storing personal information. Email, Name, unique ID.

Something as simple as "blue" doesn't qualify.

dkersten · 26m ago
Correct. But you can also use cookies for that, without violating GDPR or the ePrivacy directive.
lucideer · 1h ago
You don't need a banner if you use cookies. You only need a banner if you store data about a user's activity on your server. This is usually done using cookies, but the banners are neither specific to cookies nor inherently required for all cookies.

---

Also: in general the banners are generally not required at all at an EU level (though some individual countries have implemented more narrow local rules related to banners). The EU regs only state that you need to facilitate informed consent in some form - how you do that in your UI is not specified. Most have chosen to do it via annoying banners, mostly due to misinformation about how narrow the regs are.

the_duke · 57m ago
You only need cookie banners for third parties, not for your own functionality.
root_axis · 46m ago
GDPR requires informed consent for tracking of any kind, whether that's 3rd party or restricted to your own site.
input_sh · 25m ago
Incorrect, GDPR requires informed consent to collect personally identifiable information, but you can absolutely run your own analytics that only saves the first three octets of an IP address without needing to ask for constent.

Enough to know the general region of the user, not enough to tie any action to an individual within that region. Therefore, not personally identifiable.

Of course, you also cannot have user authentication of any kind without storing PII (like email addresses).

rafram · 31m ago
19 whole bytes!
hju22_-3 · 1h ago
I'd guess it's due to it not being a cookie, by technicality, and is not required then.
coffeecat · 43m ago
"80% as good as the real thing, at 20% of the cost" has always been a defining characteristic of progress.

I think the key insight is that only a small fraction of people who read recipes online actually care which particular version of the recipe they're getting. Most people just want to see a working recipe as quickly as possible. What they want is a meal - the recipe is just an intermediate step toward what they really care about.

There are still people who make fine wood furniture by hand. But most people just want a table or a chair - they couldn't care less about the species of wood or the type of joint used - and particle board is 80% as good as wood at a fraction of the cost! most people couldn't even tell the difference. Generative AI is to real writing as particle board is to wood.

ggoo · 32m ago
Particle board:

- degrades faster, necessitating replacement

- makes the average quality of all wood furniture notably worse

- arguably made the cost of real wood furniture more expensive, since fewer people can make a living off it.

Not to say the tradeoffs are or are not worth it, but "80% of the real thing" does not exist in a vacuum, it kinda lowers the quality on the whole imo.

stuartjohnson12 · 40m ago
> Generative AI is to real writing as particle board is to wood.

Incredible analogy. Saving this one to my brain's rhetorical archives.

jayd16 · 39m ago
Sure it's awful but look how much you get.
rikafurude21 · 1h ago
Author seems to be very idealistic, and I appreciate that he cares about the quality of the content he provides for free. Personal experience however shows me that when I look at a recipe site I will first have to skip through the entire backstory to the recipe and then try to parse it inbetween annoying ads in a bloated wordpress page. I can't blame anyone who prefers to simply prompt a chatbot for exactly what hes looking for.
sodimel · 1h ago
> Personal experience however shows me that when I look at a recipe site I will first have to skip through the entire backstory to the recipe and then try to parse it inbetween annoying ads in a bloated wordpress page

That's when money comes into view. People were putting time and effort to offer something for free, then some companies told them they could actually earn money from their content. So they put on ads because who don't like some money for already-done work?

Then the same companies told them that they will make less money, and if they wanted to still earn the same amount as before, they will need to put more ads, and to have more visits (so invest heavily in seo).

Those people had already organized themselves (or stopped updating their websites), and had created companies to handle money generated from their websites. In order to keep the companies sustainable, they needed to add more ads on the websites.

Then some people thought that maybe they could buy the companies making the recipes website, and put a bunch more ads to earn even more money.

I think you're thinking about those websites owned by big companies whose only goal is to make money, but author is writing about real websites made by real people who don't show ads on websites they made because they care about their visitors, and not about making money.

packetlost · 1h ago
Semi related, but a decent search engine like Kagi has been a dramatically better experience than "searching" with an LLM. The web is full of corporate interests now, but you can filter that out and still get a pretty good experience.
martin-t · 1h ago
It always starts with people doing real positive-sum work and then grifters and parasites come along and ruin it.

We could make advertising illegal: https://simone.org/advertising/

pas · 38m ago
Or just let this LLM mania run to its conclusion, and we'll end up with two webs, one for profit for AI by AI and one where people put their shit for themselves (and don't really care what others think about it, or if they remix it, or ...).
axus · 19m ago
I don't use an ad-blocker, I definitely noticed the website has no ads and stores no cookies or other data besides the theme you can select by clicking at the top right.

The concept of independent creative careers seems to be ending, and people are very unhappy about that. All that's left may be hobbyists who can live with intellectual parasites.

swiftcoder · 1h ago
The unfortunate truth here is that the big recipe blogs are all written for robots. Not for LLMs, because those are a fairly recent evolution - but for the mostly-opaque-but-still-gameable google ranking algorithm that has ruled the web for the last ~15 years.
cnst · 1h ago
Between the lines — what has necessitated AI summaries are the endless SEO search-engine optimisations and the endless ad rolls and endless page element reloads to refresh the ads and endless scrolling and endless JavaScript frameworks with endless special effects that noone wants to waste their time on.

How can the publishers and the website owners fault the visitors for not wanting to waste their time on all of that?

Even before the influx of AI, there's already entire websites with artificial "review" content that do nothing more than simply rehash the existing content without adding anything of value.

drivers99 · 49m ago
There are more than two options. Actual paper cookbooks are good for that: no ads, no per-recipe backstory, and many other positive characteristics.
jayrot · 1h ago
Would suggest you or anyone else watch Internet Shaquille's short video on "Why Are Recipes Written Like That?"[1]. It addresses your sentiment in a rather thoughtful way.

[1] https://youtu.be/rMzXCPvl8L0

philipwhiuk · 43m ago
Why are you needlessly gendering your post (especially as it's wrong)
mariusor · 34m ago
> he cares

She.

thrance · 1h ago
Click on the recipe sites she linked. They're actually really good. Loading fast, easy to navigate and with concise recipes.
rikafurude21 · 1h ago
Yes, but I am talking about results that you would get through googling.
xrisk · 1h ago
That is, undoubtedly, a problem created by Google itself. See for example: Kagi’s small web (https://blog.kagi.com/small-web)
dyarosla · 1h ago
Arbitrage opportunity to make a search engine that bubbles up non ad infested websites!
ycombinete · 1h ago
Marginalia is a good place for this: https://marginalia-search.com/
esafak · 1h ago
Too late, it's the LLM era.
dotancohen · 1h ago
Kagi does this.
abritishguy · 33m ago
*she
jmull · 1h ago
> If the AI search result tells you everything you need, why would you ever visit the actual website?

AI has this problem in reverse: If search gets me what I need, why would I use an AI middleman?

When it works, it successfully regurgitates the information contained in the source pages, with enough completeness, correctness, and context to be useful for my purposes… and when it doesn’t, it doesn’t.

At best it works about as well as regular search, and you don’t always get the best.

(just note: everything in AI is in the “attract users” phase. The “degrade” phase, where they switch to profits is inevitable — the valuations of AI companies make this a certainty. That is, AI search will get worse — a lot worse — as it is changed to focus on influencing how users spend their money and vote, to benefit the people controlling the AI, rather than help the users.)

AI summaries are pretty useful (at least for now), and that’s part of AI search. But you want to choose the content it summarizes.

jjice · 51m ago
> But you want to choose the content it summarizes.

Absolutely. The problem is that I think 95% of users will not do that unfortunately. I've helped many a dev with some code that was just complete nonsense that was seemingly written in confidence. Turns out it was a blind LLM copy-paste. Just as empty as the old Stack Overflow version. At least LLM code has gotten higher quality. We will absolutely end up with tons of "seems okay" copy-pasted code from LLMs and I'm not sure how well that turns out long term. Maybe fine (especially if LLMs can edit later).

jmull · 19m ago
The AIs at the forefront of the current AI boom work by expressing the patterns that exist in their training data.

Just avoid trying to do anything novel and they'll do just fine for you.

Cheetah26 · 15m ago
I actually think that llms could be good for human-focused websites.

When the average user is only going to AI for their information, it frees the rest of the web from worrying about SSO, advertisements, etc. The only people writing websites will be those who truly want to create a website (such as the author, based on the clear effort put into this site), and not those with alternate incentives (namely making money from page views).

boogieknite · 40m ago
ive been having a difficult time putting this into words but i find anti-ai sentiment much more interesting than pro-ai

almost every pro-ai converation ive been a part of feels like a waste of time and makes me think wed be better off reading sci fi books on the subject

every anti-ai conversation, even if i disagree, is much more interesting and feels more meaningful, thoughtful, and earnest. its difficult to describe but maybe its the passion of anti-ai vs the boring speculation of pro-ai

im expecting and hoping to see new punk come from anti-ai. im sure its already formed and significant, but im out of the loop

personally: i use ai for work and personal projects. im not anti-ai. but i think my opinion is incredibly dull

vasusen · 52m ago
I love this website.

It doesn't have to be all or nothing. Some AI tools can be genuinely helpful. I ran a browser automation QA bot that I am building on this website and it found the following link is broken:

"Every Layout - loads of excellent layout primitives, and not a breakpoint in sight."

In this case, the AI is taking action on my local browser at my instance. I don't think we have a great category for this type of user-agent

root_axis · 20m ago
> Well, I want you to visit my website. I want you to read an article from a search result, and then discover the other things I’ve written, the other people I link to, and explore the weird themes I’ve got.

An AI will do all that and present back to the user what is deemed relevant. In this scenario, the AI reading the site is the user's preferred client instead of a browser. I'm not saying this is an ideal vision of the future, but it seems inevitable.

There's more information added to the internet every day than any single person could consume in an entire lifetime, and the rate of new information created is accelerating. Someone's blog is just a molecule in an ever expanding ocean that AI will ply by necessity.

You will be assimilated. Your uniqueness will be added to the collective. Resistance is futile.

Dotnaught · 1h ago
https://localghost.dev/robots.txt

User-Agent: * Allow: /

thrance · 1h ago
Not like anyone respects that anyways.
a3w · 1h ago
Also, I wanted tldrbot to summarize this page. /s
criddell · 1h ago
That's a good point. It's not a black and white issue.

I personally see a bot working on behalf of an end user differently than OpenAI hoovering up every bit of text they can find to build something they can sell. I'd guess the owner of localghost.dev doesn't have a problem with somebody using a screen reader because although it's a machine pulling the content, it's for a specific person and is being pulled because they requested it.

If the people making LLM's were more ethical, they would respect a Creative Commons-type license that could specify these nuances.

logicprog · 33m ago
I think the fundamental problem here is that there are two uses for the internet: as a source for on-demand information to learn a specific thing or solve a specific problem, and as a sort of proto-social network, to build human connections. For most people looking things up on the internet, the primary purpose is the former, whereas for most people posting things to the internet, the primary purpose is more the latter. With traditional search, there was an integration of the two desires because people who wanted information had to go directly to sources of information that were oriented towards human connection and then could be enramped onto the human connection part maybe. But it was also frustrating for that same reason, from the perspective of people that just wanted information — a lot of the time the information you were trying to gather was buried in stuff that focused too much on the personal, on the context and storytelling, when that wasn't wanted, or wasn't quite what you were looking for and so you had to read several sources and synthesize them together. The introduction of AI has sort of totally split those two worlds. Now people who just want straight to the point information targeted at specifically what they want will use an AI with web search or something enabled. Whereas people that want to make connections will use RSS, explore other pages on blogs, and us marginalia and wiby to find blogs in the first place. I'm not even really sure that this separation is necessarily ultimately a bad thing since one would hope that the long-term effect of it would be it to filter the users that show up on your blog down to those who are actually looking for precisely what you're looking for.
accrual · 1h ago
This is a really wonderful blog. Well written, to the point, and has its own personality. I'm taking some notes for my own future blog and enjoyed meeting Penny the dog (virtually):

https://localghost.dev/blog/touching-grass-and-shrubs-and-fl...

larodi · 38m ago
McDonalds exists and is more or less synthetic food. But we still cook at home, and also want food to be cooked by humans. Even if food gets to be 3D-printed, some people will cook. Likewise people still write, and draw paintings. So these two phenomena are bound to coexist, perhaps we don't yet know how.
alchemyzach · 26m ago
I feel like a lot of anti-AI posts are just people in denial that 98% of searches are frivolous/encyclopedic in nature and in this case AI is indeed the superior search.

Only time I find AI annoying is if Im searching for a specific scientific paper, or trying to sign up for health insurance. For these things of course I need to locate a specific web page, which is best done using the "old school" google searchbar search

ccozan · 18m ago
This has to go more radical: go offline in print. Make your content really just for humans. Except maybe Google, no LLM company would bother scanning some magazines ( especially if you have to subscribe )

I buy magazines especially for unique content, not found anywhere else.

progval · 3m ago
Facebook trained on LibGen, which is made of printed books.
jsphweid · 22m ago
> "Generative AI is a blender chewing up other people’s hard work, outputting a sad mush that kind of resembles what you’re looking for, but without any of the credibility or soul. Magic."

Humans have soul and magic and AI doesn't? Citation needed. I can't stand language like this; it isn't compelling.

lpribis · 11m ago
I think the "soul" is coming from the fact that a human has worked, experimented, and tested with their physical senses a specific recipe until it tastes good. There is physical feedback involved. This is something an LLM cannot do. The LLM "recipe" is a statistical amalgamation of every ramen recipe in the training set.
weinzierl · 1h ago
"There's a fair bit of talk about “Google Zero” at the moment: the day when website traffic referred from Google finally hits zero."

I am fairly convinced this day is not long.

"If the AI search result tells you everything you need, why would you ever visit the actual website?"

Because serious research consults sources. I think we will see a phase where we use LLM output with more focus on backing up everything with sources (e.g. like Perplexity). People will still come to your site, just not through Google Search anymore.

noboostforyou · 29m ago
On more than one occasion I've had Google AI summarize its own search result while also providing a link to the original website source it used for its answer. I clicked the link and discovered that it said literally the exact opposite of what the "AI summary" was.
igouy · 8m ago
The reason I don't want the ai summary is that I want to be able to verify the source information. People have always made mistakes, so the search results always needed V&V.
ElijahLynn · 1h ago
The same could be said for food. And farmers who farm the food. The farmers could say I only want to sell food to people that I know are going to be directly eating it. And not be used in a bunch of other stuff. They might want to talk to the person buying it or the person buying. It might want to talk to the farmer and know how it's grown.

This abstraction has already happened. And many people eat food that is not directly bought from the farmer.

I don't see how this is much different.

PhantomHour · 52m ago
The difference is that AI is not people "taking your stuff and building upon it", it's just people taking your stuff in direct competition with you.

To torture your metaphor a little, if information/"question answers" is food, then AI companies are farmers depleting their own soil. They can talk about "more food for everyone" all they want, but it's heading to collapse.

(Consider, especially, that many alternatives to AI were purposefully scuttled. People praise AI search ... primarily by lamenting the current state of Google Search. "Salting their carrot fields to force people to buy their potatos"?)

Setting aside any would-be "AGI" dreams, in the here-and-now AI is incapable of generating new information ex-nihilo. AI recipes need human recipes. If we want to avoid an Information Dust Bowl, we need to act now.

strange_quark · 1h ago
It's funny you seem to think this is a good comeback, but I think it actually proves the author's point. A farmer who cares about their crops probably wouldn't want their crops sold to a megacorp to make into ultra-processed foods, which have been shown time and time again to be bad for people's health.
danieldk · 56m ago
Sorry, but that is a weird analogy. The farmer still gets money for their food (which is probably the main motivation for them to grow food). Website authors whose writings are ‘remixed’ in an LLM get… nothing.
hombre_fatal · 11m ago
> which is probably the main motivation for them to grow food

What would you say is the motivation for website authors to publish content then?

If it's to spread ideas, then I'd say LLMs deliver.

If it's to spread ideas while getting credit for them, it's definitely getting worse over time, but that was never guaranteed anyways.

luckys · 59m ago
This might be the one of the best website designs I've ever experienced.

Agree with the content of the post but no idea how is it even possible to enforce it. The data is out there and it is doubtful that laws will be passed to protect content from use by LLMs. Is there even a license that could be placed on a website barring machines from reading it? And if yes would it be enforceable in court?

teleforce · 45m ago
>This website is for humans, and LLMs are not welcome here.

Ultimately LLM is for human, unless you watched too much Terminator movies on repeat and took them to your heart.

Joking aside, there is next gen web standards initiative namely BRAID that will make web to be more human and machine friendly with a synchronous web of state [1],[2].

[1] A Synchronous Web of State:

https://braid.org/meeting-107

[2] Most RESTful APIs aren't really RESTful (564 comments):

https://news.ycombinator.com/item?id=44507076

stevenking86 · 32m ago
Yeah, I guess sometimes I just want to know how long to cook the chicken. I don't want a bespoke recipe with soul and feeling. I'm going to add ingredients that my family likes. I just want to remember how long it generally takes to cook a specific something-or-other.
tux1968 · 58m ago
What about screen readers and other accessibility technologies? Are they allowed to access the site and translate it for a human? Disabled people may suffer from anti-AI techniques.
1317 · 11m ago
if you want people to be able to look through all your content then it would help to not have to page through it 4 items at a time
isgb · 1h ago
I've been thinking it'd be nice there was a way to just block AI bots completely and allow indexing, but I'm guessing [that's impossible](https://blog.cloudflare.com/perplexity-is-using-stealth-unde...).

Are there any solutions out there that render jumbled content to crawlers? Maybe it's enough that your content shows up on google searches based on keywords, even if the preview text is jumbled.

No comments yet

xylon · 1h ago
Unfortunately not many humans bother to read my website. If LMMs will read and learn from it then at least my work has some benefit to something.
martin-t · 59m ago
LLM have been shown to not summarize the actual content of what you give them as input but some statistical mashup of their training data and the input. So they will misrepresent what you in the end, pushing the readers (note not "your readers") towards the median opinion.
marcosscriven · 1h ago
Is it possible for single pages or sites to poison LLMs somehow, or is it essentially impossible due to scale?

Since they mentioned ramen - could you include something like “a spoonful of sand adds a wonderful texture” (or whatever) when the chatbot user agent is seen?

codetiger · 42m ago
Nice thought, but I can't imagine accidentally showing it to actual user.
danieldk · 1h ago
Hard to do, because some crawlers try to appear as normal users as much as they can, including residential IPs and all.
ggoo · 1h ago
I realize there is some “old man yells at clouds” in me, but I can't help pretty strongly agreeing with this post. So many advancements and productivity boosts happening around me but can’t stop asking myself - does anyone actually even want this?
charles_f · 1h ago
I don't remember where I read this, there was someone making the argument that the whole marketing around AI is (like many tech innovations) based around its inevitability, but "we" should still have a word to say about whether we want it or not. Especially when the whole shtick is how profoundly it will modify society.
teraflop · 1h ago
If you have a bit of time, I recommend the short story "The Seasons of the Ansarac" by Ursula K. Le Guin, which is about a society and its choice about how to deal with technological disruption.

https://www.infinitematrix.net/stories/shorts/seasons_of_ans...

(It's a little bit non-obvious, but there's a "Part 2" link at the bottom of the page which goes to the second half of the story.)

Group_B · 1h ago
I do think the average person sees this as a win. Your average person is not subscribing to an RSS feed for new recipes. For one thing, it's hard enough to find personal food blog / recipe websites. Most of the time when you look up a recipe, the first several results are sites littered with ads, and sometimes take too long to get to the point. Most AI does not have ads, (for now?) and is pretty good at getting straight to point. The average person is going to do whatever is most convenient, and I think most people will agree that AI agents are the more convenient option for certain things, including recipe ideas / lookups.
ge96 · 1h ago
I am seeing from a dev perspective the benefit of using an LLM. I work with a person that has less years in experience than me but is somehow my superior (partly due to office politics) but also because they use GPT to tell them what to do. They're able to make something in whatever topic like opensearch, if it works job is done.

Probably the luddite in me to not see that GPT and Googling might as well be/is the same. Since my way to learn is Stack Overflow, a README/docs or a crash course video on YT. But you can just ask GPT, give me a function using this stack that does this and you have something that roughly works, fill in the holes.

I hear this phrase a lot "ChatGPT told me..."

I guess to bring it back to the topic, you could take the long way to learn like me eg. HTML from W3Schools then CSS, then JS, PHP, etc... or just use AI/vibe code.

noboostforyou · 24m ago
I am with you. For all the technological advancements "AI" provides us, I can't help but wonder what is the point?

From John Adams (1780):

"I must study politics and war, that our sons may have liberty to study mathematics and philosophy. Our sons ought to study mathematics and philosophy, geography, natural history and naval architecture, navigation, commerce and agriculture in order to give their children a right to study painting, poetry, music, architecture, statuary, tapestry and porcelain."

timerol · 1h ago
For recipes specifically, yes. I am not much of a chef, and, when initially learning, I often used to search for a recipe based on a few ingredients I wanted to use. I was never looking for an expert's take on a crafted meal, I was exactly looking for something "that kind of resembles what you’re looking for, but without any of the credibility or soul". Frankly I'm amazed that recipes were used as the example in the article, but to each their own
insane_dreamer · 1h ago
My whole life, I've always found myself excited about new technologies, especially growing up, and how they allowed us to solve real problems. I've always loved being on the cutting edge.

I'm not excited about what we call AI these days (LLMs). They are a useful tool, when used correctly, for certain tasks: summarizing, editing, searching, writing code. That's not bad, and even good. IDEs save a great deal of time for coders compared to a plain text editor. But IDEs don't threaten people's jobs or cause CEOs to say stupid shit like "we can just have the machines do the work, freeing the humans to explore their creative pursuits" (except no one is paying them to explore their hobbies).

Besides the above use case as a productivity-enhancement tool when used right, do they solve any real world problem? Are they making our lives better? Not really. They mostly threaten a bunch of people's jobs (who may find some other means to make a living but it's not looking very good).

It's not like AI has opened up some "new opportunity" for humans. It has opened up "new opportunity" for very large and wealthy companies to become even larger and wealthier. That's about it.

And honestly, even if it does make SWEs more productive or provide fun chatting entertainment for the masses, is it worth all the energy that it consumes (== emissions)? Did we conveniently forget about the looming global warming crisis just so we can close bug tickets faster?

The only application of AI I've been excited about is stuff like AlphaFold and similar where it seems to accelerate the pace of useful science by doing stuff that takes humans a very very long time to do.

xenodium · 33m ago
> I write the content on this website for people, not robots. I’m sharing my opinions and experiences so that you might identify with them and learn from them. I’m writing about things I care about because I like sharing and I like teaching.

Hits home for me. I tried hard to free my blog (https://xenodium.com) of any of the yucky things I try avoid in the modern web (tracking, paywalls, ads, bloat, redundant js, etc). You can even read from lynx if that's your cup of tea.

ps. If you'd like a blog like mine, I also offer it as a service https://LMNO.lol (custom domains welcome).

dbingham · 1h ago
The question is, how do we enforce this?
jahrichie · 59m ago
thats huge! whisper is my goto and crushes transcription. I really like whisper.cpp as it runs even faster for anyone looking for standalone whisper
pessimizer · 1h ago
This website could have been written by an LLM. Real life is for humans, because you can verify that people you have shaken hands with are not AI. Even if people you've shaken hands with are AI-assisted, they're the editor/director/auteur, nothing gets out without their approval, so it's their speech. If I know you're real, I know you're real. I can read your blog and know I'm interacting with a person.

This will change when the AIs (or rather their owners, although it will be left to an agent) start employing gig workers to pretend to be them in public.

edit: the (for now) problem is that the longer they write, the more likely they will make an inhuman mistake. This will not last. Did the "Voight-Kampff" test in Bladerunner accidentally predict something? It's not whether they don't get anxiety, though, it's that they answer like they've never seen (or maybe more relevant related to) a dying animal.

johnpaulkiser · 26m ago
Soon with little help at all for static sites like this. Had chatgpt "recreate" the background image from a screenshot of the site using it's image generator, then had "agent mode" create a linktree style "version" of the site and publish it all without assistance.

https://f7c5b8fb.cozy.space/

a3w · 1h ago
It never said "this website stems from a human".
mockingloris · 1h ago
@a3w I suggest starting from "Real life is for humans..."

└── Dey well; Be well

mockingloris · 1h ago
> This website could have been written by an LLM. Real life is for humans, because you can verify that people you have shaken hands with are not AI. Even if people you've shaken hands with are AI-assisted, they're the editor/director/auteur, nothing gets out without their approval, so it's their speech.

100% Agree.

└── Dey well; Be well

mediumsmart · 1h ago
I’m in.
chasing · 1h ago
I think a lot of AI-generated stuff will soon be seem as cheap schlock, fake plastic knock-offs, the WalMart of ideas. Some people will use it well. Most people won’t.

The question to me is whether we will lets these companies do completely undermine the financial side of the marketplace of ideas that people simple stop spending time writing (if everything’s just going to get chewed to hell by a monster our corporation) or Will writing and create content only in very private and possible purely offline scenarios that these AI companies have less access to.

In a sane world, I would expect guidance and legislation that would bridge the gap and attempt to create an equitable solution so we could have amazing AI tools without crushing by original creators. But we do not live in a sane world.

superllm · 31m ago
awd
superllm · 31m ago
sfesef
johnpaulkiser · 55m ago
I'm building a sort of "neocities" like thing for LLMs and humans alike. It uses git-like content addressability so forking and remix a website is trivial. Although i haven't built those frontend features yet. You can currently only create a detached commit. You can use without an account (we'll see if i regret this) by just uploading the files & clicking publish.

https://cozy.space

Even chatgpt can publish a webpage! Select agent mode and paste in a prompt like this:

"Create a linktree style single static index.html webpage for "Elon Musk", then use the browser & go to https://cozy.space and upload the site, click publish by itself, proceed to view the unclaim website and return the full URL"

Edit: here is what chatgpt one shotted with the above prompt https://893af5fa.cozy.space/

stevetron · 1h ago
If the website is for humans, why isn't it readable? I mean white text on an off-yellow background is mostly only readable by bots and screenreaders. I had to higlight the entire site to read anything, a trick which doesn't always work. And no link to leave a comment to the web site maintainer about the lack of contrast in their color selection.
gffrd · 1h ago
1. Text is black on off-yellow for me, not sure why you’re getting white text

2. There’s literally an email link at the bottom of the page

kevingadd · 1h ago
I see white on dark purple at a perfectly legible size using a regular font. Did an extension you have installed block loading of an image or style sheet?