So…ChatGPT can recreate ye olde Dreamweaver templates but with exponentially more resources consumed, and this is a good thing?
I don’t think we need to foist such basic tasks onto a prediction machine, so much as we just need to go back to making competent software for end users again.
Like sure, it’s a neat concept, but man I just do not see the value of this as opposed to prior, lighter, and easier methods of static site generation. If anything, I see an author lamenting the lack of consumer options for site building that don’t involve extortionate subscriptions to overly powerful tools, and trying to reframe ChatGPT as some form of godsend of simplicity when it…kinda isn’t.
bko · 5h ago
> So…ChatGPT can recreate ye olde Dreamweaver templates but with exponentially more resources consumed, and this is a good thing?
The ultimate resource is human time and effort. Why should I care about the efficiency of the underlying process? That's captured by the price. So considering an LLM call like this likely costs a few pennies, if that much, and saves me even a few seconds, I would say it's worth it.
There are costs to optimization. We don't need to optimize everything. This is a nice general solution.
stego-tech · 5h ago
> Why should I care about efficiency of the underlying process?
Because, as you just said yourself…
> The ultimate resource is human time and effort.
This is a disconnect I don’t think most people appreciate. Developers seem to operate as if Moore’s law will continue forever, that components are infinite, that cloud resources appear and disappear at the snap of their fingers, and that optimization is pointless in an era of excess.
Meanwhile, the actual engineers are building out new data centers to support these fancy prediction machines, supply chains are exploiting labor abroad for the raw materials necessary to power these guessing boxes, and we’re tearing our hair out at developers casually demanding dozens of CPU cores or terabytes of memory for their newest product specifically because they did no optimizations.
Actual humans - millions of them, in and outside of technology fields - are working in concert to support the least optimized software product in human history, just so you can squander water, energy, and land to run inference on a farm of GPUs to output a static website that Microsoft Word could generate in 2003 on a Pentium 4.
Jesus christ I am sick of this nonsensical argument that because something is cheap, it is somehow optimized and/or superior.
nullwarp · 5h ago
I don't really see where it saves any time. You still have to build the initial template. Then every time you post you go copy and paste that template into ChatGPT?
I guess you could write a script that reads your template and body, submits the whole thing to the ChatGPT API, and then dumps out the file. But then you also have to double-check the whole thing again to make sure it didn't mess anything up or change any text.
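That script might look something like this sketch (assuming the official `openai` Python client; the file names and model name are illustrative, not anything from the post):

```python
from pathlib import Path


def build_prompt(template: str, body: str) -> str:
    """Combine the HTML template and the raw post text into a single prompt."""
    return (
        "Insert the following post into this HTML template. "
        "Return only the finished HTML and do not alter the post text.\n\n"
        f"TEMPLATE:\n{template}\n\nPOST:\n{body}"
    )


def generate_page(template_path: str, body_path: str) -> str:
    """Send template + post to the API and return the generated page.

    Requires `pip install openai` and OPENAI_API_KEY set in the environment.
    """
    from openai import OpenAI  # imported here so build_prompt stays dependency-free

    client = OpenAI()
    prompt = build_prompt(
        Path(template_path).read_text(),
        Path(body_path).read_text(),
    )
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model name
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

Even with a script like this, the caveat stands: you'd still want to diff the returned HTML against the original post text to catch silent rewrites.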
But then all you do is invent a shittier SSG that costs more money and more resources for literally no benefit.
Seems dumb
profsummergig · 5h ago
I agree. A goal of humanity should be to make energy too cheap to meter. Using AI will help us get there. Every minute OP saves using ChatGPT instead of a text editor, OP can devote to other things that could incrementally get us there.
chneu · 3h ago
This is willful ignorance disguised as hopefulness.
OP isn't going to save the world. Them saving a few minutes isn't going to free them up to discover unlimited energy. This is just excess consumption masked as something else.
This is just more excuses to justify laziness. "It's worth it cuz one day I'll solve world hunger."
Add it all up = trillions of hours saved.
Ergo, plenty more time to work on facilitating energy too cheap to meter.
(You won't understand. It's okay. You lack the imagination and mindset necessary to understand.)
navane · 24m ago
AGI will invent a time machine that will take us back to when we hadn't burned up the oil yet. Infinite oil!
profsummergig · 1m ago
Not surprising that you didn't understand my point. (You won't understand. It's okay. You lack the imagination and mindset necessary to understand.)
satiated_grue · 2h ago
Tragedy of the commons. Much of the cost is not borne by the purchaser; the true cost of pollution (including greenhouse gases and global warming) is placed on others who had no say in the matter.
xnorswap · 5h ago
I'm in two minds whether the OP is satire.
This quote in particular sells me on the idea of satire:
> Almost anything that relies on structured input would be more convenient with an AI solution
You could take what you know about that structure and deterministically transform it, or you could just vibe it over to ChatGPT! That doesn't seem like a serious statement at all.
But this feels real:
> I looked into some other generators: Jekyll, Pelican, and so on. But everything seemed to have two problems. First, most of these seemed overkill for what I wanted to do. And second, despite the overhead, many didn’t seem to provide the kind of flexibility I wanted — they relied on writing strict Markdown or fitting within a predefined way of structuring the site.
That's an understandable problem. The purists will suggest, "Just write HTML", but markdown is convenient.
For anyone facing this issue, I'd suggest pandoc, which does a great job of transforming markdown to HTML. One of the benefits of HTML was that it would work around what it considered "bad" input. You didn't even need to close tags properly a lot of the time. If you made an error, you'd still get mostly correct output. Pandoc markdown isn't too fussy either. There's a markdown_strict variant if you want strict spec adherence, and a gfm (GitHub-flavoured markdown) option for those of us used to the GitHub extensions.
Pandoc parses a small markdown page into html in, well:
Milliseconds : 71
So, certainly faster than any LLM response time.
npilk · 4h ago
Perhaps my statement in the first quote is a bit unclear...
Markdown is easier to write than raw HTML, and deterministically transforms. But writing strict Markdown (or using any specific syntax) creates an efficiency penalty - e.g., how do I format a URL again? How do I insert an image? Guess I'll have to look that up. What if I want to use more advanced CSS formatting? I'll have to update the HTML afterwards by hand.
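(For reference, the two forms in question are standard Markdown; the URL and path here are placeholders:)

```markdown
[link text](https://example.com)
![alt text](images/photo.png)
```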
In contrast, using an LLM to format means I can put any kind of text in and the machine will just figure it out for me.
That's what I mean by "more convenient".
profsummergig · 5h ago
In the late 90's there were free WYSIWYG apps that generated reliable HTML/CSS. I did a deep search recently into finding the Microsoft FrontPage equivalent of today, and the only one I could find that worked reliably was Adobe Dreamweaver, and it's not free.
(Kompozer, Brackets, SeaMonkey, BlueGriffon, etc. did not work reliably for me. I found that TinyMCE etc. are WYSIWYG for text, but not for positioning [i.e. they are "rich text editors"])
Why did such free WYSIWYG apps to generate HTML/CSS die out? Anyone have theories?
stanac · 5h ago
They went commercial (Wix, Squarespace, ...). Commercial offerings solve not only building but also hosting, while the old WYSIWYG apps solved only building. Now, I haven't used any of these apps (old or new), so I may be wrong, but it would be great to have something that is free/OSS and easier to use than SSGs.
stego-tech · 5h ago
Because they enabled too many people to have a web presence independent of social media, and therefore outside of the technosurveillance apparatus of Google/Meta/Amazon/Microsoft.
There’s a healthy appetite for WYSIWYG editors still, but nobody wants to make anything that simple anymore. It’s all about building moats, using the latest frameworks, and ultra-slick designs with spy pixels and a deluge of cookies. Everything must be in a CMS, on a hosting provider, with load balancers and CDNs, and saddled with pop-ups, pop-ins, chat boxes, e-mail signups, metric collection, data hoarding, and third-party tie-ins.
lobsterthief · 5h ago
Just like the websites that died out, I think people back then made these editors as passion projects they hoped would turn into something more (or not).
I remember one I used in the early 2000s called Selida. It had this bug where it would crash randomly on save and I’d lose all of my work. I didn’t know any better and just kept using it. The end result is I had to retype my HTML so many times that it hammered it into my head until I was proficient :) This was a WYSIWYG with a code editor view (like Dreamweaver).
Dachande663 · 5h ago
I remember a lot of them initially struggled with the switch to CSS (but then got much better) and then eventually responsive design just hit them hard. Normal users just didn't know how to handle making things flow and the tools struggled to explain the necessary primitives to them. IMO.
c22 · 5h ago
I think even if you created your site with Frontpage there was still a steep learning curve to getting that content hosted. Now there are many hosts that flatten that curve but they tend to come with their own templates/site generators which are good enough for most users. For power users who want more control HTML/CSS is not that hard to learn.
xnorswap · 5h ago
In the very early days there wasn't. The ISP provided web hosting and free FTP space.
It was literally just a case of going to (from memory, the details might be wrong) ftp://user:password@web.pipex.co.uk/, and then putting whatever you wanted into the ~/public_html directory there.
Browsers even started bringing out FTP support so you didn't need to find a client, although I found it more reliable to use WS_FTP.
5-20mb of free hosting for every ISP customer was just something ISPs did back then. Everyone was more innocent (or naive, depending how you look at it), but there wasn't much of a barrier to getting a static site up and running, and there really wasn't too much of a learning curve beyond, "Put your files here".
DogRunner · 2h ago
FTP, Email, Webspace and newsgroups access was the norm. good old times at uunet ...
Toritori12 · 5h ago
IIRC they generated really bad, unmaintainable code (no CSS classes, absolute positioning, etc.), unlike (IMO) modern LLMs.
nullwarp · 5h ago
In my experience, LLMs happily generate just as much unmaintainable code as Dreamweaver ever did.
nullwarp · 5h ago
Seems extremely... wasteful?
Why use a massive, energy-sucking LLM for something that could take next to no CPU?
stego-tech · 5h ago
Because in the current tech cycle, everything must be a subscription forever, controlled within the walled garden of a service provider or building upon some other complex infrastructure or framework to let someone justify their expertise in said technology to themselves.
Static site generation really should be as simple as “Export to HTML”, and upload to your web server. The fact it’s not anymore shows that none of these are about “democratizing” anything, but just locking people into prisons of their own making with highly specific tools.
Yeah, looking at that blog source, I can definitely tell it's been generated by an LLM.
```
<!-- Footer -->
<footer class="py-3" style="margin-top:5rem;">
```
npilk · 4h ago
What makes you say this? Using inline CSS styling alongside Bootstrap classes? Applying both vertical padding and margin?
(For what it's worth, that was part of the handwritten template already, not something the LLM generated on its own.)
Deukhoofd · 1h ago
Adding absolutely useless comments everywhere. In this case, adding a footer comment above a footer element.
blendergeek · 5h ago
I've been doing this for a few years now. It usually works, but occasionally it rewrites the posts or forgets what it is doing. Really frustrating, in my opinion.
b0a04gl · 5h ago
needed a quick internal tool page, typed the prompt into it during a meeting. by the time the call ended, had a working layout with header, form, and footer. didn't ship it as-is, but 80% was usable without edits. quietly started using it as my default scratchpad
tantalor · 5h ago
Missing "[joke]" tag.
deadbabe · 5h ago
Why stop at static sites?
ChatGPT could develop your dynamic website too, generating each page per request.
msgodel · 4h ago
LLM in a CGI script sounds like a chaos monkey by another name.