XSLT – Native, zero-config build system for the Web

172 points by _kush on 6/27/2025, 5:00:41 AM | github.com

Comments (110)

badmintonbaseba · 1h ago
I have worked for a company that was (and probably still is) heavily invested in XSLT for XML templating. It's not good, and they would probably migrate away from it if they could.

  1. Even though there are newer XSLT standards, XSLT 1.0 is still dominant. It is quite limited and weird compared to the newer standards.

  2. Resolving performance problems of XSLT templates is hell. XSLT is a Turing-complete functional-style language, with performance very much abstracted away. There were XSLT templates that worked fine for most documents, but then one document came in with a ~100 row table and it blew up. Turns out the template that processed the table was O(N^2) or worse, without any obvious way to optimize it (it might even have had an XPath on each row that was itself O(N) or worse). I don't know exactly how it manifested, but as I recall the document took more than 7 minutes to process. (A sketch of the quadratic pattern, and the usual fix, is below.)
JS might have other problems, but not being able to resolve algorithmic complexity issues is not one of them.
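For illustration, a minimal sketch of the quadratic pattern and the usual `xsl:key` fix. Element and attribute names are invented, and the two templates are alternatives, not meant to coexist:

    <!-- O(N^2): for every row, scan all rows to find its match -->
    <xsl:template match="row">
      <xsl:value-of select="//row[@id = current()/@ref]/@label"/>
    </xsl:template>

    <!-- Near-linear alternative: index the rows once, then look them up -->
    <xsl:key name="row-by-id" match="row" use="@id"/>
    <xsl:template match="row">
      <xsl:value-of select="key('row-by-id', @ref)/@label"/>
    </xsl:template>

Whether `key()` is actually backed by an index is up to the processor, which is exactly the "performance abstracted away" complaint.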
mark_and_sweep · 42s ago
From my experience, most simple websites are fine with XSLT 1.0 and don't experience any performance problems.
bux93 · 4m ago
Are you using the commercial version of Saxon? It's not expensive, and IMHO worth it for the features it supports (including the newer standards) and the performance. If I remember correctly (it was a long time ago) it does some clever optimizations.
woodpanel · 23m ago
Same here.

I've seen a couple of blue-chip websites that could be completely taken down just by requesting the sitemap (more than once per minute).

PS: That being said, it is an implementation issue. But it may speak for itself that 100% of the XSLT projects I've seen had it.

p0w3n3d · 2h ago
Ok, so it might be a long shot, but I would say that

1. the browsers were inconsistent in the 1990s-2000s, so we started using JS to make them behave the same

2. meanwhile the only things we actually needed were good CSS styles (which were not yet there) and consistent behaviour

3. over the years the browsers started behaving the same (mainly because Highlander rules apply - there can be only one - but Firefox is also coping well)

4. but we had already gotten used to having frameworks that would make pages look the same in all browsers. Also, the paradigm switched to rendering JSON data client-side

5. with current technology we could cope with server-generated old-school web pages, because they would have a low footprint, work faster and require less memory.

Why do I say that? Recently we started working on a migration from a legacy system. It looks like the 2000s standard: one page per HTTP request. Every action (add, remove, etc.) requires an HTTP refresh. However, it works much faster than our React system. Because:

1. Nowadays the internet is much faster

2. Phones have a lot of memory, which is wasted by JS frameworks

3. in the backend it's almost the same old story - CRUD, CRUD and CRUD (+ pagination, + transactions)

ozim · 6m ago
AJAX and updating the DOM wasn't there just to "make things faster"; it was there to change the paradigm of "web sites" or "web documents" - because the web was originally for displaying documents. A full page reload makes sense if you are working in a document paradigm.

It works well here on HN for example as it is quite simple.

There are a lot of other examples where people most likely should do a simple website instead of using JS framework.

But "we could all go back to full page reloads" is not true, as there really are proper "web applications" out there for which full page reloads would be a terrible UX.

To summarize there are:

"websites", "web documents", "web forms" that mostly could get away with full page reloads

"web applications" that need complex stuff presented and manipulated while full page reload would not be a good solution

viraptor · 1h ago
That timeline doesn't sound right to me. JS was rarely used to standardise behaviour - we had lots of user-agent detection and relied on quirks ordering to force the right layout. JS really was for interactivity at the beginning - DHTML and later AJAX. I don't think it even had easy access to layout-related things? (I may be mistaken though.) CSS didn't really make things more consistent either - once it became capable, it was still a mess. Sure, CSS Zen Garden was great and everyone was so impressed with semantic markup while coding tables everywhere. It took ages for anything to actually pass the first two Acid tests. I'm not sure frameworks ever really impacted the "consistent looks" side of things - by the time we grew out of jQuery, CSS was the looks thing.

Then again, it was a long time. Maybe it's me misremembering.

jonwinstanley · 1h ago
For me, jQuery was the thing that fixed the browser inconsistencies. If you used jQuery for everything, your code worked in all the browsers.

This was maybe 2008?

Cthulhu_ · 15m ago
Before jQuery there was Prototype.js, part of early AJAX support in RoR, which fixed inconsistencies in how browsers could fetch data, especially in the era between IE 5 and 7 (a native JS `XMLHttpRequest` was only available from IE 7 onwards; before that it was some ActiveX thing, while the other browsers supported it from the get-go). My memory is vague, but it also added stuff like selectors, and on top of that was script.aculo.us, which added animations and other such fanciness.

jQuery took over very quickly though for all of those.

JimDabell · 1h ago
jQuery in ~2008 was when it kinda took off, but jQuery was itself an outgrowth of work done before it on browser compatibility with JavaScript. In particular, events.

Internet Explorer didn’t support DOM events, so addEventListener wasn’t cross-browser compatible. A lot of people put work in to come up with an addEvent that worked consistently cross-browser.

The DOMContentLoaded event didn’t exist, only the load event. The load event wasn’t really suitable for setting up things like event handlers because it would wait until all external resources like images had been loaded too, which was a significant delay during which time the user could be interacting with the page. Getting JavaScript to run consistently after the DOM was available, but without waiting for images was a bit tricky.

These kinds of things were iterated on in a series of blog posts from several different web developers. One blogger would publish one solution, people would find shortcomings with it, then another blogger would publish a version that fixed some things, and so on.

This is an example of the kind of thing that was happening, and you’ll note that it refers to work on this going back to 2001:

https://robertnyman.com/2006/08/30/event-handling-in-javascr...

When jQuery came along, it was really trying to achieve two things: firstly, incorporating things like this to help browser compatibility; and second, to provide a “fluent” API where you could chain API calls together.

benediktwerner · 1h ago
Wasn't it more about inconsistencies in JS though? For stuff which didn't need JS at all, there also shouldn't be much need for jQuery.
dspillett · 1m ago
jQuery, along with a number of similar attempts and more single-item-focused polyfills¹ was as much about DOM inconsistencies as JS ones. It was also about making dealing with the DOM more convenient² even where it was already consistent between commonly used browsers.

DOM manipulation of that sort is JS-dependent, of course, but I think considering language features and the environment, like the DOM, to be separate-but-related concerns is valid. There were less-kitchen-sink-y libraries that concentrated only on language features or specific DOM features. Some may even consider a few parts a third section: the standard library, though that feature set might be rather small (not much more than the XMLHttpRequest replacement/wrappers?) to consider it its own thing.

> For stuff which didn't need JS at all, there also shouldn't be much need for jQuery.

That much is mostly true, as it by default didn't do anything to change non-scripted pages. Some polyfills for static HTML (for features that were inconsistent, or missing entirely in, usually, old-IE) were implemented as jQuery plugins though.

--------

[1] Though I don't think they were called that back then, the term coming later IIRC.

[2] Method chaining³, better built-in searching and filtering functions⁴, and so forth.

[3] This divides opinions a bit though was generally popular, some other libraries did the same, others tried different approaches.

[4] Which we ended up coding repeatedly in slightly different ways when needed otherwise.

jbverschoor · 1h ago
Probably 2005.

In 2002, I was using "JSRS", and returning HTTP 204 (No Content), which causes the browser to NOT refresh/load the page.

Just for small interactive things, like a start/pause button for scheduled tasks. The progress bar etc.

But yeah, in my opinion we lost about 15 years of proper progress.

"The network is the computer" came true.

The SUN/JEE model is great.

It’s just that monopolies stifle progress and better standards.

Standards are pretty much dead, and everything is at the application layer.

That said... I think XSLT sucks, although I haven't touched it in almost 20 years. On the projects I was on, there was this designer/XSLT guru. He could do anything with it.

XPath is quite nice though

JimDabell · 56m ago
> But yeah, in my opinion we lost about 15 years of proper progress.

Internet Explorer 6 was released in 2001 and didn’t drop below 3% worldwide until 2015. So that’s a solid 14 years of paralysis in browser compatibility.

middleagedman · 15m ago
Old guy here. Agreed- the actual story of web development and JavaScript’s use was much different.

HTML was the original standard, not JS. HTML was evolving early on, but the web was much more standard than it is today.

The early-to-mid 1990s web was awesome. HTML was served over HTTP, and pages used header tags, text, hr, then some background color variation and images. CGI in a cgi-bin dir was used for server-side functionality, often written in Perl or C: https://en.m.wikipedia.org/wiki/Common_Gateway_Interface

Back then, if you learned a little HTML, you could serve up audio, animated gifs, and links to files, or Apache could just list files in directories to browse like a fileserver, without any search. People might get a friend to let them have access to their server and put content up on it, or their university's, etc. You might be on a server where they had a cgi-bin script or two to email people or save/retrieve from a database, etc. There was also mailto in addition to href for the a (anchor) tag, so you could just put your email address there.

Then a ton of new things appeared. PHP on the server side. JavaScript came out but wasn't used much except for a couple of party tricks. ColdFusion on the server side. Around the same time was VBScript, which was nice but just for IE/Windows; still, it was big. Perl and then PHP were also big on the server side. If you installed Java you could use applets, which were neat little applications on the page. Java Web Server came out server-side and there were JSPs. Java Tomcat came out on the server side. ActionScript came out to basically replace VBScript but do it server-side with ASPs. VBScript support went away.

During this whole time, JavaScript had just evolved into more party tricks and things like form validation. It was fun, but it was PHP, ASP, JSP/Struts/etc. on the server side in the early 2000s, with Rails coming out and ColdFusion mostly going away. Facebook was PHP in the mid-2000s, with the LAMP stack, etc. People were breaking up images using tables, and CSS was coming out with slow adoption. It wasn't until the mid-to-late 2000s that JavaScript started being used much for UI, with Google's fostering of it and development of V8, when it was taken more seriously (it was slow before then). And when it finally got big, there was an awful several years of framework-after-framework super-JavaScript ADHD, which drove a lot of developers to leave web development, because of the move from server-side to client-side, along with NoSQL DBs and seemingly stupid things happening like client-side credential storage, ignoring ACID for data, etc.

So - all that to say, it wasn't until 2007-2011 that JS took off.

bob1029 · 21m ago
> with current technology we could cope with server-generated old-school web pages, because they would have a low footprint, work faster and require less memory

I've got a .NET/Kestrel/SQLite stack that can crank out SSR responses in no more than ~4 milliseconds. Average response time is measured in hundreds of microseconds when running release builds. This is with multiple queries per page, many using complex joins to compose view-specific response shapes. Getting the data in the right shape before interpolating HTML strings can really help with performance in some of those edges like building a table with 100k rows. LINQ is fast, but approaches like materializing a collection per row can get super expensive as the # of items grows.

The closer together you can get the HTML templating engine and the database, the better things will go, in my experience. At the end of the day, all of that fancy structured DOM is just a stream of bytes that needs to be fed to the client. Worrying about elaborate AST/parser approaches when you could just use StringBuilder and clever SQL queries has created an entire pointless, self-serving industry. The only arguments I've ever heard against something approximating this boil down to arrogant security hall monitors who think developers can't be trusted to use the HTML escape function properly.

em-bee · 1h ago
> with current technology we could cope with server-generated old-school web pages, because they would have a low footprint, work faster and require less memory.

unless you have a high latency internet connection: https://news.ycombinator.com/item?id=44326816

p0w3n3d · 1h ago
however, when you have a high-latency connection, the "thick client" JSON-filled webapp only has an advantage if most of the business logic happens in the browser. E.g. Google Docs - great, and much better than it used to be in the 2000s design style. An application that searches for apartments to rent? Not really, I would say.

-- edit --

by the way, in 2005 I programmed using a very funny PHP framework, PRADO, that sent every change in the UI to the server. Boy, it was slow and server-heavy. That was a direction we should never have gone...

em-bee · 1h ago
> An application that searches for apartments to rent? Not really, I would say.

not a good example. i can't find it now, but there was a story/comment about a realtor app that people used to sell houses. often when they were out with a potential buyer they had bad internet access, and loading new data and pictures for houses was a pain. it wasn't until they switched to a frontend framework that preloaded everything, with occasional updates, that the app became usable.

high latency affects any interaction with a site. even hackernews is a pain to read over a high-latency connection and would improve if new comments were loaded in the background. the problem creeps up on you faster than you think.

catmanjan · 51m ago
Lol, you'd hate to see what Blazor is doing then
Tade0 · 44m ago
Or Phoenix.LiveView for that matter.
bayindirh · 21m ago
People love to complain about the verbosity of XML, and it looks complicated from a distance, but I love how I can create a good file format based on XML, validate it with a DTD, and format it with XSLT if I need to make it very human-readable.

XML is the C++ of text-based file formats if you ask me. It's mature, batteries-included, powerful, and can be used from any language you prefer.

Like old and mature languages with their own quirks, it's sadly fashionable to complain about it. If it doesn't fit the use case, it's fine, but treating it like an abomination is not.
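A tiny sketch of the combination bayindirh describes - one self-contained file validated by an inline DTD and rendered by a stylesheet (file and element names invented):

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="notes.xsl"?>
    <!DOCTYPE notes [
      <!ELEMENT notes (note*)>
      <!ELEMENT note (#PCDATA)>
      <!ATTLIST note date CDATA #REQUIRED>
    ]>
    <notes>
      <note date="2025-06-27">Validated by the DTD, rendered by notes.xsl.</note>
    </notes>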

CiaranMcNulty · 2h ago
It's sad how the bloat of '00s enterprise XML made the tech seem outdated and drove everyone to 'cleaner' JSON, because things like XSLT and XPath were very mature and solved a lot of the problems we still struggle with in other formats.

I'm probably guilty of some of the bad practice: I have fond memories of (ab)using XSLT includes back in the day with PHP stream wrappers to have stuff like `<xsl:include href="mycorp://invoice/1234">`

This may be out-of-date bias, but I'm still a little uneasy letting the browser do the transform locally, just because it used to be a minefield of incompatibility.

aitchnyu · 56s ago
In The Art of Unix Programming (2003), the author advocated bespoke text formats and writing parsers for them. Writing XML by hand is on his list of war crimes. Since then, syntax highlighting, autocomplete and autoformatting have narrowed the effort gap, and tolerant parsers (browsers being the main example) got a bad rap. Would Markdown and YAML exist with modern editors?
Cthulhu_ · 11m ago
It's been 84 years, but I still miss some of the "basics" of XML in JSON - a proper standards organization, for one. And things like schemas were (or at least felt) so much better defined in XML land; it took nearly a decade for JSON land to catch up.

The last thing I really did with XML was a technology called EXI, a transfer method that converted an XML document into a compressed binary data stream. Because translating a data structure to ASCII, compressing it, sending it over HTTP, and doing the same thing in reverse is a bit silly. At this point protobuf and co are more popular, but imagine if XML had stayed around. It's all compatible standards working with each other (in my idealized mind), whereas there's a hard barrier between e.g. protobuf/gRPC and JSON APIs. Possibly for the better?

rwmj · 56m ago
XML is fine. A bit wordy, but I appreciate its precision and expressiveness compared to YAML.

XPath is kind of fine. It's hard to remember all the syntax but I can usually get there with a bit of experimentation.

XSLT is absolutely insane nonsense and needs to die in a fire.

codeulike · 2h ago
XPath would have been nice if you didn't have to pedantically namespace every bit of every query
masklinn · 1h ago
That… has nothing to do with XPath?

If your document has namespaces, XPath has to reflect that. You can either tank it or explicitly ignore namespaces by foregoing the shorthands and checking `local-name()`.
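Concretely, assuming an Atom document with the namespace bound as `xmlns:a="http://www.w3.org/2005/Atom"`:

    with the namespace prefix:
      /a:feed/a:entry/a:title

    foregoing the shorthand and ignoring namespaces:
      /*[local-name()='feed']/*[local-name()='entry']/*[local-name()='title']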

maxloh · 1h ago
However, XML is actually a worse format to transfer over the internet. It's bloated and consumes more bandwidth.
JimDabell · 43m ago
XML is a great format for what it’s intended for.

XML is a markup language system. You typically have a document, and various parts of it can be marked up with metadata, to an arbitrary degree.

JSON is a data format. You typically have a fixed schema and things are located within it at known positions.

Both of these have use-cases where they are better than the other. For something like a web page, you want a markup language that you progressively render by stepping through the byte stream. For something like a config file, you want a data format where you can look up specific keys.

Generally speaking, if you’re thinking about parsing something by streaming its contents and reacting to what you see, that’s the kind of application where XML fits. But if you’re thinking about parsing something by loading it into memory and looking up keys, then that’s the kind of application where JSON fits.
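A contrived illustration of the distinction, with invented names - the same information first as marked-up prose:

    <p>Please call <person role="agent">Ana</person> before
    <date when="2025-07-01">next Tuesday</date>.</p>

and then as a data record, stripped of the prose it was embedded in:

    {"contact": "Ana", "role": "agent", "deadline": "2025-07-01"}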

rwmj · 56m ago
Only if you never use compression.
susam · 2h ago
These days I use XSLT to style my feeds. For example:

https://susam.net/feed.xml

https://susam.net/feed.xsl
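The mechanism, for anyone who hasn't seen it: a processing instruction in the feed points at a stylesheet, and the browser renders the transform's HTML output instead of raw XML. A minimal sketch assuming an RSS 2.0 feed (susam's linked files are the real, fuller version):

    <!-- in feed.xml, right after the XML declaration -->
    <?xml-stylesheet type="text/xsl" href="feed.xsl"?>

    <!-- feed.xsl: render the channel as a simple page -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/rss/channel">
        <html>
          <body>
            <h1><xsl:value-of select="title"/></h1>
            <xsl:for-each select="item">
              <p><a href="{link}"><xsl:value-of select="title"/></a></p>
            </xsl:for-each>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>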

kome · 1h ago
beautiful, well done! i hope people will copy that for their own websites. and use it creatively.
alexjplant · 2h ago
One of my first projects as a professional software engineer at the ripe age of 19 was customizing a pair of Google Search Appliances that my employer had bought. They'd shelled out hundreds of thousands of dollars to rack yellow-faced Dell servers running CentOS with some Google-y Python because they thought that being able to perform full-text searches of vast CIFS document stores would streamline their business development processes. Circa 2011 XHTML was all the rage and the GSA's modus operandi was to transform search results served from the backend in XML into XHTML via XSLT. I took the stock template and turned it into an unholy abomination that served something resembling the rest of the corporate intranet portal by way of assets and markup stolen from rendered Coldfusion application pages, StackOverflow, and W3Schools tutorials.

I learned quickly to leave this particular experience off of my resume as sundry DoD contractors contacted me on LinkedIn for my "XML expertise" to participate in various documentation modernization projects.

The next time you sigh as you use JSX to iterate over an array of TypeScript interfaces deserialized from a JSON response, remember this post - you could be me, doing the same in XSLT :-).

ZYbCRq22HbJ2y7 · 55m ago
When I was a teenager around 2002, I made what one might call a blogging platform today, using ASP, XHTML, XSLT, and XML. It worked well in the browsers of that time. When I look back on it, it depresses me that I didn't realize someone could make money hacking together web applications until like a decade later.
Calwestjobs · 49m ago
EPUB is this, compressed into one file/package. So you could be Amazon ;)
Wololooo · 3h ago
Me simple man. Me see caveman readme, me like. Sometimes me feel like caveman hitting keyboard to make machine do no good stuff. But sometimes, stuff good. Me no do websites or web things, but me not know about XSLT. Me sometimes hack XML. Me sometimes want show user things. Many many different files format makes head hurt. Me like pretty things though. Me might use this.

Thank you reading specs.

Thank you making tool.

fergie · 2h ago
What is this "XSLT works natively in the browser" sorcery? The last time I used XSLT was like 20 years ago - but I used it A LOT, FOR YEARS. In those days you needed a massive wobbly tower of enterprise Java to make it work, which sort of detracted from the elegance of XSLT itself. But if XSLT actually works in the browser - has the holy grail of host-anywhere static templating actually been sitting under our noses this whole time?
jillesvangurp · 16m ago
> massive wobbly tower of enterprise Java to make it work

It wasn't that bad. We used tomcat and some apache libraries for this. Worked fine.

Our CMS was spitting out XML files with embedded HTML that were very cacheable. We handled personalization and rendering to HTML (and JS) server-side with a caching proxy. The XSL transformation ran after the cache and was fast enough to keep up with a lot of traffic. Basically, the point of the XML here was to put all the ready HTML in blobs and all the stuff that needed personalization in XML tags, so the final transform was pretty fast. The XSL transformer was heavily optimized, and the trick was to stream its output straight to the response output stream and not do in-memory buffering of the full content. That's still a good trick, BTW, that most frameworks get wrong out of the box because in-memory buffering is easier for the user. It can make a big difference for large responses.

These days, you can run whatever you want in a browser via wasm, of course. But back then JavaScript was a mess and designers delivered Photoshop files, at best, which you then had to cut up into frames and tables and whatnot. I remember Google Maps and Gmail had just come out, and we were doing a pretty JavaScript-heavy UI for our CMS while having to support both Netscape and Internet Explorer, which had very different ideas about how to do stuff.

rsolva · 1h ago
Browsers support XSLT 1.0 only, and from what I understand, there has been talk of deprecating it.

I would rather they introduced support for 3.0, as that would make it easier to serve static webpages with native templating support.

smartmic · 59m ago
I'm also more concerned about deprecation risk. However, you can still do a lot with XSLT 1.0. There is also SaxonJS, which allows you to run XSLT 3.0. However, embedding JavaScript to use XSLT defeats the purpose of this exercise.
Symbiote · 2h ago
I worked with a site using XSLT in the browser in 2008, but I think support goes back to the early 2000s.
fergie · 2h ago
I was _really_ deep into XSLT- I even wrote the XSLT 2 parser for Wikipedia in like 2009, so I'm not sure why I haven't been aware of browser native support for transformations until now. Or maybe I was and I just forgot.
arccy · 43m ago
it works, i think the most visible ones are where people style their atom / rss feeds instead of rendering separate xml / html pages https://rknight.me/blog/styling-rss-and-atom-feeds/
JimDabell · 1h ago
I used XSLT as a build system for websites way back in 1999–2000. The developer ergonomics were terrible. Looking at the example given, it doesn’t seem like anything much has changed.

Has there been any progress on making this into something developers would actually like to use? As far as I can tell, it’s only ever used in situations where it’s a last resort, such as making Atom/RSS feeds viewable in browsers that don’t support them.

hamdouni · 11m ago
Still maintaining an e-commerce site using XML/XSLT and Java servlets... It easily passed each wave of tech and survived two database migrations (mainframe/DB2 => SQL Server => ERP).
elcapitan · 2h ago
> how I can run it? open XML file
> open blog.xml -a Safari

This didn't work for me in my browsers (FF/Chrome/Safari) on Mac; apparently XSLT only works there when accessed over HTTP:

    $ python3 -m http.server --directory .
    $ open http://localhost:8000/blog.xml

I remember long hours using XSLT to transform custom XML formats into some other representation used by wxWindows in the 2000s; maybe I should give it a shot again for the web :)
notpushkin · 1h ago
> --directory .

Huh, neat! Did’t know it supported that. (python3 -m http.server will default to current directory anyway though)

susam · 1h ago
Yes! I often use a command like this to test my statically generated website locally:

  python3 -m http.server -d _site/
Example: https://github.com/susam/susam.net/blob/0.3.0/Makefile#L264-...
scotty79 · 1m ago
A long time ago, somebody wanted to put a searchable directory of products on a CD. It was maybe 100 MB. There was no SQLite back then, and the best browser you could count on your client having was probably IE 5.5.

JS was waay too slow, but it turned out that even back then XSLT was blazing fast. So I basically generated XML with all the data, wrote a simple XSLT with one clever XPath that generated the search input form, did the search and displayed the results, slapped the XML file into the CD auto-run, and called it a day. It was finding results in a second or less. One of my best hacks ever.

Since then I've always wanted to make an HTML templating system that compiles to XSLT and does the HTML generation client-side. I wrote some, but back then Firefox didn't support displaying XML+XSLT directly, and I didn't like the workaround I came up with. Then AJAX came, then JS got faster, and client-side rendering with JS became viable. But I still think it's a good idea if we ever want to come back to a purely server-driven request-response flow.

cyphax · 2h ago
In my first job, when .NET didn't yet exist, XML + XSLT was the templating engine we used for HTML, (HTML) e-mail, and sometimes CSV. I'd write queries in SQL Server using "for xml", which would output all the data needed for a page and feed it to an XSL template (all server-side) that would output HTML. Microsoft had a caching XSL parser that took less than 10 ms to load such a page. Up until we thought, "hey, let's start using XML namespaces, that sounds like a good idea!" It was a bit less fun after that! Looking back, it was a pretty good stack, and it would still work fine today, IMHO. I never started disliking it, but after leaving that job I never wrote another stylesheet.
meinersbur · 35m ago
There is a classic DailyWTF about this technique: https://thedailywtf.com/articles/Sketchy-Skecherscom

> [...] the idea of building a website like this in XML and then transforming it using XSL is absurd in and of itself [...]

In the comments the creators weigh in, e.g. that it was a mess to debug. But I could not find anything wrong with the technique itself, assuming it works.

kimi · 21m ago
Just my two cents - the two worst pieces of tech I ever worked with in my 40+ year career were Hibernate (in second place) and XSLT templating, in an email templating system around 2005. I would not touch it with a stick if I can avoid it.
JonChesterfield · 3h ago
I looked into this a while ago and concluded that it works fine but browsers are making stroppy noises about deprecating it, so ended up running the transform locally to get html5. Disappointing.
kstrauser · 3h ago
Whoa, I just realized how much Zope’s page templates were basically XSLT that looked slightly different.

This gives me new appreciation for how powerful XSLT is, and how glad I am that I can use almost anything else to get the same end results. Give me Jinja or Mustache any day. Just plain old s-exprs for that matter. Just please don’t ever make me write XML with XML again.

pornel · 43m ago
Zope was cool in that you couldn't generate ill-formed markup, and optionally wrapping something in `<a>` didn't require repeating the same condition for `</a>`.

However, it was a much simpler, imperative language with some macros.

XSLT is more like a set of queries competing to run against a document, and it's easy to make something incomprehensibly complex if you're not careful.

sivanmz · 2h ago
I worked with XSLT almost from the beginning of my career and it was a blessing in disguise. Shoutout to Michael Kay.
azurezyq · 2h ago
My first internship was at Intel, on an XSLT 2.0 processor. Michael Kay is a legend indeed. IIRC, Saxon was his one-man creation. Crazy!
nmeofthestate · 2h ago
XSLT is cool and was quite mind-expanding for me when it came out - I wouldn't say it's "grug brain" level technology at all. An XML language for manipulating XML - can get quite confusing and "meta". I wouldn't pick it as a tool these days.
murukesh_s · 3h ago
Sometimes I wish we could have kept XML alive alongside JSON. I miss comments, CDATA, etc., especially when you have to serialize complex state. I know there are alternatives to JSON like YAML, but I felt XML was better than YAML. We adopted JSON for its simplicity but then tried to retrofit schemas and other things that made XML complex. With JSON Schema we kind of reinvented what XSD did decades ago, and we still lack a good alternative to XSLT.
mike_hearn · 1h ago
The XSLT equivalent for JSON is React.

Let's not romanticize XML. I wrote a whole app that used XSLT about 25 years ago (it was a military contract and for some reason that required the use of an XML database, don't ask me). Yes, it had some advantages over JSON, but XSLT was a total pain to work with at scale. It's a functional language, so you have to get into that mindset first. Then it's actually multiple functional languages composed together, so you have to learn XPath too, which is only a bit more friendly than regular expressions. The language is dominated by hacks working around the fact that it uses XML as its syntax. And there are (were?) no useful debuggers or other tooling. IIRC you didn't even have any equivalent of printf debugging. If you screwed up in some way, you just got the wrong output.

Compared to that, React is much better. The syntax is much cleaner and more appropriate, you can mix imperative code and FP, you have proper debugging and profiling tools, and it supports incremental re-transformation, so it's actually useful for an interactive UI, whereas XSLT never was, so you needed JS anyway.

ahofmann · 1h ago
I just had to explain to some newbies that SOAP is a protocol with rigid rules, while REST is an architectural style with flexibility. The latter means that you have to put in real work and document really well, and consumers of the API need tools like Postman etc. to even be able to use it. With SOAP, you get most of that for free.
n_plus_1_acc · 3h ago
I agree wholeheartedly, but the XML libraries in the JS ecosystem are shit.
w3news · 2h ago
I remember doing the same in 2005-2006: just combining XML with XSL(T) to let the browser transform the XML into HTML. After that, I also combined XML and XSL(T) with PHP. At the time it was the modern way of working: separation of concerns in the frontend. Around 2008-2009 I stopped using this method and started using e.g. Smarty. I still like the idea of using only native browser methods, as described at the W3C. No frameworks or libraries needed; keep it simple and robust.

I think there are just a few people who know XSL(T) these days, or they need a refresher (like me).

pjmlp · 1h ago
I love XSLT, that is what I ported my site to after the CGI phase.

Unfortunately it is not a sentiment that is shared by many, and many developers always had issues understanding the FP approach of its design, looking beyond the XML.

25 years later we have the JSON and YAML formats reinventing the wheel, mostly badly, for what we already had nicely available in the XML ecosystem.

Schemas, validation, graphical transformation tools, structured editors, comments, plugins, namespaces,...

masklinn · 1h ago
> many developers always had issues understanding the FP approach of its design, looking beyond the XML.

It would probably help if XSLT were not a god-awful language even before being expressed via an even worse syntax.

pjmlp · 44s ago
The root cause is that many failed to grasp that XML isn't meant to be written by hand in vi; rather, it is a tool-oriented format.

Now we have to reach for tooling to work around the design flaws of JSON and YAML.

windowsworkstoo · 1h ago
Agree. When MS moved their Office file formats to XML, I made plenty of money building extremely customizable templating engines, all based on a very small amount of XSLT. It worked great given all the structure and metadata available in XML.
rpigab · 1h ago
My first resume was in XSLT, because I didn't want to duplicate HTML tags and styles around. It worked really well, and it was fun to see the XML first when clicking "view source".
em-bee · 2h ago
i have a static website with a menu. keeping the menu synchronized over the half dozen pages is a pain.

my only options to fix this are javascript, xslt or a server-side html generator. (and before you ask, static site generators are no better, they just make the generation part manual instead of automatic.)

i don't actually care if the site is static. i only care that maintenance is simple.

build tools are not simple. they tend to suffer from bitrot because they are not bundled with the hosting of the site or the site content.

server side html generators (aka content management systems, etc.) are large and tie me to a particular platform.

frontend frameworks by default require a build step and of course need javascript in the browser. some frameworks can be included without build tools, and that's better, but also overkill for large sites. and of course then you are tied to the framework.

another option is writing custom javascript code to include an html snippet from another file.

or maybe i can try to rig up includes with xslt (see the sketch below). will that shut up the people who want to view my site without javascript?

at some point there was a discussion about html includes, but it was dropped. why?
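For what it's worth, the XSLT version of that include is small. A sketch with invented file names - each page is a thin XML file, and the shared menu lives once in menu.xml, pulled in with document():

    <!-- about.xml -->
    <?xml-stylesheet type="text/xsl" href="site.xsl"?>
    <page title="About">
      <body><p>...</p></body>
    </page>

    <!-- site.xsl -->
    <xsl:stylesheet version="1.0"
        xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
      <xsl:template match="/page">
        <html>
          <body>
            <!-- the one shared copy of the menu -->
            <xsl:copy-of select="document('menu.xml')/menu/*"/>
            <h1><xsl:value-of select="@title"/></h1>
            <xsl:copy-of select="body/node()"/>
          </body>
        </html>
      </xsl:template>
    </xsl:stylesheet>

Served over HTTP from the same origin, browsers will fetch menu.xml during the transform, with no JavaScript involved.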

rsolva · 1h ago
I recently tried building a website using Server Side Includes (SSI) with apache/nginx to make templates for the head, header and footer. Then I found myself missing the way Hugo does things, using a base template and injecting the content into the base template instead.

This was easy to achieve with PHP with a super minimal setup, so I thought, why not? Still no build steps!

PHP is quite ubiquitous and stable these days, so it is practically equivalent to making a static site. Just a few sprinkles of dynamism to avoid repeating HTML all over the place.

rossant · 2h ago
I made a website based on XML documents and XSLT transformations about 20 years ago. I really liked the concept. The infrastructure could have been made much simpler but I guess I wanted to have an excuse to play with these technologies.

After spending months working on my development machine, I deployed the website to my VPS, to realize to my utter dismay that the XSLT module was not activated on the PHP configuration. I had to ask the (small) company to update their PHP installation just for me, which they promptly did.

jbaiter · 2h ago
Does anybody remember Cocoon? It was an XSLT web framework built upon Spring. It was pretty neat: you could do the stuff XSLT was great at with stylesheets mapped to HTTP routes, and it was very easy to extend with custom functions and supporting Java code for the stuff it wasn't really great at. Though I must say that as the XSLT stylesheets grew in complexity, they got *really* hard to understand, especially compared to something like a Jinja template.
tomduncalf · 3h ago
Early in my career I worked on a carrier's mobile internet portal in the days before smartphones. It was XSLT all the way down, including individual XSLT transforms for every single component the CMS had, for every single handset we supported (hundreds), as they all had different capabilities and browser bugs. It was not that fun to write complex logic in, haha, but it was kind of an interesting thing to work on, before the iPhone etc. came along and everything could just render normal websites.
calmbonsai · 3h ago
Same. I was part of the mobile media messaging (WAP) roll-out at Vodafone. Oh man, XSLT was one of those "theoretical" W3C languages that (rightfully) aged like milk. Never again.
tomduncalf · 3h ago
Ha! I was at Orange. I suspect all the carriers had similar setups. Yeah I don’t miss working with that lol
enqk · 2h ago
I worked in the same period for a Finnish startup (iobox.fi) that ended up being acquired by Telefonica.

Our mobile and web portal was made of J2EE services producing XML, which was then transformed by XSLT into HTML or WAP.

At the time it blew me away that they expected web designers to work in an esoteric language like that

But it was also nicely separated

smackeyacky · 31m ago
It's weird to see the hate for XSLT. I loved it, but maybe I just like stack-based languages.
xg15 · 40m ago
I remember Blizzard actually using this concept for their battle.net site like 10 years ago. I always found it really cool, but at some point I think they replaced it with a "regular" SPA stack.

I think one big problem with popularizing that approach is that XSLT as a language frankly sucks. As an architecture component, it's absolutely the right idea, but as long as actually developing in it is a world of pain, I don't see how people would have any incentive to adopt it.

The tragic thing is that there are other pure-functional XML transformation languages that are really well-designed - like XQuery. But there is no browser that supports those.

aarroyoc · 1h ago
It's worth mentioning that the current XSLT version is 3.0, but browsers are only compatible with XSLT 1.0.
chrismorgan · 2h ago
I’m disappointed that this uses a custom XML format, rather than RSS (tolerable) or Atom (better). Then you could just drop it into a feed reader fine.

A few years ago, I decided to style my own feeds, and ended up with this: https://chrismorgan.info/blog/tags/fun/feed.xml. https://chrismorgan.info/atom.xsl is pretty detailed, I don’t think you’ll find one with more comprehensive feature support. (I wrote a variant of it for RSS too, since I was contemplating podcasts at the time and almost all podcast software is stupid and doesn’t support Atom, and it’s all Apple’s fault: https://temp.chrismorgan.info/2022-05-10-rss.xsl.)

At the time, I strongly considered making the next iteration of my website serve all blog stuff as Atom documents—post lists as feeds, and individual pages as entries. In the end, I’ve decided to head in a completely different direction (involving a lot of handwriting!), but I don’t think the idea is bad.

tannhaeuser · 2h ago
I had done a couple of nontrivial projects with XSLT at the time, and the problem with it is its lack of good mnemonics, discoverability from source code, and other ergonomics, coupled with the fact that it's used only rarely, so you find yourself basically relearning it after not having used it for a couple of weeks. Template specificity matching is a particularly bad idea under those circumstances.

XSLT technically would make more sense the more boilerplate XML literals you're using in your template, because it uses XML itself as language syntax. But even though it uses XML as language meta-syntax, it has lots of microsyntax (i.e. XPath, variables, parameters) that you need to cram into XML attributes, with the usual quoting restrictions and lack of syntax highlighting. There's really nothing in XSLT that couldn't be implemented better using a general-purpose language with proper testing and library infrastructure, such as Prolog/Datalog (in fact DSSSL, XSLT's close predecessor for templating full SGML/HTML and not just the XML subset, was based on Scheme) or just, you know, vanilla JavaScript, which was introduced for DOM manipulation.

Note that maintenance of libxml2/libxslt is currently understaffed [1], and it's a miracle to me that XSLT (version 1.0, from 1999) still ships as a native implementation in browsers, unlike e.g. PDF.js.

[1]: https://gitlab.gnome.org/GNOME/libxml2/-/issues/913

p2detar · 1h ago
I have last used XSLT probably about 2 decades ago. Back then XML was king. Companies were transferring data almost always using XML and translating it to a visual web-friendly format with XSLT was pretty neat. Cool tech and very impressive.
captn3m0 · 2h ago
I use XSLT to generate a markdown README from a Zotero export XML file. It works well, but some simple things become much harder: sorting, counting, uniqueness.

https://github.com/captn3m0/boardgame-research

It also feels very arcane - hard to debug and understand, unfortunately.
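For what it's worth, sorting is built into 1.0, and counting/uniqueness are doable with keys (the Muenchian method), though "arcane" is a fair description of the latter. A sketch with invented element names:

    <!-- sorting -->
    <xsl:for-each select="game">
      <xsl:sort select="title"/>
      <xsl:value-of select="title"/>
    </xsl:for-each>

    <!-- uniqueness and counting via the Muenchian method -->
    <xsl:key name="by-designer" match="game" use="designer"/>
    <xsl:for-each select="game[generate-id() =
                               generate-id(key('by-designer', designer)[1])]">
      <xsl:value-of select="designer"/>: <xsl:value-of select="count(key('by-designer', designer))"/>
    </xsl:for-each>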

Dachande663 · 3h ago
Many, many years back I used Symphony21[0] for an events website. Its whole premise was to build an XML structure via blueprints; your theme is then just XSLT templates for pages.

Gave it up because it turns out the little things are just a pain: formatting dates, showing article numbers and counts, etc.

[0] https://www.getsymphony.com/

k4runa · 3h ago
Wow, blast from the past.
tgma · 1h ago
https://packages.grpc.io is an XML page styled with XSLT, updated by a bash script in CI
_def · 2h ago
We've come full circle again. Yes, this has worked great for many years; XML is just so much clutter.
kome · 1h ago
clutter? i find it MUCH more elegant and simple, both conceptually and practically, than the absolute clown-car of the modern js-driven web, css framework hacks, etc etc
intellectronica · 2h ago
Blast from the past. I actually used XSLT quite a bit in the early 00s. Eventually I think everyone figured out XML is an ugly way to write S-expressions.
almaight · 1h ago
What is needed more now is YAML, especially visualization of the YAML formats supported by k8s by default. Conversely, in the devops community, people need to generate YAML through HTML to run CI/CD. For example, this tool: k8s-generator.vercel.app
HexDecOctBin · 3h ago
me busy fixing asan, "illegal instruction", blah blah blah, me sad and frustrated, much scowling.

me come to hn, see xml build system, me happy, much smiling, me hit up arrow, me thank good stranger.

7bit · 2h ago
Dear God the writing style on that article
podgorniy · 2h ago
Good old XSLT. It was quite the center of attention when strict XML was still a candidate for the next standard. HTML5 won.
Hendrikto · 2h ago
I hate this grug brain writing style. It sounds bad and is hard to read. Please just write normal, full sentences.
jurip · 2h ago
Yeah I don't get it. I had to stop reading after a couple of sentences, I just can't deal with that.
antonvs · 1h ago
Presumably part of the goal is to implicitly claim that what's being described is so simple a caveman could understand it. But writing such a post about XSLT is like satire. Next up, grug brain article about the Coq proof assistant?
Veen · 1h ago
I had ChatGPT translate it into the prose style of Samuel Johnson.
s4i · 2h ago
Maybe it’s just the way the author writes?
b0a04gl · 3h ago
xslt does one thing cleanly: it walks trees on tree input. both data and layout stay in structured memory. no random jumps. browser-native xslt evaluation can hit perf spots most json-to-dom libs might miss. memory layout was aligned by design. we dropped it too early just cuz xml got unpopular
julius · 1h ago
Anyone with recent real-world experience?

From talking to AI, it seems the main issues would be:

- SEO (googlebot)

- Social Media Sharing

- CSP heavy envs could be trouble

Is this right?

cess11 · 36m ago
XML is great; one just needs the appropriate tooling. XSLT, like XSD, is XML too, so the same tooling applies to those as well.

If you're manually writing the <>-stuff in an editor, you're doing it wrong; do it programmatically or with applications that abstract it away.

Use things like JAXB or other mature libraries, eXist-db (http://exist-db.org), programs that can produce visualisations and so on.

ryoshu · 3h ago
Blizzard uses/used XSLT for WoW.
calmbonsai · 3h ago
Was that before or after the Lua adoption?
shakna · 2h ago
Before. And after.

XSLT controls the styling, Lua the running functions. When Lua adjusts a visible thing, it generates XSLT.

"FrameXML" is a thin Lua wrapper around the base XSLT.

petesergeant · 2h ago
XSLT is great fun as a general functional programming language! You can build native functional data-structures[1], implement graph-traversal algorithms[2], and even write test assertions[3]!

1: https://github.com/pjlsergeant/xslt-fever-dream/blob/main/ut...

2: https://github.com/pjlsergeant/xslt-fever-dream/blob/main/ut...

3: https://github.com/pjlsergeant/xslt-fever-dream/blob/main/ut...
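To give the flavor: with immutable variables, iteration and accumulation become recursion. A small sketch of the style (not taken from the linked repo):

    <!-- sum a node-set of numbers by structural recursion -->
    <xsl:template name="sum">
      <xsl:param name="nodes"/>
      <xsl:param name="acc" select="0"/>
      <xsl:choose>
        <xsl:when test="not($nodes)">
          <xsl:value-of select="$acc"/>
        </xsl:when>
        <xsl:otherwise>
          <xsl:call-template name="sum">
            <xsl:with-param name="nodes" select="$nodes[position() &gt; 1]"/>
            <xsl:with-param name="acc" select="$acc + $nodes[1]"/>
          </xsl:call-template>
        </xsl:otherwise>
      </xsl:choose>
    </xsl:template>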

bmacho · 1h ago
Files are missing from the repo(?). What about util-map.xsl, test-map.xsl, util-serialize.xsl?
petesergeant · 1h ago
I've updated this, as well as included instructions on running the built-in unit tests, which are of course also written in XSLT.
brospars · 2h ago
All that fuss just to deploy a static website on Vercel? :p
preaching5271 · 1h ago
Can't take it seriously with that language, sorry