I wish I could write all the business logic I write on an NES and never have to worry about requirements going bad. I guess the thing is, if you're writing anything on top of a network layer of any kind, eventually it's going to require patches unless you literally own all the wires and all the nodes in the network, like a secure power plant or some money clearing system in a bank that's been running the same COBOL since the 1960s. And since you're probably not writing code that directly interfaces with the network layer, you're going to be reliant on all the libraries that do, which in turn will be subject to change at the whims of breaking changes in language specs and stuff like that, which in turn are subject to security patches, etc.
In other words, if you need your software to live in the dirty world we live in, and not just in a pristine bubble, things are gonna rot.
Picking tools, libraries, and languages that will rot less quickly, however, seems like a good idea. Which to me means not chaining myself to anything that hasn't been around for at least a decade.
I got royally screwed because 50-60% of my lifetime code output before 2018, and pretty much all the large libraries I had written, were in AS3. In a way, having so much code I would have maintained become forced abandonware was sort of liberating. But now, no more closed source and no more reliance on any libs I don't roll or branch and heavily modify myself.
jeberle · 6h ago
Despite it being everyone's favorite shame language, COBOL's DATA DIVISION makes it uniquely well-suited for long-term stability. Common practice is to avoid binary fields. This means what you see in your program is what you see on the wire or in a file.
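To make that concrete, here's a rough C analogue (the record layout is hypothetical, purely for illustration): a DATA DIVISION entry built only from display fields is fixed-width text, so the struct below maps byte-for-byte onto the file, with no endianness, padding, or serialization concerns.

    /* Hypothetical fixed-width record, the C analogue of a COBOL
       DATA DIVISION entry that uses only display (text) fields:
         01 CUSTOMER-REC.
            05 CUST-ID    PIC 9(6).
            05 CUST-NAME  PIC X(20).
            05 BALANCE    PIC 9(7)V99.
       No binary fields, so the bytes in the file ARE the record. */
    #include <stdio.h>

    struct customer_rec {
        char cust_id[6];    /* PIC 9(6): six ASCII digits         */
        char cust_name[20]; /* PIC X(20): space-padded text       */
        char balance[9];    /* PIC 9(7)V99: implied decimal point */
    };

    int main(void) {
        FILE *f = fopen("customers.dat", "rb");
        if (!f) return 1;
        struct customer_rec rec;
        /* Each record is exactly 35 bytes; all fields are char
           arrays, so there is no struct padding and no parsing. */
        while (fread(&rec, sizeof rec, 1, f) == 1)
            printf("%.6s|%.20s|%.9s\n", rec.cust_id, rec.cust_name, rec.balance);
        fclose(f);
        return 0;
    }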
kragen · 4h ago
> In other words, if you need your software to live in the dirty world we live in, and not just in a pristine bubble, things are gonna rot.
Well, that's why I called BubbleOS BubbleOS: https://gitlab.com/kragen/bubbleos/

Not there yet, though...

I share your painful experience of losing my work to proprietary platforms.
gr4vityWall · 11h ago
> 50-60% of my lifetime code output before 2018, and pretty much all the large libraries I had written, were in AS3
Out of curiosity, what kind of work did you do? Regarding your old AS3, did you have any luck with Haxe? I assume it would be a straightforward port.
immibis · 10h ago
Building on bedrock is one extreme; the other is to become extremely fluid - build on quicksand but do it efficiently every time the quicksand shifts. AS3 may stop being useful, but if you can use some tool to recompile AS3 code to web JavaScript, you suffer no loss.
tracker1 · 6h ago
That's kind of my approach... KISS to the extreme. Something that's easy to replace tends to become surprisingly long-lived and/or easily ported. I wrote a test launcher and client API for SCORM courseware in the early 00's. That code was later turned into a couple of LMS products, was ported through a few different languages on the server, and still effectively exists today. The DB schema and query code are almost exactly the same as the baseline, except for one feature I hadn't implemented; a decade in, a course came along that used it, and I helped implement it.
Still friends with the owner of the company that holds that code. So I've had a bit more insight into the follow-up on code over two decades old than is typical for anything else I've done.
ozim · 7h ago
Another example is people from the EU making fun of US buildings.
Lasting centuries may or may not be preferable.
There are places where you want to cheaply rebuild from scratch. After a tornado and flooding, your castle will be irreparably bad. Most castles suck badly by not taking advantage of new materials, and I myself would not like to live in a 100-year-old building.
Same for software: there are pieces that should be built to last, but there are applications that should be replaceable in a short time.
As much as I am not a fan of vibe coding, I don't believe all software should be built to last for decades.
asa400 · 7h ago
SQLite has an explicitly stated policy on this: "The intent of the developers is to support SQLite through the year 2050."

https://www.sqlite.org/lts.html
People talk about SQLite's reliability but they should also mention its stability and longevity. It's first-class in both. This is what serious engineering looks like.
azemetre · 7h ago
Does SQLite talk about how they plan to exist beyond 2050 across multiple lifetimes?
Not trying to chide, but it seems like, with such a young industry, we need better social tools to make sure this effort is preserved for future devs.
Churn has been endemic in our industry and it has probably held us back a good 20 years.
jerf · 5h ago
As the time horizon increases, planning for the future is necessary, then prudent, then sensible, then optimistic, then aspirational, then foolish, then sheer arrogance. Claiming 25 years of support for something like SQLite is already on the farther end of the good set of those adjectives as it is. And I don't mean that as disrespect for that project; that's actually a statement of respect because for the vast majority of projects out there I'd put 25 years of support as already being at "sheer arrogance", so putting them down somewhere around "optimistic" is already high praise. Claiming they've got a 50 or 100 year plan might sound good but it wouldn't mean anything real.
What they can do is renew the promise going forward; if in 2030 they again commit to 25 years of support, that would mean something to me. Claiming they can promise to be supporting it in 2075 or something right now is just not a sensible thing to do.
azemetre · 4h ago
Having a plan for several hundred years is possible, and we've seen such things happen in other facets of life. We as humans are clearly capable of building robust, durable social organizations; religion and civics are both testaments to that.
I'm curious how these plans would look and work in the context of software development. That's more what my question was about (SQLite also being the only project I'm familiar with that takes this seriously).
We've seen what lawyers can accomplish with their bar associations, and those were created over 200 years ago in the US! Lawyers also work with one of the clunkiest DSLs ever (legalese).
Imagine what they could accomplish if they used an actual language. :D
clickety_clack · 3h ago
I’d be interested to know what you would classify as having been planned to last hundreds of years. Most of the long term institutions I can think of are the results of inertia and evolution, having been set up initially as an expediency in their time, rather than conforming to a plan set out hundreds of years ago.
azemetre · 22m ago
The Philadelphia Bar Association was established around 1800. I doubt the profession of law is going to disappear anytime soon, and lawyers have done a good job building their profession, all things considered. Imagine if the only way you could legally sell software was through partnerships with other developers?
Do you think such a thing would have helped or hurt our industry?
I honestly think help.
8n4vidtmkvmk · 1h ago
The easiest way would be to just write a spec for the data format, which I think they already have?
If any tooling fails in 25 years, you can at least write a new program to get your data back out. Then you can import it into the next hot thing.
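For what it's worth, SQLite does publish its file format (https://www.sqlite.org/fileformat2.html). Here's a minimal sketch of the first step of "write a new program to get your data back out", using only details taken from that spec (the magic string and the big-endian page size in the fixed 100-byte header):

    #include <stdio.h>
    #include <string.h>

    int main(int argc, char **argv) {
        if (argc < 2) {
            fprintf(stderr, "usage: %s db-file\n", argv[0]);
            return 1;
        }
        unsigned char hdr[100]; /* every SQLite db starts with a 100-byte header */
        FILE *f = fopen(argv[1], "rb");
        if (!f || fread(hdr, 1, sizeof hdr, f) != sizeof hdr) {
            fprintf(stderr, "could not read header\n");
            return 1;
        }
        fclose(f);
        /* Bytes 0-15 are the magic string "SQLite format 3\000". */
        if (memcmp(hdr, "SQLite format 3", 16) != 0) {
            fprintf(stderr, "not an SQLite 3 database\n");
            return 1;
        }
        /* Bytes 16-17 are the page size, big-endian; the value 1 means 65536. */
        unsigned page_size = (hdr[16] << 8) | hdr[17];
        if (page_size == 1) page_size = 65536;
        printf("valid SQLite 3 database, page size %u bytes\n", page_size);
        return 0;
    }

From there the spec walks you through pages, b-trees, and record encoding, so a from-scratch extractor is tedious but entirely doable.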
lenkite · 4h ago
They should write a book on "Design and Implementation of SQLite", and make a course as well. That would interest a lot of people and ensure future generations can pick up where the current developers leave off when they retire.
azemetre · 3h ago
I do think this is a good approach for many open source projects.
Always thought neovim should do something like this: how to recreate the base of neovim, or how to recreate popular plugins, with minimal Lua code.
Got to wonder how much more sustainable that would be versus relying on donations.
ectospheno · 5h ago
I’m trying and failing to think of another free software product I honestly expect to still work on my current data past 2050. And this isn’t good enough?
azemetre · 4h ago
It's good, but this also assumes that the people taking care of this product in the future (who may not even be born right now) will hold the same attitudes.
How do we plan to make sure the lessons we've learned during development now will still be taught 300 years from now?
I'm not putting the onus on SQLite to solve this, but they are also the only organization I know of that is taking the idea seriously.
Just thinking more in the open, and seeing how other people are trying to solve similar problems (ensuring teachings continue past their authors' lives) outside the context of universities.
SoftTalker · 3h ago
Thinking like this is how we ended up with a panic about Y2K. Programmers in the 1970s and 80s could not conceive that their code would still be running in 2000.
0cf8612b2e1e · 3h ago
The computing industry was in a huge amount of flux in the 1970s. How many bits are in a byte? Not a settled question. Code was being rewritten for new platforms all the time. Imagining that some design decisions would last for decades probably seemed laughable.
staticshock · 5h ago
Some churn is fads, but some is legitimate (e.g. "we know how to do this better now"). Every living system is bound to churn, and that's a good thing, because it means we're learning how to do things better. I'm happy to have Rust and TypeScript, for instance, even though they represent some amount of churn for C and JavaScript.
SoftTalker · 3h ago
That’s only another 25 years. Granted it’s already been around for a good while.
begueradj · 7h ago
Maybe we could say something a bit similar about Express.js and other "boring technologies".
andrewmcwatters · 4h ago
I think it starts with us collectively not using boring tech as a term anymore. If boring helps me be productive, that's exciting, not boring.
Some people on the React team deciding in 2027 to change how everyone uses React again is NOT exciting; it's an exercise in tolerating senior amateurs, and I hate it because it affects all of us, down to the experience of under-qualified people in interview processes "um-ackchyually"-ing you when you forget to wrap some stupid function in some other stupid function.
Can you imagine how absurd it would be if SQLite's C API changed every two years? But it doesn't. Because it was apparently designed by real professionals.
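For reference, here's a sketch of the core of that API, using only calls whose signatures have been stable since 3.0 shipped in 2004 (sqlite3_open, sqlite3_exec, sqlite3_close):

    /* Build with: cc demo.c -lsqlite3 */
    #include <stdio.h>
    #include <sqlite3.h>

    /* sqlite3_exec invokes this once per result row. */
    static int print_row(void *arg, int ncol, char **vals, char **names) {
        (void)arg;
        for (int i = 0; i < ncol; i++)
            printf("%s = %s\n", names[i], vals[i] ? vals[i] : "NULL");
        return 0; /* returning nonzero would abort the query */
    }

    int main(void) {
        sqlite3 *db;
        if (sqlite3_open("demo.db", &db) != SQLITE_OK) return 1;
        sqlite3_exec(db, "CREATE TABLE IF NOT EXISTS t(x); INSERT INTO t VALUES (42);",
                     NULL, NULL, NULL);
        sqlite3_exec(db, "SELECT x FROM t;", print_row, NULL, NULL);
        sqlite3_close(db);
        return 0;
    }

As I understand their compatibility policy, new capabilities arrive as new interfaces rather than as changes to existing ones, which is why code like this never has to chase breaking changes.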
kragen · 4h ago
Hey, didn't you write kjbuckets and Gadfly? Or was that Aaron Watters? I was thinking about that the other day: that was one of the coolest pieces of software for Python 2 (though I think it predated Python 2): an embedded SQL database without needing SQLite's C API. I suppose it's succumbed to "software rot" now.
I think "boring software" is a useful term.
Exciting things are unpredictable. Predictable things aren't exciting. They're boring.
Stock car racing is interesting because it's unpredictable, and, as I understand it, it's okay for a race car to behave unpredictably, as long as it isn't the welds in the roll bars that are behaving unpredictably. But if your excitement is coming from some other source—a beautiful person has invited you to go on a ski trip with them, or your wife needs to get to the hospital within the next half hour—it's better to use a boring car that you know won't overheat halfway there.
Similarly, if a piece of software is a means to some other end, and that end is what's exciting you, it's better to use boring software to reach that end instead of exciting software.
8n4vidtmkvmk · 1h ago
I thought you were about to say go on a ski trip with your mistress while your wife is 9 months pregnant. That'd be exciting too, but in a bad/awful way.
kragen · 30m ago
It's probably better to have your wife with you and your mistress in that case.
andrewmcwatters · 4h ago
The name is close, but no cigar. :) I'm known in different circles than Python's.
kragen · 4h ago
Doh, sorry!
righthand · 3h ago
I agree, it's "understood" tech, not "boring" tech. It's only boring because its simplicity and usefulness are obvious. It's only boring because there are few to zero applications of the tech left to discover. The tech isn't boring; the person is boring.
roda73 · 9h ago
This is one of the reasons I absolutely hate Linux-based development and the operating systems built with it.
We all know it now as dependency hell, but what it is in fact is just a lazy shortcut taken by current development that will bite you down the road. Corporate software is not the problem, because corporate users don't care as long as it works now; in the future they will still rely on paid solutions that keep working for them.

For me, I run a local mirror of Arch Linux, because I don't want to connect to the internet every time I need a library or some piece of software. I like it all here, but since I haven't updated in a while, I might hit a destructive update if I chose to update now. That should never happen. Another thing that should never break is compiling an old version of some software. Time and time again, I find a useful piece of software on GitHub and naturally try compiling it, and it's never easy: I have to hunt down the dependencies it requires, then try compiling old versions of various libraries. It's just stupid, and I wish it were easier and built smarter. Yes, sometimes I want to run old software that has no reason not to work.

When you look at Windows, it all works magically. Well, it's not magic, it's just done smart. On GNU+Linux, smart thinking like this is not welcome, and it never has been. Instead they rely on a huge number of people developing this software to perpetually update their programs for no reason other than to satisfy a meaningless dependency version number.
skydhash · 8h ago
It's all volunteer work, not some corporation with trillions lying around. If you want something easy, use Debian (or Ubuntu). They pretty much have everything under the sun.
What you want (download software from the net and run it) is what most distros have been trying to avoid. Instead, they vet the code, build it, and add it to a reputable repo, because no one wants to download Postgres from some random site.
8n4vidtmkvmk · 1h ago
Ubuntu is not perpetually stable. I mean... I have some old instances running since 2018 or so and they continue to work fine, but I've been blocked from running/updating certain apps. So my choices now are to upgrade the OS and risk breaking everything (almost certain), or... just keep using the old stuff.
gjsman-1000 · 7h ago
Last time I checked, Ubuntu/Canonical is a multimillion-dollar company, Red Hat is a multibillion-dollar company, SuSE sold for $2.5B, and The Linux Foundation has over $250M in revenue yet spends only 3% of it on development of Linux specifically.
Enough of the "we're just volunteers" BS: it's fundamentally broken and the powers that be don't care. If multiple multibillion-dollar entities who already contribute don't see the Linux desktop as having a future, and if Linus Torvalds himself doesn't care enough to push the Foundation on it, honestly, you probably shouldn't care either. From their perspective, it's a toy that's only maintained to make it easier to develop the good stuff.
skydhash · 7h ago
Those companies sell server OS support, not consumer desktop. And Linux is rock solid for that purpose.
Desktop Linux is OK. And I think it’s all volunteer work.
8n4vidtmkvmk · 1h ago
OK at best. Barely functional. Incredibly unstable.
codeguro · 5h ago
Get over yourself. Linus himself said Linux is just a hobby. It just happened to become the best because of the lack of red tape dragging development down. It got as big as it did BECAUSE it was a volunteer project with the right choice of license, and it remains the best DESPITE big corps pouring money all around it. https://www.reddit.com/r/linux/comments/mmmlh3/linux_has_a_i...
ndriscoll · 7h ago
Use NixOS. I install updates maybe every few months and it's fine. My desktop experience has been completely solid for almost a decade.
My work computer with Windows on the other hand requires restarts every day or two after WSL inevitably gets into a state where vscode can't connect to it for some reason (when it gets into such a state it also usually pegs the CPU at 100% so even unlocking takes a minute or more, and usually I just do a hard power off).
stereolambda · 5h ago
I sympathize with what you're saying. In theory, Docker and Snaps and such are supposed to package Linux programs more explicitly along with their dependencies, though Docker especially depends heavily on the network and on servers staying up.
I'm not a fan of bundling everything under the sun, personally. But it could work if people had more discipline about adding a minimal number of dependencies that are themselves lightweight, OR that are big, common, and maintain backwards compatibility so they can be deduplicated. So, sort of the opposite of the culture of putting everything through HTTP APIs, deprecating stuff left and right every month, Electron (which pulls browser complexity into everything), and pulling whole trees of dependencies in dynamic languages.
This is probably one of the biggest pitfalls of Linux, and I say this as someone for whom it's the sanest available OS despite it. But the root of the problem is wider: we tend to dump the savings in development costs onto all users as increased resource usage, unless some big corp cares to make stuff more economical, or the project is right for some mad hobbyist. As someone else said, corps don't really care about the Linux desktop.
ZiiS · 4h ago
Whilst this does apply to all distros to some extent, it is Arch's main distinguishing feature: it is a 'rolling release', with all parts constantly being updated. RHEL, for instance, gives you a 13-year cycle during which you will definitely not get a destructive update.
SoftTalker · 3h ago
> We all know it now as dependency hell
Too young to remember Windows 3.1 and “DLL hell?” That was worse.
pca006132 · 4h ago
C/C++ dependency management is easy on Windows? Seriously? What software did you build from source there?
8n4vidtmkvmk · 1h ago
Once it's compiled, it keeps running. I can still run Win32 programs and whatnot. Is that true of Linux programs? Can I run one compiled binary on any distro for years to come? I honestly don't know.
codeflo · 15h ago
We as an industry need to seriously tackle the social and market dynamics that lead to this situation. When and why has "stable" become synonymous with "unmaintained"? Why is it that practically every attempt to build a stable abstraction layer has turned out to be significantly less stable than the layer it abstracts over?
dgoldstein0 · 14h ago
So one effect I've seen over the last decade of working: if it never needs to change, and no one is adding features, then no one works on it. If no one works on it, and people quit / change teams / etc, eventually the team tasked with maintaining it doesn't know how it works. At which point they may not be suited to maintaining it anymore.
This effect gets accelerated when teams or individuals make their code more magical or even just more different than other code at the company, which makes it harder for new maintainers to step in. Add to this that not all code has all the test coverage and monitoring it should... It shouldn't be too surprising there's always some incentives to kill, change, or otherwise stop supporting what we shipped 5 years ago.