We're not innovating, we're just forgetting slower

132 obscurette 115 7/4/2025, 2:13:25 PM elektormagazine.com ↗

Comments (115)

raincole · 5h ago
> They can deploy applications to Kubernetes clusters but couldn’t design a simple op-amp circuit

And the ones who can design an op-amp circuit can't manufacture the laminate their circuit is going to be printed on. And the ones who know how to manufacture the laminate probably don't know how to refine or synthesize the material from the minerals. And probably none of them knows how to grow and fertilize the crops to feed themselves.

No one knows everything. Collaboration has been how we manage complexity since we were biologically a different species than H. sapiens.

cherryteastain · 5h ago
> THE GREATEST IMPROVEMENTS in the productive powers of labour, and the greater part of the skill, dexterity, and judgment, with which it is anywhere directed, or applied, seem to have been the effects of the division of labour.

> To take an example, therefore, from a very trifling manufacture, but one in which the division of labour has been very often taken notice of, the trade of a pin-maker...a workman not educated to this business...could scarce, perhaps..make one pin in a day, and certainly could not make twenty. I have seen a small manufactory...where ten men only were employed...Those ten persons, therefore, could make among them upwards of forty-eight thousand pins in a day.

- An Inquiry into the Nature and Causes of the Wealth of Nations, Adam Smith, 1776

alganet · 4h ago
When you divide and specialize manufacture, you get efficiency.

When you divide and specialize design, you get design by committee.

marcosdumay · 4h ago
You clearly don't.

If you design a desk lamp, it wasn't designed by a committee just because a person designed the screws, another designed the plate stamping machine, another designed the bulb socket and etc.

alganet · 4h ago
Let's skip over to a real example, it's better.

If you start designing hardware for AI, together with AI designed to run just on that hardware, and tie those design cycles together, you'll get design by committee. It is very likely that the coupled requirements will produce overall bad hardware (but slightly better for that AI), and an overall bad AI (but slightly better on that hardware).

Eventually, these shortcuts lead to huge committees. Sometimes they're not even formally defined.

The screw company should make good screws, not good screws for a specific desk lamp model. A good designer then _chooses_ to use those screws _because they are good_, not because they can make specific design requirements to the screw company.

kragen · 3h ago
My left femur is a good femur for my specific height and species. It would be a lot worse if I had to use the same femur as the pit bull next door. Why are desk lamps different? I conjecture that it's only because of the cognitive limitations of the designers.
eternityforest · 46m ago
Because you weren't designed on a CAD app.

The desk lamp can be designed around common parts to reduce the total number of unique parts in the world, so everything is reusable, replaceable, manufacturable in larger quantities so there's more resources to optimize the process, etc.

alganet · 3h ago
I am sorry that you do not see yourself as different from a desk lamp.
kragen · 2h ago
On the contrary, my comment was all about the difference! I see my body as far better designed than any desk lamp I have ever heard of. So I am puzzled as to why you count the characteristics that give rise to desk lamps' deficiencies as advantages on the side of the desk lamp.
alganet · 2h ago
Your comparison with natural organisms _brightens my day_, but it is irrelevant for the context we're discussing.
Buttons840 · 4h ago
> When you divide and specialize design, you get design by commitee.

In your counter-example, the design was not divided, and thus it is not a counter-example at all.

spongebobstoes · 3h ago
The division of design doesn't stop because some of the pieces can be bought at a store.

The lamp design clearly was divided -- the final designer did not design the screws, lightbulb, wiring, and perhaps many other components. Someone had to design those components that were then combined creatively into a lamp.

Dividing design into components that can be effectively reused is tricky, but it remains essential.

alganet · 3h ago
Real examples often work better.

Last week I was learning about Itanium. It was a processor designed specifically for HP. Its goal was to replace both x86 and PA-RISC.

HP would design the new systems to run on Itanium, and Intel would design the chip.

There was an attempt at specializing design here, with both companies operating under design constraints from one another. They formed a design committee.

This was like the screw company making screws _specifically_ for one kind of desk lamp. It's division and specialization of design.

A natural specialization (one company gets very good at designing some stuff) is not divided, or orchestrated by a central authority.

In manufacture, it's the other way around. If you already have a good design, the more divisions alongside a main central figure, the better. You can get tighter tolerances, timing benefits, etc.

My argument is that these aspects are not transferable from one concept to another. Design is different from manufacturing, and it gets worse if we try to apply the optimizations we often use in manufacturing to it.

jjmarr · 5h ago
I can design a simple op-amp circuit and deploy to a Kubernetes cluster because Canada has a "Computer Engineering" degree that's a hybrid between CS/Electrical Engineering.

It doesn't work in practice. CS graduates from my school are trained on git and Linux command lines. CE teaches none of this and students discover in 3rd year they cannot get an internship because they share all their code as IDE screenshots in Google Docs.

But we do know how the entire process of building a computer works, from quantum physics, semiconductor doping, npn junctions, CMOS logic, logic gates, hardware design languages, assembly, C, and Java.

If only all of this "important" knowledge didn't crowd out basic skills.

bee_rider · 4h ago
I think designing the EE and computer engineering curriculum is pretty tough. Because:

* the EE’s need to learn matlab or numpy, to use as a tool

* so do the computer engineering students, probably

* the computer engineering students also need to learn low level stuff, because they might reasonably end up writing drivers or embedded code

* separating out what should be in which classes is kind of tricky; keeping in mind that the students don’t necessarily know anything about programming at all, you probably need some kind of “intro to the general idea of programming” class

* realistically when they start the program they don’t know much about what the jobs look like, so it is good if they are able to switch paths, for the first couple years

* realistically a lot of people just want to get a stem degree and then go be a programmer anyway

WillAdams · 4h ago
Time was that this sort of thing was covered in courses such as:

https://ocw.mit.edu/courses/6-001-structure-and-interpretati...

but they've since switched to Python for reasons:

https://irreal.org/blog/?p=2331

alephnerd · 3h ago
> CE teaches none of this and students discover in 3rd year they cannot get an internship because they share all their code as IDE screenshots in Google Docs.

Which CE program did you study at? I've worked with Waterloo, UBC, and UT ECE grads and they have similar levels of knowledge of programming fundamentals as their CS grads. I would be shocked if a first or second year BS ECE cannot use Git or some alternative VCS - that means there are more fundamental issues with your university's engineering curriculum.

> I can design a simple op-amp circuit and deploy to a Kubernetes cluster because Canada has a "Computer Engineering" degree that's a hybrid between CS/Electrical Engineering.

Same in the States, ECE and EECS programs tend to teach both fairly equally, and there are plenty of top programs with a strong reputation in this (Cal, MIT, CMU, UIUC, UT Austin, UW, UCSD, UCLA, GT, etc)

The issue I have noticed though is the decline of "CSE" programs - CS programs with an added CompArch or OS internals focus. CS programs are increasingly making OS internals and CompArch optional at the undergrad level, and it is having an impact on the pipeline for adjacent fields like Cybersecurity, Distributed Systems, Database Internals, etc.

I've harped about this skills gap multiple times on HN.

no_wizard · 4h ago
I agree with your general thesis, but I will say someone who understands, at least conceptually and in a simplified way, how everything comes together is usually better at their speciality than those who know their speciality but don't have some understanding of how things come together
9rx · 5h ago
> And probably none of them knows how to grow and fertilize the crop to feed themselves.

As a farmer and software developer, with an electronics hobby (and electronics being part of the job of farming these days), I can check off growing crops, op-amp circuits, and Kubernetes deployments.

I don't own, or have reasonable access to, the necessary capital for laminating circuit boards and synthesizing minerals.

> No one knows everything.

But, really, access to capital is the real limiting factor. Getting to know something isn't usually all that difficult in and of itself, but if you have no way to do it then you're never going to get to know it. Collaboration is less important to efficiency than optimizing the use of capital. It's just that we don't have many good ideas about how to optimize the use of capital without also introducing collaboration.

anonzzzies · 5h ago
It was not a great remark indeed; however, most modern devops people we meet (all seniors, big corps) know nothing lower level, often not much about Linux or what the code is actually doing. And it is always terrible: like LLMs, they like adding complexity: just put an NLB with an ALB and a private link with a gateway with an NLB with an ALB! These guys (sorry, it is generally guys) believe they are experts but they are terrible. Get paid as experts though.

You don't have to know everything, but a BASIC understanding of what is underneath would be nice.

SkyMarshal · 4h ago
Humans congregating into cities, specializing and developing expert capabilities in particular fields, and collaborating and relying on others for what they're not specialized in, is a big part of the story of the Enlightenment -> Industrial Revolution -> Electricity Revolution -> Information Age.
kragen · 5h ago
If I remember correctly, one of the first white people to successfully visit the Māori reported that when he told them he didn't know how to make pistols, black powder, porcelain, hemp rope, etc., they thought he was lying, because in their culture everyone knew how to make everything. There was a division between men's work and women's work, that was all. They had specialization, but not of the kind you are talking about.

The Little House on the Prairie books fictionalize the childhoods of Laura Ingalls Wilder and Almanzo Wilder in the US in the late 19th century. They expected their readers, whose grandparents had grown up in similar conditions, to believe that one or more of their parents knew how to shoot a bear, build a house, dig a well, poultice wasp stings, cast bullets, fertilize and grow crops, make cheese, whitewash walls, drive horses, run a business, read a book, play the fiddle, dance a jig, sing, keep bees, clear fields in forests, harvest honey, spin thread, weave cloth, thresh wheat, and many other activities. There were "store-bought" goods produced by the kind of specialization you're talking about, but Laura's family had a few durable goods of that sort (Pa's rifle and ax, the family Bible) and mostly they just did without.

More recently the Lykov family survived 40 years of total isolation from society, missing World War II completely, but did suffer some heartbreaking losses in material standard of living because they didn't know, for example, how to make ceramic or iron. Agafia Lykova is still living there on her parents' homestead, nearly a century later.

Specialization is indeed very efficient, but that answers the questions, "What can I do for others?" and "How can we survive?" Historical answers bespeaking specialization are archived in many of our surnames in the West: Cooper, Fuller, Goldschmidt, Herrero, Nailer, Roper, Molnar, and, of course, Potter.

But for those questions to matter, we also need to answer the questions, "How can I be happy?" and "How can we be happy?", and for thousands of years it has been at least widely believed that devoting your entire self to specialization runs counter to those goals—among other things, because it can open doors to the kinds of exploitation, unfreedom, and insecurity the article is lamenting. And sometimes regional specialization leads not to prosperity for every region but to impoverishment, and regaining the lost skills is the path out of the kind of abysmal poverty that produces regular famines; that's why there's a charkha on the Indian flag.

TI was no exemplar here; you can't even write your own machine code to run on the TI-99/4A, but the situation with Nest is in many ways far worse. I think it's worth distinguishing between situations where someone chooses not to learn about, modify, or repair artifacts, and situations like these where they are not permitted to learn, especially when the prohibition is established in order to exploit them economically, as in both the TI case and the Nest case, or as in medieval guilds.

Some specializations are thousands of years old; tin mining in Cornwall supported much of the Bronze Age, and silicosis was already known as an occupational disease of potters in Classical times. But 80 hours a week breaking rocks in a tin mine is not a path to human flourishing, nor to economic prosperity for the person doing it. Neither is buying thermostats you aren't allowed to understand. We shouldn't idealize it just because it's profitable.

WillAdams · 5h ago
Similarly, there were the _Foxfire_ books which attempted to document the knowledge and skills of Appalachians, or W. Ben Hunt's writings on woodcraft and his interpretation of Native American Indian lore which was shared with him.

For a mechanical approach to this, see the "Gingery" books which start with the basics of investment casting in the first volume, then using castings to make a lathe in the second (operating on the premise that a lathe is the only tool in a machine shop which can replicate itself), then using the lathe to make the balance of tools needed in a machine shop.

corimaith · 4h ago
>and for thousands of years it has been at least widely believed that devoting yourself to specialization runs counter to those goals.

Well no, civilizations like the Maori are the exception, not the norm. Rigid class roles and specialization have featured prominently in essentially every Eurasian civilization from Egypt to Han China, which held the bulk of humanity and its developments. Nor did questions of individual happiness matter; what concerned people at the time were questions of martial duty or religious worship.

kragen · 4h ago
The Māori weren't civilized (they didn't have cities at the time) and were far from the exception; cities didn't hold the bulk of humanity until 11 years ago. We remember Middle Kingdom Egypt and (1500 years later) Han China because they were civilizations and consequently were literate. But throughout almost all of human history, only a small fraction of the population has lived in the civilizations we see looking back. Even in Eurasian civilizations, until the Industrial Revolution, 90+% of the population were relatively unspecialized peasants with skills nearly as broad-based as Laura Ingalls Wilder's parents. It's easy to forget about them because they weren't literate and so can't speak to us from beyond the grave like Epicurus, the Egyptian Book of the Dead, Sima Tan, or Marcus Aurelius.

And most people lived outside civilization entirely. They had very diverse lifestyles, but we can make some generalizations. Even when they didn't leave diaries for us to read, we can infer that they had much less specialization, both from economic principles and from archaeological evidence.

It's certainly true that people in civilizations are, and have always been, focused on martial duty, and everyone everywhere is concerned with religious worship, though they may call it something else for social reasons. But people have always been strongly interested in individual happiness, even in civilizations. The Buddha founded one of the most important religions 2500 years ago on the basis of individual happiness, to the point that after he died, one of the most contentious issues among his followers was whether holy people had any duty to help other people achieve happiness as well, the origin of the Mahayana bodhisattva vows. Epicurus's philosophy and Marcus Aurelius's writings are also centered on the pursuit of individual happiness, as is much of Plato and of course the Mohists. Even religions and philosophies that preached lifelong duty and sacrifice above all else, like Christianity and Islam, offer it as a path to individual happiness in the afterlife.

dingnuts · 5h ago
as you've identified, specialization of roles is a common trait in advanced societies, but not in pre-historic ones. And yes, American frontiersmen participated willingly in one of the last pre-historic societies. Specialization is what allowed us to stop subsistence farming and/or following herds.
kragen · 4h ago
Maybe people participating willingly in those societies—indeed, fleeing civilization en masse to the frontier and to join "Indian" tribes, while migration from the tribes to the civilized colonies was almost nil—should tell us that civilization isn't all upside?

Economic productivity is an important means to happiness, because it sucks to go blind or subsist on soft foods because you can't get the necessary medical and dental treatments. And it's terrible to never see your parents again because you don't have the material means to visit them. But there's a point of diminishing returns beyond which sacrificing more of your happiness for continued economic gains amounts to cutting off your nose to spite your face.

tspike · 5h ago
You could replace the word "advanced" with "unsustainable" and the thought still holds.
kragen · 3h ago
No, unsustainable societies like the Lykov family or the Tasmanians were also often unspecialized. I suspect that specialization improves sustainability in general, at least up to a point. But it depends on how it's structured. The Khwarezmian empire had a high degree of specialization for the time, but one bad decision by the emperor made it unsustainable.
dev_l1x_be · 3h ago
To a certain extent, yes. However, when I get rid of a layer of abstraction in my supply chain I usually get better results. That is the scary part to me.
msgodel · 4h ago
In theory. The socialization premium is getting high enough that you are actually often better off doing everything on your own again.
WillAdams · 2h ago
That's a tough row to hoe --- classic example of that is a pencil (see Thoreau and his family's history/business), though interestingly, there is a specialized tool now for making one (from components):

John Economaki's "Pencil Precision" from Bridge City Tool Works:

https://bridgecitytools.com/products/pp-1-pencil-precision

I have the preceding "Chopstick Master v2" and it is a delight to use (and if there were a suitable set of instructions for collecting the materials for making a pencil lead and baking them, I'd probably have the successor).

jmulho · 5h ago
You’re part of a different species than H. sapiens?
alganet · 5h ago
I'm fairly confident that someone like Ben from Applied Science can both laminate circuits and write modern code.

https://www.youtube.com/@AppliedScience

https://github.com/benkrasnow

If he can, what's stopping you?

There are extraordinary people doing extraordinary things all around you. Aiming for these things is important, and we need those kinds of people with ambitious learning goals.

rtkwe · 5h ago
Because specialization is vastly more efficient and productive, and I don't have an interest in cutting and laminating my own circuit boards. I can pay a tiny amount (relative to the alternative investment of DIY) to get multilayer boards in a day.
alganet · 5h ago
You're changing your argument.

Before, you said people _can't_ (in general, anyone that knows how to code cannot possibly learn how circuits work).

Now, you're saying that _you don't want to learn_. That's on you, buddy. Don't project your insecurities on the whole IT field. People can, and will, learn across many layers of abstraction.

PickledChris · 5h ago
That is because you are replying to two different people.

People can learn across layers of abstraction, but specialisation is generally a good thing and creates wealth, a Scottish guy wrote a good book on it.

alganet · 5h ago
There are many industries that specialized but kept and refined old knowledge instead of repeating the same mistakes over and over.

> That is because you are replying to two different people.

He chose to follow the argument of the previous dude, so, it's all the same for me. Everything I said still applies.

rtkwe · 1h ago
I don't know what raincole's meaning was, but "can't" doesn't have to mean a permanent inability, just a current one; IMO that's backed up by the next sentence about the people making circuit boards not knowing how to refine the raw materials they use. That's how I took it.
hluska · 5h ago
You missed the point of that entire comment didn’t you?
alganet · 5h ago
If I did, people failed to explain why.

I think I made an excellent counterpoint that is not against specialization, but complementary.

This counterpoint is particularly important in an age where specialization is being oversold and mixed with snake oil.

thinkingtoilet · 6h ago
> Software has followed the same trajectory, piling abstraction upon abstraction until we’ve created a tower of dependencies so precarious that updating a single package can break an entire application.

This is like saying old software is so simple that updating a line of code can break an entire application. It's a silly thing to say. No matter how complex or how simple a piece of software is, you can easily break it. If you have a program that prints out "hello world", guess what? Updating a single character can break the entire application!

The world is more complex now. We've stood on the shoulders of giants who stood on the shoulders of giants. A few centuries ago a renaissance man could make advances in multiple fields. Now people are specialized. It's the same thing with software. Of course, people take it to an extreme. However, you go ahead and write your own crypto library, I'll use a vetted one created by experts.

JanneVee · 4h ago
A crypto library is "essential complexity"; running a Node.js runtime inside a container in a Kubernetes node in a virtual machine on a server owned by a hyperscaler cloud provider is "accidental complexity". I'll take the crypto library any day of the week, but if I get to decide how to host a web application I'll stay away from that fragile tower of abstractions. As it happens I currently make a living from that "accidental complexity", so it is not a matter of skill; I just see it as unnecessary.
softfalcon · 6h ago
This is exactly my take on folks bemoaning the evolution of complexity in software (and hardware) worldwide.

To lend some credence to other folks' points of view, there are arguments I can agree with that are adjacent:

- "We don't need that complex framework for our needs, we stick to a simpler, older library."

- "We decided to not use <shiniest_new_toolkit> it had performance issues that the maintainers are still sorting out."

- "Working with the new framework showed a lot of promise, but there is still a lot of instability in the API since it's so new. We couldn't commit to using a toolkit that hasn't been nailed down yet."

These are actual concerns and show caution towards adopting new things until they match your use-case, dev timelines, and performance requirements.

shermantanktop · 5h ago
I’d add “The DOM api probably already has everything you need and has zero dependencies.” That applies to many lower-level APIs which have gotten smarter and better over the years—often in response to the needs of fancy frameworks, but now they are pretty rich by themselves.
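For instance, here is a minimal sketch (plain TypeScript against built-in DOM APIs and fetch; the element IDs and the /api/save endpoint are made up) of the kind of interactivity people often reach for a framework to get:

```typescript
// Minimal sketch: zero-dependency interactivity using only built-in DOM APIs.
// The element IDs (#save, #status) and the /api/save endpoint are hypothetical.
const saveButton = document.querySelector<HTMLButtonElement>("#save");
const statusEl = document.querySelector<HTMLSpanElement>("#status");

saveButton?.addEventListener("click", async () => {
  if (!statusEl) return;
  statusEl.textContent = "Saving...";
  try {
    const response = await fetch("/api/save", { method: "POST" });
    statusEl.textContent = response.ok ? "Saved" : "Failed";
  } catch {
    statusEl.textContent = "Network error";
  }
});
```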

“I don’t have time to learn a new framework, I have things to do.” Everybody’s cool new abstraction is a cognitive burden for someone else.

citrin_ru · 5h ago
Complex software is much, much easier to break accidentally. One character can break "hello world", but that's not something one would do while trying hard not to break it. And potential bugs in a simple application are much more likely to be caught during a review. For a complex application, no one has a good enough mental model to reliably predict what a given change will do. For a simple one it's possible.
alganet · 6h ago
If I break some source code by messing with it, I know what happened. I might even learn from it.

Now if npm breaks it, or Claude breaks it, a developer might not even know what was broken.

He's talking about that kind of thing, not the resilience of code to take random character deletions.

IT is very much non-specialized compared to older disciplines. It's so young. Every single one of us is still a jack of all trades to some degree.

You're relying on the popular "don't roll your own crypto" quote to drop the mic. That's misguided. This advice comes from an argument of strength in numbers, not anything related to abstractions. It tells me you don't understand it.


1718627440 · 6h ago
Doesn't it mean, rather, that some packages aren't conforming to common-sense conventions like semver?
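To illustrate, a minimal sketch (TypeScript, assuming the npm "semver" package; the version numbers are illustrative) of the contract a caret range is trusting package authors to honour:

```typescript
// Minimal sketch of what a caret range encodes, using the npm "semver" package.
// The range and version numbers here are illustrative.
import * as semver from "semver";

const range = "^1.2.3"; // what `"some-lib": "^1.2.3"` in package.json means

console.log(semver.satisfies("1.2.4", range)); // true  - patch release, assumed safe
console.log(semver.satisfies("1.3.0", range)); // true  - minor release, assumed backwards compatible
console.log(semver.satisfies("2.0.0", range)); // false - major release, assumed breaking

// The protection only holds if maintainers actually reserve breaking changes
// for major versions; a breaking change published as 1.3.0 still satisfies
// the range and lands in the next install.
```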
unyttigfjelltol · 5h ago
> (Google is less than helpful with a dumbed-down user interface that basically tells you that “something went wrong.")

This UI trend of denying access to under-the-hood complexity is deeply unsettling. It creates a cliff edge failure mechanism where the system (which often is essential) works and then suddenly doesn't. No warning approaching the failure state, no recourse on the far side, completely baffling how this became an industry standard.

convolvatron · 4h ago
the best part is when you look at this from the perspective of composition. if I build a thing using complicated 'industry standard' components that are just supposed to work, then when something happens I'm pretty much lost. now take my thing, add some marketing about how it 'just works', and have someone else use it in a high-level concatenation of other such functions and pretend it 'just works'.

now we are actually employing large numbers of people just to babysit these half-assed things, and everyone is fine with it because we all used 'industry standard' components, and that's really the best we can do, isn't it. armies of on-call system resetters are just part of the picture now.

bee_rider · 5h ago
> The VHS player in my basement could be fixed with a screwdriver and a service manual (OK, sometimes an oscilloscope). Meanwhile, my Wi-Fi router requires a PhD in reverse engineering just to figure out why it won’t connect to the internet.

This seems like a pretty weird example, right? WiFi routers don’t connect to the internet. If your modem can’t connect to the internet, something has probably broken outside your house. That’s the sort of locally-unsolvable problem that everybody last century was familiar with; the crappy copper telephone wire that was never designed to let the Internet blast through it and it will eventually rebel and start giving you noise.

If your router doesn’t work, I don’t know. Cheap routers are not a new invention or sign of the times, I think.

VHS players, if I remember correctly, often died in mysterious ways (they have all sorts of little motors and finicky sensors in them).

zbentley · 1h ago
WiFi routers a) do connect to the internet (as many of them are integrated modems as well as WiFi) and b) are often mentioned as “connecting” to the internet as a colloquialism that means “a device on my WiFi can’t reach the internet because the router has an issue”.

> If your modem can’t connect to the internet, something has probably broken outside your house

Of all the internet-only connectivity outages I’ve had that lasted for longer than a few minutes, nearly all of them were resolved by a modem or router reboot. These are ordinary, non-customized modem/routers from 3-4 ordinary ISPs serving ordinary apartments in a major US city, using ordinary mediums like DSL, cable, fiber.

The fact that a reboot resolved the issue means that the problem wasn’t outside the house. Of all the remaining, long and not-fixed-by-reboot outages, one was a hurricane, one was a bad amp … and all the remaining dozens were arcane technical issues with the modem, router, or devices on the network that required ISP troubleshooting to fix.

I suspect that this is not an uncommon distribution, which means that this isn’t the same problem folks in the last century faced; today, the shitware is coming from inside the house.

SoftTalker · 4h ago
VHS players went through a typical consumer electronics evolution. The first ones were large, heavy, and repairable if you knew what to do. By the end they were lightweight plastic and you didn't even consider repairing one, as an hour of time at a repair shop would cost you more than buying a new one. They were disposable.
freshtake · 5h ago
I don't know, I think the point of the example is one of transparent engineering. VHS players could break for any number of reasons but the manufacturers used to put in effort to make them repairable. Obviously a much simpler piece of hardware, but the relative effort felt much greater.

When I used to use Google Wifi, it regularly struggled to connect or establish/maintain connectivity to the outside world, even though my modem was successfully connected. Similar to Nest devices, you often have to power cycle them several times to get them into a good state.

bee_rider · 4h ago
I guess. All I know is, when I was a teenager we had a broken VHS player that I wanted to fix. I was a clever kid, so I took it from a broken VHS player, to a broken VHS player that was also in many pieces.
freshtake · 2h ago
Lol yeah, very familiar
xg15 · 39m ago
All the usual rebuttals in this thread, but I don't see anyone engaging with his assertion that we're forgetting and reinventing things.

If it really were just division of labor, beneficial abstraction, shoulders of giants, etc., shouldn't we be able to distinguish genuinely new concepts from things we already had 40 years ago in a different context?

FilosofumRex · 4h ago
Abstraction explosion is surely detrimental to innovation, but so is "mathification", propelled by rapidly declining computational costs.

The Romans progressed tremendously despite using inferior "math" compared to the Greeks, as did the Americans compared to the English and French. Note how willingly China ships its top math talent to US grad schools while retaining its best engineers at all costs.

Only technologies that make it into tools/hardware (realware) will survive; the rest are destined to be e-wasted.

ryukoposting · 6h ago
> They can deploy applications to Kubernetes clusters but couldn’t design a simple op-amp circuit.

If the thesis is that we should understand the systems we work on, then sure, I can get behind that. At the same time, I wouldn't expect a mechanic to know how to process iron ore into an ingot.

alganet · 5h ago
Machinists know their materials. They might not know every step to process it, but they understand the manufacturing process of different kinds of metals and their properties.

They definitely don't "vibe machine" without thinking about underlying concepts.

MountainMan1312 · 4h ago
As a redneck I think you vastly underestimate the prevalence of vibe machining
alganet · 4h ago
I am sure hobbyists vibe machine a lot, and there's nothing wrong with that, but that's not supposed to be representative of the industry.
827a · 5h ago
Software engineers aren't to computers as mechanics are to cars. The closest correlate is, like, automotive engineers working at Ford or Toyota; and I would 100% expect that someone involved in the process at those companies has reasonably deep knowledge about, for example, the structural characteristics of cast metal.
thadk · 3h ago
idk, if you're going to center that good old TI computer, you gotta contrast the Conway's law of that company with the semi-solo person sharing 'grammable tidbits. The desired carrying capacity lies in that institution. Today it's in Shenzhen.

But the ~1980s corporation is no more, and it was driven by the hype cycle too, just not a recognizable one. You can google the adverts or read The Soul of a New Machine.

phendrenad2 · 4h ago
If everyone could do Bode plots, design an op-amp circuit, design their own programming language and wrangle kubernetes clusters, would it really improve the world? Or would we just have a lot of useless knowledge?
xenocratus · 6h ago
I'm sure he wrote all of this on, sent it over from, and is now re-reading it on his Texas Instruments TI-99/4A Home Computer.
analog31 · 6h ago
Ironically, the technology that amazingly still works after 40 years is shown in front of a shelf full of books.
paulryanrogers · 5h ago
Books are fragile. Their pages often include acid in order to self destruct. For resilience we should be recording on clay tablets. /s
analog31 · 5h ago
Oddly enough, I've read that the clay was originally re-usable, and the ones that are preserved were baked in accidental fires.

Kind of a weird opposite meaning of book-burning.

david927 · 6h ago
Someone made a video clip showing the Crumbl Cookies ingredients list, and each cookie has around 100 ingredients

https://x.com/WallStreetApes/status/1940924371255939236

Our software is like that. A small system will have a crazy number of packages and dependencies. It's not healthy and it's expensive and it's stupid.

Culture tends to drive drunk, swinging wide to extremes and then over-correcting. We're already fully in the wrong lane when we start to have discussions about thinking about the possibility of change.

BiteCode_dev · 6h ago
Everything in the physical world is the same. There are hundreds of pieces even in the smallest toaster, and no one in the world knows how to make half of one without all those external dependencies:

https://www.youtube.com/watch?v=5ODzO7Lz_pw

It's not a software thing, it's just how humanity works.

SketchySeaBeast · 6h ago
Even if you consider something like "build a bridge", that classic comparison of software and traditional engineering, all the components of said bridge have been created by someone else with particular expertise. The civil engineer who decided where the iron beams will go played no part in the metallurgy of those beams.
hiAndrewQuinn · 5h ago
Milton Friedman put it best, retelling Leonard Read's "I, Pencil": everything around you is the result of hundreds to thousands of economic actors, loosely but robustly coordinated in a way to get you what you actually want at as low a price as you can get away with. The complexity of the supply chain is the price of admission to the game.

https://youtu.be/67tHtpac5ws?si=eZk_5K32gL4PxDgv

nobodyandproud · 5h ago
Yes and no.

The physical world is bound by rules that are unchanging (more or less). Above this layer we’ve also devised and agreed upon standards that remain unchanging, though they’re regional: voltage, screw/bolt sizes, tolerance levels, materials, material measurements, etc.

At this layer, we’ve commoditized and standardized because it’s useful: It makes the components cost-effective and predictable.

In software and computing, I can only think of the low-level networking standards that remain stable. And even those have to be reinvented somewhat for each OS or each new language.

Everything else seems to be reinvented or rewritten, and then versioned.

Imagine having to upgrade your nuts and bolts in your car to v3.01 or lose support?

david927 · 4h ago
I'm not arguing against component-based architectures. I'm saying we're over-engineering, and it shows. Even toasters are less maintainable than they used to be.

Ingredients in the cookies? Yes. 100? No.

skybrian · 6h ago
If it works, he's lucky. For example, Commodore 64s often had a dodgy power supply that would eventually damage the computer.

My Vectrex still worked last I checked.

forinti · 6h ago
About every 15 years I have to change the caps on my Beeb. The third job is coming soon.
MontyCarloHall · 6h ago
>That TI-99/4A still boots because it was designed by people who understood every component, every circuit, every line of code. It works because it was built to work, not to generate quarterly revenue or collect user data or enable some elaborate software-as-a-service business model.

The methods and algorithms powering advances in modern science, medicine, communications, entertainment, etc. would be impossible to develop, much less run, on something so rudimentary as a TI-99/4A. The applications we harness our technology for have become much more sophisticated, and so too must the technology stacks underpinning them, to the point that no single individual can understand everything. Take something as simple as real time video communication, something we take for granted today. There is no single person in the world who deeply understands every single aspect, from the semiconductor engineering involved in the manufacture of display and image sensors, to the electronics engineering behind the communication to/from the display/sensor, to the signal processing and compression algorithms used to encode the video, to the network protocols used to actually transmit the video, to the operating system kernel's scheduler capable of performing at sufficiently low-latency to run the videochat app.

By analogy, one can understand and construct every component of a mud hut or log cabin, but no single person is capable of understanding, much less constructing, every single component of a modern skyscraper.

alganet · 5h ago
You're misdirecting.

He's criticizing the act of _not building_ on previous learnings. _It's in the damn title_.

Repeating mistakes from the past leads to a slow down in such advancements.

This has nothing to do with learning everything by yourself (which, by the way, is a worthy goal, and every single person who tries knows by heart that it cannot be done; it's not about doing it).

fusionadvocate · 5h ago
Abstractions hide details; that does not mean the details cease to exist. The problem with abstractions is that they make it easier to create conflicts when making changes. Lots of hidden details are affected by a high-level change.
zzzeek · 5h ago
I see a bunch of "nobody knows everything, this old man needs to appreciate modern technology stacks" comments, and in some ways I blame the post for this because it kind of meanders into that realm where it gets into abstractions being bad and kids not knowing how to make op-amp circuits (FTR, I am from the "you have to know op-amps!" generation and I intentionally decided deep hardware hacking was not going to be my thing). But the actual core thing I think is important here is that working hard is being devalued: putting in the time to understand the general underpinnings of the software and the hardware, using trial and error to solve an engineering problem as opposed to "sticking LEDs on fruit". The entire premise of knowing how things work and achieving some deep expertise is no longer what people assume they should be striving for, and LLMs, useful or not, are only accelerating this.

Just yesterday I used an LLM to write some docs for me, and for a little bit where I mistakenly thought the docs were fine as they were (they weren't, but I had to read them closely to see this) it felt like, "wow if the LLM just writes all my docs now, I'm pretty much going to forget how to write docs. Is that something I should worry about?" The LLM almost fooled me. The docs sounded good. It's because they were documenting something I myself was too lazy to re-familiarize with, hoping the LLM would just do it for me. Fortunately the little bit of my brain that still wanted to be able to do things decided to really read the docs deeply, and they were wrong. I think this "the LLM made it convincing, we're done let's go watch TV" mentality is a big danger spot at scale.

There's an actual problem forming here and it's that human society is becoming idiocracy all the way down. It might be completely unavoidable. It might be the reason for the Fermi paradox.

WillAdams · 11m ago
Marshall McLuhan called this out ages ago:

>every extension is also an amputation

that said, it is up to society, and to a lesser extent individuals to determine which skills will be preserved --- an excellent example of a rational preservation of craft teaching in formal education is the northern European tradition of Sloyd Woodworking:

https://rainfordrestorations.com/tag/sloyd/

>Students may never pick up a tool again, but they will forever have the knowledge of how to make and evaluate things with ... hand and ... eye and appreciate the labor of others.

bitwize · 6h ago
I love how he talks about knowing thermal characteristics, etc. and then cites the TI-99/4A as an example of something designed by people who Really Knew What They Were Doing. The TI-99/4A was notorious for being prone to overheat due to its power supply. Munch Man and Parsec were complete non-starters for me when it got hot in July. This was even mentioned, specifically, in Halt and Catch Fire. The early microcomputer engineers were spitballing. You want to talk about the number of bodge wires that were in every TRS-80? Or the Apple III having no thermal vents per order of Steve Jobs, and the "you're holding it wrong" of the 80s being "drop it on your desk to fix it"? We know how to build better, more reliable computers for cheaper today than we ever did in the 80s. Then we fuck them up with things like Windows, but still.
ivape · 6h ago
We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work.

There's an education gap that needs to be addressed, but I don't know how it will get addressed. A lot of the web in the past few decades came from industry so industry had a way of training up people. Most of this ML stuff is coming from academia, and they aren't really the best at training up an army at all.

It's hard to know who to blame for all of this because it's kind of like not having an early warning asteroid detection system. HN or various communities did not have discussions even five years prior to GPT about the impending doom (no early warning at all). If you just take HN, we sat around here discussing a million worthless things across Rust/Javascript/startup b.s for years like headless chickens (right here on the frontpage) without realizing what was really to come.

Makes me wonder if the places I go for tech news are enough to be prepared. Which brings me back to what I quoted:

We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work.

We aren't creating them. They are the existing devs that had no idea AI was going to be a thing. Had anyone known it was to be such a thing, everyone would have ditched going to web development bootcamps in the mid 2010s.

dangus · 4h ago
Ugh, another one of these articles.

This idea that we don’t understand the internals of anything anymore and nothing is reliable is a mix of nostalgic cherry-picking and willful ignorance of a lot of counter-examples.

Sure, a bunch of consumer appliances are nebulous, but they are designed for those tradeoffs. It’s not like your old VHS player was designed specifically to be easy to repair either.

The author is complaining about their advanced networking feature breaking on a router intended for consumers. Why they haven’t upgraded to a prosumer setup is a mystery; OPNsense on a mini PC combined with some wireless access points is one way to go that offers a lot more configurability and control.

Complaining that not everyone can understand low level hardware is ignorant of all the really cool low level hardware and maker communities that have exploded in recent years, and it’s ignorant of the fact that specialization existed back in the “good old days” as well. For example, we had separate transmission and body shop specialists in the mid-century, you couldn’t just go to any mechanic to fix any problem with your car.

I’d like to see someone in the VHS era design a printed circuit board using CAD software and get it printed on-demand, then design an enclosure and 3D print it in their house for pennies. You can design your own keyboards and other electronic gadgets and basically own a little factory in your own home these days. You can share designs with ease and many of the tools are open source. The amount of sophistication accessible to the average person is incredible these days.

hluska · 5h ago
> The same publications that use “disruptive AI” unironically are the ones that need to Google “what is a neural network” every time they write about machine learning.

This is called “good journalism”. It would be great if Elektor tried practicing it.

wyager · 6h ago
> Large language models are impressive statistical text predictors — genuinely useful tools that excel at pattern matching and interpolation.

Slightly OT: It's interesting how many (smart!) people in tech like the author of this article still can't conceptualize the difference between training objective and learned capability. I wonder at this point if it's a sort of willful ignorance adopted as a psychological protection mechanism. I wonder if they're going to experience a moment of severe shock, just gradually forget that they held these opinions, or take on a sort of delusional belief that AI can't do XYZ despite all mounting evidence to the contrary.

possiblyreese · 5h ago
Couldn't agree more. I thought we were past the "stochastic parrots" phase, but it seems some people are incapable of accepting these models have emergent capabilities.
ReptileMan · 6h ago
Can you elaborate a bit?
gjm11 · 5h ago
(Not GP, but:)

LLMs' initial training is specifically for token-prediction.

However, this doesn't mean that what they end up doing is specifically token-prediction (except in the sense that anything that generates textual output can be described as doing token-prediction). Nor does it mean that the only things they can do are tasks most naturally described in terms of token-prediction.

For instance, suppose you successfully train something to predict the next token given input of the form "[lengthy number] x [lengthy number] = ", where "successfully" means that the system ends up able to predict correctly almost all the time even when the numbers are ones it hasn't seen before. How could it do that? Only by, in some sense, "learning to multiply". (I haven't checked but my hazy recollection is that somewhere around GPT-3.5 or GPT-4 LLMs went from not being able to do this at all to being able to do it fairly well on moderate-sized numbers.)

Or suppose you successfully train something to complete things of the form "The SHA256 hash of [lengthy string] is "; again, a system that could do that correctly would have to have, in some sense, "learned to implement SHA256". (I am pretty sure that today's LLMs cannot do this, though of course they might have learned to call out to a tool that can.)
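To make that concrete, here is a minimal sketch (Node.js, using the built-in node:crypto module; the input string is arbitrary) of the reference computation such a completion would have to reproduce:

```typescript
// Minimal sketch of the computation a model would have to carry out, in some
// form, to correctly complete "The SHA256 hash of [string] is " for strings
// it has never seen. Uses Node's built-in crypto module; the input is arbitrary.
import { createHash } from "node:crypto";

const input = "a lengthy string the model has not seen before";
const digest = createHash("sha256").update(input).digest("hex");

console.log(`The SHA256 hash of "${input}" is ${digest}`);
// Getting the digest right for novel inputs requires actually performing
// SHA256; there is no shortcut through surface statistics over training text.
```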

If you successfully train something to complete things of the form "One grammatical English sentence whose SHA256 hash is [value] is " then that thing has to have "learned to break SHA256". (I am very sure that today's LLMs cannot do this and I think it enormously unlikely that any ever will be able to.)

If you successfully train something to complete things of the form "The complete source code for a program written in idiomatic Rust that does [difficult task] is " then that thing has to have "learned to write code in Rust". (Today's LLMs can kinda do some tasks like this, and there are a lot of people yelling at one another about just how much they can do.)

That is: some token-prediction tasks can only be accomplished by doing things that we would not normally think of as being about token prediction. This is essentially the point of the "Turing test".

For the avoidance of doubt, I am making no particular claims (beyond the illustrative ones explicitly made above) about what if anything today's LLMs, or plausible near-future LLMs, or other further-future AI systems, are able to do that goes beyond what we would normally think of as token prediction. The point is that whether or not today's LLMs are "just stochastic parrots" in some useful sense, it doesn't follow from the fact that they are trained on token-prediction that that's all they are.

MountainMan1312 · 4h ago
It's like how when you wrote that comment, the thing you were doing wasn't "operating your finger muscles"
goopypoop · 6h ago
JavaScript is required to read this rant

edit: lol

charcircuit · 5h ago
This article is against accessibility. It's not that people are forgetting. We just don't require them to know everything to build things.

>Edge computing? That’s just distributed processing with better marketing.

Edge computing is not "just" distributed processing. That fails to recognize the point of minimizing latency.

>Microservices? Welcome to the return of modular programming, now with 300% more YAML configuration files.

Not all modules are microservices. Again, it's a term for a more specific practice.

>Serverless? Congratulations, you’ve rediscovered time-sharing, except now you pay by the millisecond.

Those are somewhat related concepts but they still don't have the same meaning.

>Compare that to today’s black-box system-on-chip designs, where a single failure means the entire device becomes e-waste

If you really wanted to, you could fix the system-on-chip.

>We’ve mistaken complexity for sophistication and abstraction for advancement.

People are not adding complexity and abstractions just for fun.

>We’ve created a tower of dependencies so precarious that updating a single package can break an entire application

This has always been the case. Bugs still existed in the 1900s.

>What started as a legitimate return to hands-on engineering has been co-opted by influencer culture, where the goal isn’t to build something useful but to generate content about building something photogenic.

Social media being dominated by people good at social media and not by the top makers will happen in every endeavor. Accessibility has allowed many more people to be able to create basic things.

>We’re creating a generation of developers and engineers who can use tools brilliantly but can't explain how those tools work

They don't need to. And this has always been the case. There is too much to know, and having different people specialize in different things is effective. Additionally, there is great value in making software accessible, enabling people with less knowledge to make things. It allows for more things to be created that deliver value to people.

>The best engineering solutions are often elegantly simple. They work reliably, fail predictably, and can be understood by the people who use them. They don't require constant updates or cloud connectivity or subscription services.

Sure, but many people want a solution that can be delivered now and for cheap.

photochemsyn · 5h ago
Obviously a lot of design and engineering tasks these days don't have the goal of producing robust, repairable, long-lived hardware and software; where's the profit in that? If iPhones lasted twice as long as they do now, wouldn't sales drop by 50% unless consumers decided they preferred the longer-lived phones? That would create pressure on all manufacturers to produce long-lived phones with easily replaceable batteries and publicly available repair kits. And what happens then? The entire market shrinks for all phone manufacturers, and the shareholders throw tantrums over lost profits.

In the Gilded Age of the late 19th and early 20th centuries, every industry was dominated by trusts, collusions of manufacturers who set up anti-compete systems to ensure that such disruption of their industry by independent innovation wouldn't succeed. This is now being replayed in the tech industry for similar reasons.

There are two solutions to this problem that go together: anti-trust law and open-source hardware and software models. But for that to work, you need an educated population with an understanding of legal and scientific concepts, which is why the education system in the USA has been so deliberately degraded over the past few decades.

That's what happens when you let investment capitalists control everything, isn't it?

jeroenhd · 5h ago
> Modern developers debug through seventeen layers of frameworks to discover that their problem is a missing semicolon in a configuration file generated by a tool that abstracts away another tool that was created to simplify a process that was perfectly straightforward twenty years ago.

But it wasn't straightforward twenty years ago. Maybe it was to you, but it wasn't to others. There's a reason the world moved away from command line interfaces and it's not just to bully the nerds.

Same reason many Americans don't know how to use a clutch, and why chopping down trees for your own house has fallen out of fashion. As society specialises and technology advances, responsibilities are divided.

> The VHS player in my basement could be fixed with a screwdriver and a service manual (OK, sometimes an oscilloscope). Meanwhile, my Wi-Fi router requires a PhD in reverse engineering just to figure out why it won’t connect to the internet. We’ve mistaken complexity for sophistication and abstraction for advancement.

A VHS player is built on top of tons of abstractions. There's a component somewhere in the middle that will take electric pulses and turn them into subtitles you can turn on or off. Just like that WiFi router still has its analog pins you could hook your oscilloscope up to if you want to troubleshoot it.

We have lost service manuals for many electronics indeed, but that's because servicing these devices no longer earns anyone a living. Electronics as complex as VHS players have dropped in price from a month or two's wage for a whole family to the price of eating out. Spending half a year teaching yourself electrical engineering to maintain your TV isn't worth the time investment anymore, unless you're doing it out of personal interest.

You can repair the failed electronics on WiFi routers. You don't need to, though, because the electronics no longer constantly fail like they used to. The skills electrical engineers from the last century have proudly honed just aren't as relevant as they used to be. The "old man yells at cloud" complaint that kids these days don't even know assembly is just as relevant as it was in the days when assembly was commonplace, when kids those days didn't even know how to program spinning drums or knit magnetic core memory without the help of an assembler.

Billions of people drive cars every day. Most of those people have no idea how their car works beyond the ignition, yet the world relies on that technology and it's working just fine. Cars do break down sometimes, and that's when you call in the experts. The people who know the ins and outs of assembly, machine code, and CPU microcode, still exist. The difference between back then and now is that, like cars, you don't need years of education before you can comfortably use the devices anymore.

I too lament the overly complex software ecosystem of today, the simple apps that have grown to hundreds of megabytes, the Javascriptification of our world, but that's not a failure of society. It's what you get when you go for the cost-optimised computing approach that has led to supercomputers in our pocket, rather than the quality-over-features approach that you needed back when people spent more than a year's worth of meals on relatively simple electronic devices.

floppyd · 6h ago
"Old man yells at cloud", but in so-so-so many words
torlok · 6h ago
It's always old people complaining that 20-year-olds didn't learn programming 40 years ago like they did when computers had 5 assembly instructions, and a beeper for peripherals.
palata · 6h ago
I understand how it may sound like this, given that older people will talk about assembly and electronics which most young developers have absolutely no clue about today and are still considered "software developers".

But it's not specifically about assembly, it's about software design. You can take a modern programming language (say Swift or Rust) and look at how software written with those languages is architected, and the points still stand: abstractions above abstractions above abstractions because people don't understand the lower levels.

People routinely write completely wrong CMakeLists and then complain about CMake being "sooo bad". But give them Meson and they will make a mess as well. People have no clue about packaging and distribution, so they will say "it sucks sooo badly" and will distribute their code as a docker container, embedded in a 6GB Ubuntu image. Most emails you receive contain just a couple of lines of useful information, yet they are generated by higher-level systems, full of HTML and bullshit, and are impossible to read in a simple client. Etc.

Software quality is going down year after year; that is a fact. Probably because the field is becoming more and more accessible, but the fact remains.

CoastalCoder · 5h ago
> People routinely write completely wrong CMakeLists and then complain about CMake being "sooo bad".

I think this conflates a few issues.

I believe you that some people have problems with both CMake and Meson.

But in my opinion CMake's scripting language really is pretty poorly suited for its role, e.g. because of its blurry distinction between strings and lists.
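
For what it's worth, a minimal sketch of the kind of blurriness I mean (from memory, assuming any reasonably recent CMake; the variable names are made up):

    # In CMake a "list" is just a semicolon-separated string, so whether a
    # variable behaves as one value or many depends on quoting at the call site.
    set(SRCS main.c util.c)       # SRCS now holds the single string "main.c;util.c"
    message(STATUS "${SRCS}")     # quoted: one argument  -> prints "main.c;util.c"
    message(STATUS ${SRCS})       # unquoted: two arguments, concatenated -> prints "main.cutil.c"

    # And an ordinary string that happens to contain a semicolon silently
    # becomes a two-element list:
    set(DIR "C:/odd;path")
    list(LENGTH DIR DIR_LEN)      # DIR_LEN is 2, not 1

It's the kind of thing you only notice once a path or a flag containing a semicolon quietly splits apart halfway through the build.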

scott_w · 6h ago
> I understand how it may sound like this, given that older people will talk about assembly and electronics which most young developers have absolutely no clue about today and are still considered "software developers".

As a software engineer in his mid-30s now, I can assure you many "older people" will have little-to-no memory of messing around with assembly and electronics. When I was getting started, my boss told me about an engineer who had a deep knowledge of how to lay out data to efficiently read and process it. My response? "I just stick it in Postgres and it does it all for me!" No shade to that engineer but I do believe he was in his 50s/60s at the time, so it's quite likely he's retired on a decent pension by now!

MountainMan1312 · 4h ago
> Most emails you receive just contain a couple lines of useful information

No matter how many times I do it, I'm always re-shocked by the sheer size of email headers from mainstream email providers. Maybe it's a non-issue but just holy god that's a lot of crap that means absolutely nothing to me.

> [...] and then complain about CMake being "sooo bad"

OoOh yeah I'm one of those. I gave the whole heck up on C++ years ago because of the many many interlocking compilers and compiler-compilers and meta-compilers and makers and whatever else is going on. SOOOOO confusing. Like dude I just want to take this code I have right here... and compile it. Such a simple task but first I have to learn 3 layers of compilers and builder and uggggghhhh.

And don't even get me started on "projects" (i.e. in Visual Studio or whatever). "Project" is not inherent to the language, but they don't teach that to beginners. Everything is taught so anti-agnostically.

palata · 6h ago
I would be curious to know if:

- You have enough experience to agree with the post but feel like writing such a post won't change anything, or

- You are more of a junior and don't understand what the author means because you've grown up with the engineering quality we have today.

Given that you consider this to be "so-so-so many words", I would bet it is the second option. Older people wouldn't consider this a huge text.

Am I right? :-)

No comments yet

worldsayshi · 6h ago
> we’ve traded reliability and understanding for the illusion of progress.

I wish there was a somewhat rigorous way to quantify the reliability of tech products, so that we could tell whether tech was, on average, about as buggy in the past as it is now.

I mean I also feel that things are more buggy and unreliable, but I don't know if there's any good way to measure it.

palata · 6h ago
If you look e.g. at websites, I think we can measure that the average website today is slower to load than 15 years ago, even though hardware is orders of magnitude faster.

Another thing is that today, you receive updates. So bugs get fixed, new bugs get introduced, it makes it harder to track other than "well, there are always bugs". Back then, a bug was there for life, burnt on your CD-ROM. I'm pretty sure software shipped on CD-ROM was a lot more tested than what we deploy now. You probably wouldn't burn thousands of CD-ROM with a "beta" version. Now the norm is to ship beta stuff and use 0-based versioning [1] because we can't make anything stable.

Lastly, hardware today is a lot cheaper than 25 years ago. So now you buy a smartphone, it breaks after a year, you complain for 5min and you buy a new one. Those devices are almost disposable. 25 years ago you had to spend a lot of money on a PC, so it had to last for a while and be repairable.

[1]: https://0ver.org/

scott_w · 6h ago
From playing video games, I'm going to say that generally games are more reliable today than they were in 2005, and I recall a few absolute fucking howlers from the 90s and 2000s (looking at you, Digimon World!)

My comparison points: Fallout 3 being a shitshow of bugs on release vs Final Fantasy 7 Remake & Rebirth feeling practically bug-free on release. In fact, I don't think I hit any bugs in Final Fantasy 16 either.

zargon · 4h ago
Rest assured, Bethesda’s next release will be another bugfest. Not exactly a useful comparison.
scott_w · 1h ago
They probably will but they’re the comparisons I have to hand and, let’s be honest, this entire discussion is based on anecdotes.
troupo · 3h ago
On the other hand you didn't have "download this multigigabyte patch on day 1" because once burned to a CD it was there forever.

Same for "you haven't touched your game console? here's 200 gigabytes of updates at 200kb/s"

scott_w · 1h ago
Having had a PS4 and a PS5, I’d say even this has improved.
ipcress_file · 6h ago
I'm conflicted. I remember when I bought my first vehicle with electronic ignition. It wasn't as good as the electronic ignition today and I had to replace a few black box components that I didn't understand (what is a "thick film module" anyways???). So I was irritated and wished that my truck still had points and a regular distributor.

Flash forward to today. I can't remember the last time I replaced an ignition system component. I still don't know how they work. I can guarantee that the techs who do occasionally replace them at the dealer don't know how they work. But the whole system is so much more reliable.

That said, I do wonder how young people are supposed to learn and gain understanding in a world where they cannot possibly understand the components in complex systems. Back in the day (I know, yelling at a cloud), I could actually understand how a set of points and a distributor -- or even a three-transistor radio -- worked.

troupo · 6h ago
I mean, it takes over a second, with a blocking loading indicator, to change the car temperature by one degree: https://grumpy.website/1665

This... This is quite quantifiable

taikahessu · 6h ago
Well, it's obviously true new stuff breaks fast, can't argue with that. But it's deliberate, not forgetfulness. Just capitalism on steroids.
worldsayshi · 6h ago
> Deliberate

I've been thinking about this; I suspect that a lot (but not all) of "planned obsolescence" comes down to simply not acting on a flaw that happens to align with your interests once it appears. Can that be thought of as deliberate?

It's a trolley problem kind of question I guess.

ozim · 5h ago
Meh, fluff article without substance.

Title misses the mark and hand-waves away a shitload of the progress we've made.

Nagging about consumer electronics is fine because a lot of stuff has its issues. But compared to 20 or even 10 years ago everything on average works much better.

Procrastes · 4h ago
I loved my TI-99/4A. I used to think it was ahead of its time, but now I realize it was from an altogether alternate timeline where we built stuff to work.