Ask HN: Do you think programming as a job will end soon and if so, how soon?
4 points by akkad33 8/21/2025, 8:44:10 PM 4 comments
So no, I don't think programming as a job will end soon, because there's no reason to think that it could. I've seen no plausible story about how that would even happen.
I do want to see big expensive products being built and released entirely by C-suites after laying off all their programmers/writers/directors/people who actually know how to do stuff. That should put an end to this madness pretty quickly.
1. AIs are really good at doing the equivalent of copy-pasting code you don't understand from many places, running the compiler, and moving stuff around until it compiles. But whenever I hit something slightly challenging, no AI gets remotely close to the solution. I can literally put everything in one function, explain to the AI why it doesn't work and why the solutions it generates don't work, over and over, and it never reaches the actual solution. For that, we need a human.
2. It's true that software quality is generally bad already and was declining even before AIs. Security is really bad. But we at least need some basic sanity. You can't have an AI pull in random unchecked libraries forever, and you can't even just hand it a list of allowed libraries, because AIs may well end up generating the malware in your code themselves. You need someone who can read and understand the code just to maintain the bad security we have today; otherwise it will get even worse.
3. All the examples of vibe coding nowadays seem short-sighted: what happens when you vibe code millions of lines? How does that work, and when does it just implode entirely? We already talk about how Microsoft may have lost skill and may no longer be able to produce the quality it did at its peak; what happens when literally nobody understands the code?
4. For better or worse, being more productive has never meant working less. When humans get a new tool that makes them more productive, they simply produce more instead of working less. Companies will keep hiring programmers even if AI makes those programmers more productive.
AI will probably stay as a tool for programmers. But it won't take their jobs.
Since the eighties, software (as problem/solution space) has expanded to fit the hardware available. At first we scaled with the density of transistors, and then, in the nineties, we scaled with the unfolding multitude of Internet-enabled use cases. Then, in the late aughts, we scaled again, this time with the unfolding multitude of mobile and embedded use cases.
Think of it as a sort of hot-air balloon -- each of these initial spurts is kind of like the guy in the balloon pulling the rope to heat the air; the balloon goes up. And each time we got to pull that rope, we got to go higher, and explore more and more of the problem/solution space, transforming the world as we went.
The thing is, hot-air balloons can only rise so high -- they have a context window of sorts, covering the lower and middle parts of Earth's atmosphere. But you can't get a hot-air balloon into orbit -- the context window gives out; you'd need to go back to earth and start from scratch (maybe build a rocket.)
What I find striking about AI isn't that it can replace all or most coders, it's that (to continue the metaphor) it makes the existing air in the balloon more buoyant -- there is less need for the guy pulling the rope to pull that rope (so if your job is rope-pulling-guy, watch out.) Yet -- and this is the crucial part -- AI is being _sold_ as if it's _more atmosphere_. I.e., someplace for both money and talent to _go_; a reason to keep pulling that rope.
For a while, it looked as if it was both -- a way to pull on the rope less frequently, and _also_ a new problem/solution space to go with the ol' balloon basket. But I'd reckon the _excitement_ about AI largely had to do with the second interpretation.
Now the picture is less certain. For some activities, AI still seems genuinely revelatory/apocalyptic, depending on which side of the manager/labour dyad you fall on. Yet recent studies (frequently alluded to here on HN already; scroll around) seem to show limited bottom-line benefit for a lot of use cases. This might mean there is UX work still to be done, or it might mean we're bumping up against the top of the balloon's useful range, skidding along the ceiling of the problem/solution space.
So, ironically, if AI turns out to be very useful, in the same way that word processing, email, and maps all proved useful, programming sticks around as a lucrative profession, modulo some changes in how we market and think about ourselves as engineers. We will use AI to build the new AI things that people want.
But if, for whatever reason, AI turns out to be less of a big deal than it initially seemed, then we rope-pullers are now in a situation where there is less need to pull that rope, as there is nowhere left to go (and worse, the balloon is simply more buoyant now, and needs fewer rope-pullers).
So if AI is a big deal, the party continues; if AI is _not_ a big deal, it's time to get other skills, as a stagnant market leaves the investment capital nowhere to go and our skill-set becomes commoditized. (Time to get a union.)