Ask HN: What will tech employment look like in 10 years?
I do not predict the elimination of the humble coder, but the covid hiring wave has come and gone, and Big Tech has for the most part successfully shed the roles it hired for during that wave: frontend, backend, and fullstack engineers. I think LLMs have successfully learned the patterns of code these positions require, and in many cases a single staff engineer with experience and a trusty LLM is as productive as a team of 2-4 junior engineers led by a senior engineer was a mere five years ago. I do not expect much expansion in this "traditional" web development (these positions have existed in their modern form for only about 20 years, roughly since Rails was first released).
Many, such as Amjad Masad and Beff Jezos, are of the opinion that for those who would have taken these positions before, one option is to drill down the stack towards the bare metal, by reason of the relative difficulty of embedded engineering: one struggles to imagine high-stakes software, such as that in a SpaceX rocket, Boeing airplane, or Anduril drone, relying primarily on vibe-coded slop hastily LGTM'd into production. So the kind of software that requires large amounts of formal, simulated, or physical verification still seems necessary, but it is much more difficult to write than a webpage. An expansion of the labor market for those writing C, C++, and Rust in the context of operating systems, embedded systems, microcontrollers, drivers, and so forth seems likely.
The other option seems to be to leave the stack entirely and leverage small teams to create niche, targeted applications for small segments of users. There has been some success in this area as well, but it requires a much broader skillset than simply being an expert programmer who understands some computer science.
The options seem to be to start reading either Bjarne Stroustrup or Peter Thiel. But the skill floor for either path is fairly high, and in the short term I predict a sustained contraction in the software engineering labor market while people adapt their educations and long-term career goals. I don't see headcounts at FAANG recovering soon, if ever. This has broader implications for the traditional startup route, where one earned their stripes at FAANG before launching their own venture, but I digress ...
The workflows we have are not quite right for this. Coding has always been 10% writing code and 90% debugging, but I think the rate at which we generate that 10% will grow exponentially.
This means that the debugging has to grow. We will generate errors at an unprecedented rate.
LLMs trained on previous errors and debugging methods won't catch the new ones; they'll be more complicated and spread out over the code.
We need new tools to visualize code and track errors. I think what it means to be a programmer will change: more testing and thinking, fewer klocs.
The projection from current SOTA is that coding LLMs replace 1 senior dev + 5 junior devs with 1 senior dev + LLM.
Ergo, there will be fewer junior dev positions and senior dev salaries can capture some of the additional value they deliver.
Whether that's offset by the increase in senior supply as junior -> senior conversions happen...?
The two areas that will likely expand demand to absorb those junior devs are:
(1) Integration engineering (i.e. poorly-documented interface to poorly-documented interface)
(2) Testing (because if klocs are cheap, there will be more money to validate them -- and it will make people feel safer to have a human in the loop before sign-off / production).
Coding LLMs are always going to be better at things well-represented in their training sets. Which is to say, the most popular languages, API styles, apps as of todayish.
So, other things will be the best use of human time.
Thanks - I needed a good laugh.
I copy-pasted a part of my code into Claude.ai to help me add a few lines. It completed that part and also rewrote another few lines. When I asked why it did that, the explanation was that I had missed an edge case.
LLMs are good for writing boring parts of code and also helpful for catching bugs.
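As a hypothetical illustration of the kind of edge case meant here (this snippet is invented, not the commenter's actual code), a minimal sketch in Python:

    def average(values):
        # Original version: crashes with ZeroDivisionError on an empty list.
        return sum(values) / len(values)

    def average_fixed(values):
        # LLM-style rewrite: handles the missed empty-input edge case.
        if not values:
            return 0.0
        return sum(values) / len(values)

    print(average_fixed([]))  # 0.0 instead of a ZeroDivisionError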
How did the finance world adjust to a world where financial transactions were automated on the blockchain using smart contracts?
How have cities adapted to the massive migration of in-person experiences to the metaverse — and so soon after they rebuilt all their physical infrastructure around the revolutionary personal transportation system known as the Segway[0]?
[0] none other than Steve Jobs predicted that cities would be designed around the Segway https://www.theguardian.com/world/2001/dec/04/engineering.hi...
(Not to say that revolutionary world-changing progress is not _possible_, but for every Industrial Revolution there are quite a lot of Segways and metaverses.)
This isn't C to Python, it's Python to more Python faster.
However, I think there is a second thing that is often overlooked here. The angle is always 'oh, companies can replace developers', but no one seems to consider that developers can replace companies. I think you are going to see small teams of very skilled people able to make amazing products. There are limits to what LLMs can do on their own, but a skilled engineer who masters the tooling, can unblock the models, and knows the techniques to keep the models productive as the codebase grows will be able to produce things far beyond what they could ever produce before. I think you are going to see an explosion of new, smaller companies. If you are an engineer, you shouldn't be gloomy; you should be excited.
A LOT of newly created firms will suddenly find they require programmers, developers, etc. Hence a lot of jobs.
We'll need some improvement in the current crop of robots, but not much.
I think the developer market is about to expand massively if tariffs spread, because they will force a lot of countries to move a lot of production back onshore as goods are blocked or obstructed at the borders.
Of course, either I'm totally wrong or people haven't yet really realized they can do this, as it's not happening on the level I'm predicting ... yet.
One thing that is constant is the billions of people using the internet and their buying power, so I think there will also be tremendous opportunity for people able to release good software while mid-size, large, and huge tech companies focus on eliminating the expensive tech salaries that built and sustain their fortunes. Their software is going to become shittier until LLMs get much better.
It's just one more step in the multi-generational trend of eliminating/outsourcing lower skilled jobs and increasing the barriers to entry for the remaining jobs. I'm not optimistic.
Outsourcing isn't just grunt work. It's almost all work, but cheaper and with less control over the outcome. There will be no "guiding people/LLMs".
There are talented people all over the globe, and to date what the best of the best did was move to the US. Those were the people guiding/managing outsourced teams, not Americans. You see this now with the "Indians hiring Indians" phenomenon.
Trump put the brakes on that, so currently the only thing that's certain is uncertainty.
My take is that there will be an attempt to replace at least the bottom end of outsourcing with LLMs, so more pressure on offshore labour to perform and, over time, fewer positions overall, as junior roles have already evaporated around here and eventually that will translate to fewer seniors.
It might not be only grunt work, but it is mostly grunt work. That lack of control has led to our org only outsourcing grunt work and low risk projects because of the higher failure rates we've seen. We are starting to use on-shore senior+ and TLs to oversee the work the off-shore resources are doing to help control the outcomes more.
Also FAANG and the like perhaps didn't outsource, but definitely offshored much of their operations.
The current era of making the most complicated fucking thing possible is clearly nearing its conclusion. The writing is on the wall with the now weekly conversations about vendor and dependency management crises.
One might say to use something with more batteries included or whatever. I'd go 5 steps further - have you solved the problem outside the computer yet? Do you understand the domain model? Do you understand why the customer wants to pay for any of this bullshit in the first place?
I think that's approximately what it's going to look like. Excel, bash and powershell replacing kubernetes clusters.
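As a toy sketch of what those boring tools might replace (Python here, though the comment names Excel, bash and powershell; the app command and health endpoint are invented), a single-file supervisor covers the restart-on-failure behavior that many small deployments reach for an orchestrator to get:

    import subprocess
    import time
    import urllib.request

    APP_CMD = ["python3", "app.py"]              # hypothetical entry point
    CHECK_URL = "http://localhost:8000/health"   # hypothetical health endpoint

    def healthy():
        # True if the app answers its health check in time.
        try:
            with urllib.request.urlopen(CHECK_URL, timeout=2) as resp:
                return resp.status == 200
        except OSError:
            return False

    proc = subprocess.Popen(APP_CMD)
    while True:
        time.sleep(10)
        if proc.poll() is not None or not healthy():
            proc.kill()   # no-op if the process already exited
            proc.wait()   # reap it before restarting
            proc = subprocess.Popen(APP_CMD)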
The financial pressure will eventually correct these issues. The cost of borrowing is non-zero now. Investors are going to want to see happier customers much sooner than before. Hobbling along with a handful of marginally happy customers for a decade isn't going to cut it anymore.
The most likely corrective measure is that the company goes away because they couldn't course correct in time. This is often the tragic consequence of tech egos (often the CTO) refusing to be sublimated in service of the actual business needs.
After a week, take note of how many hours you spend actually writing code.
I won’t share my exact count, but it’s shockingly low in relation to all of the hours spent working.
My bottleneck to more productive output is 100% not “unable to write more code faster”. It’s actually people. Other people.
I think we’ll see advancements in robotics and more hires there.
And I think there will be more jobs around the LLM ecosystem — progress on foundational models, inference optimizations, on prem migrations, networks of agents, AI more deeply integrated with existing sw.
Overall I think there will be more jobs in observability, security, and infrastructure.
I agree there will be fewer junior positions. I’ve written about some of these ideas before including a deskilling for new practitioners https://matthewbilyeu.com/blog/2025-03-08/ai
The rate of adoption of new APIs will slow down considerably.
LLMs only really know what they're taught, and when a new API comes out, the body of learning material is necessarily small. People relying on LLMs to do their jobs will be hesitant to code new things by hand when an LLM can do the same using older APIs much faster.
Who is going to do it then? Well, someone has to or else the API in question won't see widespread adoption.
People will stop publishing any new knowledge or method to keep it from the LLMs and to keep people from exploiting new ideas before they can.
The computer engineering domain will continue to grow, due to 1) the need for better GPUs and CPUs and 2) the need for data centre infrastructure engineers.
The LLM ecosystem will also continue to grow, along with the machine learning ecosystem. Regression-based models cannot be replaced by LLMs, because LLMs are inherently unreliable at producing consistent output for structured problems.
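A minimal Python sketch of that consistency point (the data below is invented): an ordinary least-squares fit is deterministic, so the same structured inputs always yield the same coefficients and predictions, unlike sampled LLM output.

    import numpy as np

    # Invented structured data: two features plus a target.
    X = np.array([[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]])
    y = np.array([5.0, 4.0, 11.0, 10.0])

    A = np.c_[X, np.ones(len(X))]            # add an intercept column
    coef1, *_ = np.linalg.lstsq(A, y, rcond=None)
    coef2, *_ = np.linalg.lstsq(A, y, rcond=None)

    assert np.allclose(coef1, coef2)         # identical on every run
    print(A @ coef1)                         # reproducible predictions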
Initially there will be a huge employment downturn as organisations follow the pattern of hiring a Gen Y expert and supplementing them with Gen Z juniors using increasingly good AIs.
After a while the Gen Y seniors will retire and/or die, and it will be impossible to source a senior developer with actual knowledge. Wages will skyrocket for a few Gen Z experts, but in general there will be a shortage, and the entire industry will eventually reboot, although I can only speculate how exactly.
At some point we will lose the ability to make rapid advances in AI, because engineers will stop understanding how these systems work due to a lack of deep understanding of mathematics etc., and the rate of advancement will drop by 100x or 1000x.
Due to people becoming very lazy on average, capitalism will become ineffective, and most governments will veer towards centralised services or central planning/communism to cope with the lazy populace. A few people with a work ethic approximating today's founder work ethic, or with deep skills, will be worshipped.
Recently, there was an analysis of when we might get to Post-Labor Economics:
> 2025 to 2030: Collapse of knowledge work. The "KVM Rule" applies: any job you can do entirely with a keyboard, video, and mouse will be fully replaced.
https://x.com/daveshapi/status/1916188978727784847?t=9YNl90V...
Even that seems too fast. There will surely be more innovation, some of it unanticipated, but knowledge work will not all be gone five years from now.
I'm curious, what makes you think this?
Did you see the recent case of the airline's customer-service LLM that promised someone a refund because a relative had died? The courts forced the airline to honour it despite the promise being inaccurate (turns out LLMs have more humanity than a corporation).
Putting your company in the hands of an LLM workforce could save a few dollars, but it can cost a lot too.
Generalists will be undervalued, but will always be able to find a job.
There will be pockets of extraordinary cruelty and pockets of extraordinary grace.
Where you find yourself on the cruelty/grace/specialist/generalist surface will still be up to you, but also, just as it has been for thousands of years, also up to fate.
Do your best, control the things you can, and accept the things you can't. Remember that work is just work, not living.
Hang in there, it's only 10 years more to get there...
It'd suck a bit to be automated out of the thing you've spent a few years studying if you're a fresh grad, not to speak of anyone with more experience. While I guess one shouldn't complain, since no work brings shame, going back to being a cashier or a warehouse worker would be a bit of a letdown.
If the machines zig, you zag.
I think we are SEVERELY underestimating the amount of slop that is going to come from this.
I haven’t seen anything similar in any other engineering industry.
No new tests, no new design, no new implementation. Literally just a single word swapped.
Turns out the engineering team was uncertain about the new testing framework that was mandated, and rather than document that, they blew up their velocity.
This approach bites back. When work isn't understood, people on the outside disrespect the workers.
I am very surprised that no one on HN can see the correlation between the narrative and speculation and the absurd figures being thrown around, only to be repeated by actual people whose image is being tarnished because the mass narrative wants them to feel powerless (the first stage of wage suppression).
I'm not saying you're wrong though, but just that it won't be much different.
The reasons lie in Taleb's antifragility thesis: the antifragile gains from disorder.
The nature of the tech industry, in the decades roughly since the Cold War ended (which put to rest a certain pattern of tech focused on the MIC and moved SV forward into its leadership position), has promoted fragility along several of Taleb's dimensions: it aims to scale, it aims to centralize, and it aims to intervene. The pinnacle achievement of this trend is probably the iPhone, a convergent do-everything device that promises to change your life.
But it's axiomatic (in Taleb's view, which I will defer to since his arguments are good enough for me) that this won't last, and with talk of "the end of the US empire" and a broader pessimism in tech, there seems to be popular agreement that we are done with the scale narrative. AI is a last holdout for that narrative, since it justifies further semiconductor investment, stokes national-security fears of an "AI race", and so on; it appeals to forces with big pocketbooks that are also big in scale, and themselves in a position of fragility. But eventually they will tap out too, for the same reasons. Whether that's a "next year" thing or a "twenty years" thing is hard to predict; the fall of the USSR was similarly hard to predict.
The things that are antifragile within software are too abstract to intentionally develop within a codebase, and are more a measure of philosophy and how one models the world with data - CollapseOS is at the extreme end of this, where the viewpoint is bacterial - "all our computing infrastructure is doomed if it is not maintainable by solo operators" - but there are intermediate points along it where the kinds of software that are needed are in that realm of a plugin to a large app or an extension to an existing framework, and development intentionally aims not to break outside that scope. That thesis agrees with the "small niches" view of things.
Since we have succeeded in putting computers into every place we can think of, many of the things we need to do with them do already have a conventional "way of doing it," but can exist in a standard and don't need to be formalized behind a platform and business model - and LLM stuff does actually have some role in serving that by being an 80% translator of intent, while a technical specialist has work in adding finish and polish. And that side of things agrees with going deeper into the stack.
I believe one pressing issue that's faced with this transition is the inclination brought from the corporate dev environment to create a generalized admixture of higher and lower level coding - to do everything as a JS app with WASM bits that are "rewritten in Rust" - when what indie software needs is more Turbo Pascals, Hypercards, and Visual Basics, environments that are tuned to be complete within themselves and towards the kinds of apps being written, while being "compatible enough" to deploy to a variety of end-user systems.
Average devs will no longer be needed, as everything they can do, a senior will be able to do in the same time with an LLM.
I think we'll go back to the system described in The Mythical Man-Month: one lead developer, one person below them for less important tasks, and a few domain experts not related to programming. Ironically, I think good front-end developers may remain useful as UI/UX experts.
It's not like they've done a great job so far. There was a time companies spent time and money on testing their software on the people who would use it - nowadays we have "experts" who know it all from school.
Enough time for LLMs to see marked improvement and possibly the hallucination issue to be significantly reduced or solved.
When that happens, the remaining software engineers won't be skilled. As with many people in principal, staff, or managerial positions, technical skills won't matter. AI now handles the technical skills, and the people controlling the AI are the people who can navigate the politics.
Like you made a claim here with no logical reasoning as to why.
My reasoning is simple if AI handles all technical skills then no technical skills are required for the job anymore. Where’s your reasoning?
In addition to this "unqualified hand waving", look at your own baseless statement. At least I qualified my statement with "it's possible".
If LLMs solve the hallucination issue, then it's stuff like this that will sit at the top of the hierarchy: people who make grand claims with confidence and play politics to get to the top. You say my statement is handwavy, but really that's another way of saying "I don't need to prove you wrong; I'm just going to make a baseless claim that your statement is utter crap, but do it in a way that subverts HN politeness rules. I can say it in a way that doesn't raise any eyebrows, and people who already agree with me will automatically take my side even though I didn't present any new reasoning".
This is what I’m talking about. It’s people who are good at strategies like this who will occupy the top spots in the future should LLMs continue to improve. Case in point. If karma was voting for the next person in a leadership position you would win.
This is a view utterly out of touch with reality. AI handling all technical skills? And who defines where technical skills start and end?
Will AI development/innovation stop in the future? And if not, will the engineers working on AI not be applying technical skills? Will AI eliminate the technical skill of plumbing? Of doctors? What about systems analysis and architectural design, a technical skill. Will AI read the minds of people, anticipate their every tech need, and eliminate that too?
Maybe some have a weird understanding of "technical". Its meaning is much broader than they think.
Oh, I just read your second paragraph. I'm obviously only talking about coding. Blue-collar jobs that involve a lot of manual skill are less of an issue; robotics has an upward trendline, but it's nowhere near moving at the breakneck pace of AI.
Technical skills will matter for the simple reason that someone will always need to supervise the output of the AI. A non-technical person can't hope to do so too well. Without this supervision, the AI's work product will have skeletons that will cause unexpected issues due to outliers. Reality is full of outliers - it's what pays half the salary.
Imagine asking an AI to design an airplane. The design passes all test flights and also software tests. Would you want to fly in it without an expert human having reviewed the ins and outs of the design?
How about a CT scan machine? Would you want to risk the 10x radiation due to a hypothetical implementation error that strikes in 1 out of 10K cases?
You pulled this statement out of your ass. Objectively? We have baseline quantitative tests that say the opposite. LLMs are doing better on those tests, and they've been improving. Where did your "objective" statement come from? Anecdotes? Quotations?
> Technical skills will matter for the simple reason that someone will always need to supervise the output of the AI.
Humans will never be so stupid as to lose all technical skill. In the beginning the human needs at most mild technical skill, enough to somewhat understand what the AI is doing. But as trust grows, the human will understand less and less of it. The human is like a desperate micromanager clinging to understanding, but that understanding inevitably erodes completely.