Ask HN: What are your thoughts on the impact of AI on programming careers?
16 HisoMorow 18 7/22/2025, 2:22:28 PM
I'm sure you have heard this question many times already, so excuse me if this post feels spammy, but the evolution of AI makes me worry about how it will impact my career. I'm a junior-level developer (not entry level, though) who has been working at a company that creates systems for banks. A coworker of mine attended a presentation given by the company that focused on a project that would normally be done by a group of 5 people over a span of 6 months, but was done by a single person in merely 6 days through a no-code / vibe-coding platform.
This made me worry more than ever about the threat of becoming obsolete in my career due to the evolution of AI. What do you all think here? Is a programming career under legitimate threat from AI, more so than other careers at least? And how long-term do you estimate the actual threat to be in this case? Lastly, how do you think a programmer should move in order to adapt to the new standards, aside of course from using AI to help with work whenever it makes sense to do so?
We got a batch (10+) of interns this summer, bright students from good schools, and they are absolutely glued to the AI tools. They're getting lots of code written and lots of tasks completed, but occasionally they write code that makes absolutely no sense and can't explain why they did it. Lots of very overcomplicated solutions to problems, and seemingly low comprehension of the code they're creating. It actually makes me feel pretty good about career viability long term -- people who have a deep understanding of the systems and code will need to be around to clean up the mess that is about to be created over the next 5-10 years.
AI is trained on what humans have already done. It can remix and automate, sure, but it doesn’t innovate or create like humans do — at least not yet. Reaching that level of creativity is a whole different game, and if AI ever does get there, we’ll have way bigger questions to worry about than job automation.
Bottom line: AI won’t replace us — it’ll reshape how we work. The role of the programmer will evolve, just like it has with every major shift in tools and tech.
In the short term, naive bosses may try to offload too much work to AI, but someday they’ll realize their mistake. I don’t think programmers are going anywhere.
For a long time, we've been coders, developers. Mentally in the code. But now we work on the code. We're now engineers. We coordinate how the parts work together. I like this definition of engine: "any of various mechanical appliances, often used in combination"
The most expensive model today is Claude Opus 4. It can draft plans. It can execute those plans. It can write tests, run tests, fix tests. All by itself. But if you don't keep an eye on it, it will run off and do something you didn't intend it to do for 20 minutes.
As a very seasoned developer, I have seen lots of projects built by a single person in a short time that "would normally" be built by more people over a longer period of time. What is invariably missing from these projects is exactly what the 5 people would have added. The main thing that is usually missing is robustness. These single-dev projects will usually fall over hard when they stray from the happy path (why would anyone enter a value other than a small digit for this input???); I'll sketch a concrete example of what I mean below.
Data integrity is often ignored as well, both internal to the application and how the data flows through the rest of the systems and the org as a whole.
Security is generally an afterthought as well.
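To make the robustness point concrete, here's the kind of gap I mean (a minimal, hypothetical Python sketch; the function names and bounds are made up for illustration):

    # Happy-path version: works in the demo, falls over the first time
    # someone types "ten", pastes a negative number, or leaves it blank.
    def read_quantity_naive(raw):
        return int(raw)

    # The boring version the "extra" people would have written: validate,
    # bounds-check, and fail with an error message a human can act on.
    def read_quantity(raw, max_qty=1000):
        try:
            qty = int(raw.strip())
        except (ValueError, AttributeError):
            raise ValueError(f"Quantity must be a whole number, got {raw!r}")
        if not 1 <= qty <= max_qty:
            raise ValueError(f"Quantity must be between 1 and {max_qty}")
        return qty

Multiply that difference across every input, integration point, and failure mode, and you've accounted for a lot of the other people and the other months.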
And then there's the elephant in the room, which is part of why these projects take a long time: they need to be defined and refined in the first place. It is not uncommon to find some edge case that requires a lot of effort to decide how best to handle, and it's often the product owner who needs to decide the tradeoffs for the product.
Incidentally, many of these reasons are why offshoring also fails.
So, where does that leave things? AI will undoubtedly allow a team of 9 to do what a team of 10 did 2 years ago (maybe even 3 of 5). AI will also potentially help with each of the things I've mentioned; I've had good luck with AI helping with missing test coverage. I've also had a lot of luck with AI helping where I have gaps in my knowledge about things. So I expect the quality of software will get much better too.
I will say, right now there is a big shake-up of the industry. I expect short-term prospects to be bad in the same way as during the offshoring frenzy of the early 2000s. What you are going to find is that long-term demand for technical talent will only go up. Efficiency in anything tends to increase demand: things get cheaper, and people find more uses for said thing.
Another thing to think about: I sort of saw the writing on the wall 6 years ago. I continually saw the barrier to entry fall in software. So I changed my niche and moved closer to the hardware, where fewer people seemed to be entering. I will tell you that the places LLMs struggle with the most are the places others haven't spent much time documenting. In the domain I currently work in, it is impossible to use any code auto-generation; there are too many places where it transposes things or makes up API calls. But the code it generates can be very useful when the documentation is ambiguous and some obscure forum post likely has an answer the LLM can draw from. So it is possible to stay ahead of AI by moving to the fringes.
Get that AI Engineering master's degree. Learn the math, the statistics, the models.
https://www.indiatoday.in/technology/news/story/bill-gates-s...
Everything I see is just "I feel more productive"
Where are the actual numbers?
What worries me is the trend and speed at which things are progressing, especially with the amount of money thrown at it.
If you believe the trend will stay then many of us will be out of a job in a few years. If you don’t then you’re safe.
There are already markets (translation, illustration, etc.) where AI has changed things significantly. Ours is slightly different because it requires formal things to be correct, unlike translations or illustrations, which can be slightly 'off'.
My view is that either AIs learn how to generate human-debuggable or already-correct code, or it won't work and the trend will stop. I believe there's a fundamental hardness here; if they break that, then other things will change and our world will be fundamentally different.
Note, this is all belief !
https://www.youtube.com/watch?v=shOLHtyUaFg
Basically it augments or provides boilerplate, but that is all.
> This made me worry more than ever about the threat of becoming obsolete in my career due to the evolution of AI. What do you all think here?
Whatever impact genAI has, there will be a strong demand for traditional devs for years to come. Worst case, where that demand comes from may change (for instance, it might not be from SV style companies), but it will still exist.
Anyone responding is going to find themselves downvoted (karma leeched) regardless of any legitimate points or position, and you can't have interesting conversations without measured disagreement. Value comes from things you don't know; "yes people" blindly follow the trend.
There's also no real point or value in a discussion when you have bad-actor bullies silencing truth they disagree with, or opinions that are rationally based, and then trying to gaslight and play off the lack of anything to the contrary as if their sentiment were simply what everyone thinks (because downvoted posts disappear beyond a threshold).
It's not, and HN is just a minor step up from 4chan, with perspectives favoring Chinese talking points most of the time, almost completely unmoderated with few exceptions. The sophisticated propaganda, manipulation, and other stuff just flows freely.
Just recently I had a post where I was significantly downvoted for saying AI shouldn't be controlling nuclear power plants, for safety reasons. How's that for warped? People were saying the AI could manage the power grid and just turn the power plant back on right after it was shut down, and I pointed out that there's a 3-day wait, that skipping that wait is what happened at Chernobyl, and that you can't do that in a controllable way (xenon is a problem). -9 on that post, and flagged; now it's gone, as far as I can see.
Seriously messed up. There's a point where the platform gets a bad reputation, and crazy/delusional polarizing elements take over by driving the grounded and measured people away.
That is a much more interesting topic.