Ask HN: When will managers be replaced by AI?
53 GianFabien 68 5/20/2025, 2:28:29 AM
There is no shortage of articles about AI replacing entry level jobs.
With reductions in workforce numbers, when will they start replacing managers with AI? What is the point of "leadership" when the workers are AI-bots?
Based on my experiences, I doubt that many of the managers are going to be competent prompt engineers.
Nobody would have believed it 10 years ago, but today AI is more likely to replace a concept artist than an accountant, so it's not beyond imagination to replace a manager even if the ICs are still human.
AI excels at summarization, which is a big part of the job for a lot of managers. They gather information, go to meetings, write reports, and generally re-share information appropriate for whatever audience.
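A minimal sketch of that summarization piece, assuming the openai Python client; the model name, prompt, and notes below are placeholders for illustration, not a recommendation:

    # Hypothetical: turn raw meeting notes into an audience-specific summary.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    def summarize(notes: str, audience: str) -> str:
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # placeholder model name
            messages=[
                {"role": "system",
                 "content": f"Summarize these meeting notes for {audience}. "
                            "Keep only decisions, risks, and action items."},
                {"role": "user", "content": notes},
            ],
        )
        return response.choices[0].message.content

    raw_notes = "Standup: launch slipped a week; hiring req approved; prod incident on Tuesday."
    print(summarize(raw_notes, "the VP of Engineering"))

Same information, re-shared for whatever audience.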
At a lot of companies, the lowest level managers don't make a lot of decisions either. Tech leads make technical decisions, PMs make product decisions, and the skip-levels (e.g. Directors, VPs) make staffing decisions.
In practice, I don't think humans will report to AIs, but hierarchies might flatten (e.g. ICs report to Directors) and responsibilities might get shuffled around (e.g. some duties get assigned to HR).
If the workers are AI-bots, then I don't really see any skill overlap with management. If you manage only AIs, you are an IC, not management.
Of course, I take such reports with a grain of salt, because I often wonder whether such news items are self-serving product promotions in disguise.
Do you really think so? I understand the basic sentiment of your statement but having tried to use AI for concept art, I was very disappointed at its lack of originality. Especially in an inevitably oversaturated market of AI creative work, I see the value of good human conceptual artists only rising.
Look at Musk. He's CEO of six companies (or so), yet has time to run DOGE and constantly post on X.
I think CEO networking is code for cartels & collusion.
I think it is the layers of muddle management that could easily be replaced by AI.
It makes sense to me that AI could conceivably already be as good at making the hard, data-based decisions that CEOs make, and that, therefore, they could one day be replaced by AI. Meanwhile, you've got the soft skills part of being an executive, which humans are better at (as long as the people they deal with are also humans). So, you could split that CEO role into two parts, each specializing in half of what a CEO today does. Both roles would probably do a better job than the median CEO today, and get paid less overall.
But that "not anytime soon" part is the only thing I disagree with. Because I just don't know how long the timeline is for stuff like that. It can change pretty fast.
Plus, C-level executives typically don't lower their own pay, and IMO investors don't care that much about it, so I can't see a reason why their pay would be reduced (significantly).
You just described CEOs as broken clocks: they are mostly wrong, but sometimes they are right by chance.
How do you conclude that AIs can't do that? If it's about eloquently phrasing a random idea, AIs are perfect.
Same goes for managers in most cases. Firing people because an AI said to simply won't hold up in court, at least for now.
When was the last time a CEO went to jail because of illegal activities committed by the company? There is no responsibility.
If the "responsibility" is "you become rich, and if you get fired we give you a huge bonus on top", then I'm pretty sure anyone would be happy to take it.
Being a CEO is like being a politician. You need to convince others that they need you even if they don't, or you're incompetent, or you serve other interests. It's not what it takes to "lead a company", it's what it takes to "get the highly-paid job".
So why didn't Warren Buffett replace himself as CEO with an AI, instead of choosing a human?
A proper assessment of their skill relative to the conditions they actually lived through would be nice. One cannot simply walk into an office and rub elbows anymore. And women, along with minorities, now make up a much larger share of the workforce.
New Deal bootstrapping, then Reaganomics putting thumbs on the scales, helped those generations too.
His biggest asset was J&J, back when the government was spending heavily on health and grooming propaganda because Americans used to be a bunch of greasy slobs. Oh look, comb, toothbrush, and mouthwash sales are staples: buy, buy, buy, then inflate through media propaganda and tax policy.
He was not a wizard.
Nobody really wants to decrease the number of humans in their fiefdom, right?
However, if AI actually works out and produces tools that make people, like, 5x more effective, then a new software company can replace an existing one at 1/5 the cost with 1/5 the engineers. Fewer people to manage, a shallower corporate tree, and maybe some of those middle layers will also use AI…
But nobody wants to decrease the size of their fiefdom, so that company will need to be built from the ground up and then wipe out the competition.
So the type of management will be a big factor
But before AI replaces "managers," companies will (or should) rethink how their systems and workflows operate, then realign roles to match.
Instead of starting with a question of replacing roles (and some certainly will), it'll start with redefining how work gets done, and updating job descriptions accordingly.
What won't change is that employers will hire for value. So while some companies would rather replace managers with AI outright, I imagine many would prefer the outsized value an AI-literate manager might bring.
You need a lot fewer managers if your team is 5-20% what it needed to be a few years ago.
The hype around AI is simply the grifters opportunistically inserting themselves and clueless investors wanting to stop potential bleeding.
I think the real question is how do we best harness the increased productivity? Logically speaking, if each person is 5x as productive because of AI there should be an equally greater capacity to get things done. Businesses aren’t just running out of work to do, right?
This is a weird question. If the team below a manager is replaced by AI, then quite obviously there is nothing else to manage. The real question is: can AI replace the teams?
Then of course, if there is a reduction in workforce, there may be a reduction of the number of teams and hence of the number of managers for those teams.
> competent prompt engineers
Writing prompts is not engineering.
Performance Management
- Biased or inconsistent performance reviews.
- Goal-setting lacks clarity or alignment with org OKRs.
- Lack of real-time performance insights.

Project Planning and Execution
- Estimations are often inaccurate.
- Project scope creep due to unclear requirements.
- Dependencies across teams delay execution.

Technical Debt and Code Quality
- Mounting technical debt slows velocity.
- Inconsistent coding standards.
- Hard to trace ownership for legacy code.

Team Collaboration and Communication
- Cross-team communication breakdowns.
- Time zones complicate decision-making.
- Meeting overload or lack of clarity post-meeting.

Onboarding and Knowledge Transfer
- New hires take too long to ramp up.
- Tribal knowledge isn't well documented.
- Onboarding processes are inconsistent.

Incident Management and Reliability
- Blameless postmortems are rarely actionable.
- Alert fatigue from noisy signals.
- Root cause analysis (RCA) is time-consuming.

Career Growth and Mentorship
- Lack of clarity in career ladders.
- Mentorship is ad-hoc and inconsistent.
- Managers don't have enough time for coaching.

Engineering Productivity Metrics
- Metrics often feel punitive or misused.
- Hard to attribute impact to engineers' work.
- Lack of actionable insights from engineering data.

Cross-Functional Alignment
- Product and engineering priorities are misaligned.
- Specs often change mid-cycle.
- Lack of visibility into roadmap tradeoffs.
Each of these categories will probably need an AI agent in itself and probably an AI agent to control all the other AI agents. It would be a complex system, but will still need some monitoring until it is self-sustaining.
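To make the "agent of agents" idea concrete, here is a minimal hypothetical sketch: one orchestrator routing requests to per-category agents. The category names come from the list above; the agent functions are stubs standing in for real LLM-backed agents.

    # Hypothetical orchestrator over per-category agents (stubs, not a real system).
    from typing import Callable, Dict

    def performance_agent(request: str) -> str:
        return f"[performance-management agent] handling: {request}"

    def incident_agent(request: str) -> str:
        return f"[incident-management agent] handling: {request}"

    AGENTS: Dict[str, Callable[[str], str]] = {
        "performance management": performance_agent,
        "incident management and reliability": incident_agent,
        # ... one agent per category listed above
    }

    def orchestrator(category: str, request: str) -> str:
        # The agent that "controls all the other AI agents": route, log, escalate.
        agent = AGENTS.get(category.lower())
        if agent is None:
            return f"No agent for '{category}'; escalate to a human."
        result = agent(request)
        print(f"[monitor] {category}: {result}")  # the human monitoring still needed for now
        return result

    orchestrator("Performance Management", "draft Q3 review talking points")

The routing table is the easy part; the monitoring loop around it is where the "until it is self-sustaining" caveat lives.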
Industrial-age management practices disregard the intelligence levels of professional ICs.
Middle managers - when the line managers are gone
Senior leadership - when the middle management is gone (IF they are willing to give up their seats)
Just now I'm recirculating, for the 2nd time, a quote for a laptop that admin sent back to me because a tiny detail was wrong, and by the time I did the 2nd submission the quote had expired. And this is about 1% of the bureaucracy needed to get a new employee started in my org.
The real concern should be that telling entry level workers they need to be prompt engineering experts on top of everything else is stupid. We're only making it harder to hire the right people.
We should be focusing on whether someone can get the job done regardless of what strategies they prefer to research a solution.
A writer has responsibility for their writing. How can an AI be responsible?
AI doesn't need to be responsible. It just needs to provide value, just like writers, developers, managers, etc.
As a more general role, the idea of responsibility is that the manager has the job of making sure that individual employees' tasks are suited both to their individual competence and abilities and to the corporation's deliverables and ultimate bottom line. This requires making arguments in both directions: in pulling employees to working on things more useful to the company, and in changing the deliverables to capitalize on employees' abilities.
With multiple agents (of any nature) feedback is essential for the work to be done well - that’s my understanding of how it translates to the actual work getting done.
Also, there's plenty of research showing how the most unqualified and unfit people make it into leadership positions. If you're a leader, most likely you're not a good one. So it's not as if the industry knows a good leader when it sees one. So if AI is a better manager, the industry doesn't care. It's politics and ass-kissing that get them up there.
However, I think an AI would maybe be more honest about how well I do and my super high skill level, perhaps, and not simply reject people for being over 50, White, Male, Overqualified, and good looking, as is [or dare I say was] the custom with DEI hiring practices for the last decade... or two.
The exact same thing applies if you are a manager, do you really want a flock of loyal electric sheep to do your bidding? If you’re in management for control over others, how is that satisfying? If you’re in it to mentor, who comes after you? How?
Why does anyone want this? Our societies are already so mechanized and automated yet somehow we have less time than the average medieval peasant to enjoy our allegedly easier lives. What toil has been eliminated thus far?
Then again the office politics might even get WORSE when people try to trick an AI Boss into blaming someone else for a problem they created themselves. Then again the AI will have a superhuman knowledge of who checked in bad code that broke the product, etc. Lots and lots of trade-offs.
as always: imho. ...
idk. ... what do you mean by "managers" in your question!?
in my view: the "real" task of managers - regardless of level, but even more so at the lower/mid levels - is managing people & their expectations - either those of their "team" or of their superiors.
and looking at the current state of "AI", i don't see much gain in using it to handle that part of "management".
but i think (current) "AI" would be a good source of "additional" decision-making/reasoning over the, let's call it, "technical" parts of mgmt ...
sure, this will change in the future, but currently i don't see much on the horizon regarding the "people" part of mgmt.
but in the medium/long term, i could imagine a development somewhat similar to the following:
looking at the progression of neo-liberal capitalism: using / blaming AI for unfavorable (mgmt) decisions may be a convenient way to "hide" behind said AI in order to enforce such "unpopular" decisions.
they would have been made anyway, but with this pattern "nobody" is responsible for such developments, because "AI said so" etc...
just my 0.02€
You will see this magnified by this year's Google I/O announcements.
Anytime they mention "AGI", the real goal is AI replacing humans in the jobs that are economically useful. (Not the "benefit of humanity" bullshit.)
In 10-15 years' time, the question from those who got out of tech would be:
Did you know that humans used to program computers?