AI Is Wrecking Young Americans' Job Prospects
64 points by lucaspauker | 8/26/2025, 1:04:41 PM | 95 comments | wsj.com ↗
In 2024, 21% of all bachelor's degrees awarded were Computer Science from University of Maryland College Park.
It was 3% in 2011.
I don't agree with the article that AI is wrecking job prospects. I see it as companies just now trending towards running leaner, vs hiring every good engineer available during ZIRP.
Nonetheless, it's gotta be tough out there for new grads.
https://www.usmd.edu/IRIS/DataJournal/Degrees/?report=Degree...
This sounds more like overproduction of entry-level computer scientists than anything AI or hiring managers are up to.
I was told that a student can now get a CS degree without courses in OS, Compilers, Programming Languages, theory of computing etc. The argument being that a vast majority of jobs do not ever use the above. That may have caused a flood of grads with a shaky knowledge of the basics. The idea that software engineering is not really a science but more of a trade for which anyone could be trained without a formal degree has some shades of truth.
But in my experience, technology changes so fast that someone with a better grasp of the basics can evolve with the tech, since they understand the fundamentals better. LLMs really separate those who can critique and correct their output from those who blindly follow it, and the former will continue to have jobs.
Yikes. At that point, it's really not much of a "CS" degree. It's a trade program that teaches you how to use particular programming languages and frameworks.
Someone with that background is in a brittle position. They won't be able to pivot as easily to different technologies when things inevitably change. And they'll be ill-equipped to handle interesting open-ended projects where it's up to them to decide what approaches to use, how to bound problems, how to reason through trade-offs, and what lessons to take from prior work.
Source for CS graduates?
https://www.gradeinflation.com/
I'm saying that a source from more than a decade ago describing a general trend doesn't explain why recent CS graduates are facing a worse job market than folks five years ago did.
Grade inflation, per your source, has been happening for decades across the board. That does not tell us why "since the widespread adoption of generative AI, early-career workers (ages 22-25) in the most AI-exposed occupations have experienced a 13 percent relative decline in employment even after controlling for firm-level shocks" [1].
[1] https://digitaleconomy.stanford.edu/wp-content/uploads/2025/...
A rational actor is going to be more likely to pursue something they think they can actually pass.
If grade inflation is happening to all degrees, that doesn't explain increased enrollment in CS. (I'm genuinely curious if part of the explanation is a reduction in CS education quality.)
Part of my explanation was a possible increase in the quality of CS education for industry: as I mentioned, it is possible programs were geared more towards industry than pure CS, encouraging more students to go in.
>If grade inflation is happening to all degrees, that doesn't explain increased enrollment in CS.
It does, because CS is (was?) a notoriously more difficult program. Since it is one of the highest-paying degrees, making it easier to pass would naturally shift money-oriented people into CS.
The current phrasing makes it sound like they’re a diploma mill producing 21% of all bachelor degrees in the country.
I really believe it's just for the headlines.
When other tech companies realize GenAI will never produce what they want, there will be a rush to re-hire developers.
Top talent all started as junior talent. Grab that pool so nobody else will have it.
I wrote about this a bit. I wish we could hire more. I am kind of shocked how few companies do it. There are a LOT of smart kids who would love a summer programming job.
https://simonsarris.com/p/growing-software-developers
If this is really a concern, require a long-term employment contract from incoming candidates.
What's your argument supporting this? Ten years ago GenAI couldn't produce two coherent sentences. We've come a long way, what makes you think it won't go further?
First, they'd have to identify them, which the interview process at most companies is terrible at.
My observation is that, between 2019 and 2023, there were many creators shilling this, and probably quite good livings made off views and clicks. Could social media have amplified this, “fakely”?
https://digitaleconomy.stanford.edu/wp-content/uploads/2025/...
Better instead to use our collective brain power for something more productive. Such as digging into the various possible causal factors and understanding if the paper properly addresses and disentangles them.
But it makes it much nicer to say it's AI that's stealing jobs, to create even more hype.
For example, if you dig into who we technically count as unemployed in that number, you'll laugh.
Let's say after 6 months of emails and ghost listings you take a break; you're now considered "not in the labor force", which is the same category as retirees and full-time students. So that "improves" the unemployment rate.
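The categorization effect described above is easy to make concrete. A minimal sketch with made-up headcounts (not real BLS data): discouraged job seekers who stop searching drop out of both the "unemployed" count and the labor-force denominator, so the headline rate falls even though nobody found a job.

```python
def unemployment_rate(employed, unemployed):
    """U-3 style rate: unemployed / labor force.
    People counted as "not in the labor force" are excluded entirely."""
    labor_force = employed + unemployed
    return round(100 * unemployed / labor_force, 2)

# Hypothetical economy: 160M employed, 8M actively searching.
print(unemployment_rate(160_000_000, 8_000_000))  # 4.76

# 2M give up after months of ghost listings: they leave the
# "unemployed" count AND the labor force, so the rate "improves".
print(unemployment_rate(160_000_000, 6_000_000))  # 3.61
```

Same number of people without jobs, lower headline rate.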
Not a hot take, but I think we’ve been in a recession/massive slowdown for much longer than the gov data shows
Willing to bet hedge funds have their own calculations of these metrics they keep secret as a market edge
Anyone referring to unemployment data in the singular has not dug into the numbers.
So yeah, I'd say most of this AI stuff is bullshit; if it were really this good, Sam Altman wouldn't be talking about building social networks.
Is it possible to stay better than AI? Maybe for some people. Not for the average person. The results of that are one of the largest contributors to the gloomy future (among other things).
This is false.
Real disposable personal income is higher today than any time before March 2020 [1]. Covid stimulus first dramatically raised (March '20 to '21) and then lowered (March '21 to June '22) that figure. But we hit a local maximum in April '25, after which real DPI started falling, though nevertheless only to the level we saw in spring '21 and early '25, and no point before.
(Real median household figures are more laggy. But they show the same trend [2]. On a national level, these figures are up.)
[1] https://fred.stlouisfed.org/series/DSPIC96
[2] https://fred.stlouisfed.org/series/MEHOINUSA672N
https://i.imgur.com/Dbf8yyU.png
The US had recovered to full pre-recession employment levels by 2017[1].
Unemployment is around 4% right now.
I can’t speak to discretionary income or why the market is high, and maybe there is some sort of structural “underemployment” going on, but people are working.
[1] https://www.cbpp.org/research/chart-book-the-legacy-of-the-g...
I know we're talking broadly across all industries but I can only speak to what I know and am able to observe directly.
My opinion of the average software developer with a few years' experience is not very high. Yet now that we have non-coders shipping features written with LLMs, and we're starting to observe the fallout from that, I'm getting closer to saying that an entry-level coder is far better than an LLM (depending on how we evaluate "better").
There are also a lot of hidden costs associated with LLMs. For example, I'm spending a lot more time reviewing PRs than I used to. And we're taking a lot more time doing rework than we were before.
We can't yet say that LLMs have caused an increase in regressions, since we've been racing towards a major new version release, and so people are rushing in general and that skews the numbers. Over time, however, we'll have data on rate of bugs introduced before the widespread company adoption of LLMs vs after, controlled for crunch times as well.
If the average software developer spends only about 20% of their time actually writing code, then even if an LLM offers an optimistic 50% productivity increase, we're only optimizing 10% of the job in the best-case scenario.
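The arithmetic above can be made precise with Amdahl's-law-style reasoning: only the coding fraction of the job speeds up, so the ~10% figure (50% of 20%) is actually a loose upper bound on the overall gain.

```python
def overall_speedup(coding_fraction, coding_speedup):
    """Amdahl's law: total speedup when only coding_fraction of the
    work gets coding_speedup times faster."""
    new_time = (1 - coding_fraction) + coding_fraction / coding_speedup
    return 1 / new_time

# 20% of time spent coding, LLM makes coding 1.5x faster (+50%).
s = overall_speedup(0.20, 1.5)
print(f"{(s - 1) * 100:.1f}% overall")  # 7.1% overall
```

So even granting the optimistic per-task numbers, the whole-job gain comes out around 7%, below the naive 10% ceiling.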
I think there is a lot of marketing-hype-driven ideology around "AI" right now that is leading a lot of people to buy into some of the overstated claims. This ideology may have companies genuinely slowing down their hiring of entry-levels at the moment, since some people are saying that an LLM is like having an incompetent intern. The business thinks "If you need to babysit a junior and you need to babysit an LLM, then why pay for the junior?" And we still need better data to determine if, on average, what a company pays for a junior is truly more expensive than delegating the work to an LLM + taking on the maintenance and review overhead. We don't have the answers yet. My personal bias has me thinking that on average a junior will provide higher returns although not necessarily immediately. The benefit of a junior is that they learn from mistakes and can adapt more readily to specific business requirements.
This is not to say that LLMs aren't valuable. I think the trade-off for entry-levels is that I would have killed to have something like Cursor when I was a pre-teen teaching myself to code in the 90s. When you want to build something complicated and don't even know where to start, an LLM can get you some scaffolding and show you a basic strategy that you can build on. Then you go fix bugs and poke around and break stuff. It's a great learning tool. So I expect that, over time, the talent of entry-levels will probably increase. In the short term, we need to get through this AI bubble and stabilize. Companies will learn where LLMs save costs and where they can still benefit from less-experienced coders. It will just take a bit of time.
Previously discussed: https://news.ycombinator.com/item?id=44226145
So far in the Industrial Revolution, automating away jobs has been how we've been getting richer and richer for centuries.
If AI automates away half of all jobs, and this holds, we will - after an adjustment period - double GDP and collectively be twice as wealthy!
If that actually happens, it solves many currently "unsolvable" societal problems.
I'm pretty sure that it does, but the adjustment period might be longer than we'd wish.
I suspect for the already wealthy this will happen, but I think the average person will largely get handed an empty basket of promises and not much else
This is a big if!
I think AI is going to end up more like the late 20th century automation push. It's going to hollow out whole communities.
Maybe there is some hope if they can't fully automate the job with AI.
Currently all I see is a mix of very highly paid do-it-all types and rather lowly paid outsourced talent, with no sensible middle, and of course no way to realistically learn on the job: the bar to get in is very high.
Fundamentals are the problem, if there's no new avenues for economic growth, then there is no way to pay down the debt.
Internet gave me a more wicked explanation of this phrase, thus the internet is superior to AI.
There is a scam: you need to spend $100,000 and waste 4 years of your life before even applying for a job! But this "qualified" worker can be replaced with a $100 mini PC!!!
edit: the downvotes are right, I was wrong! Buy a $500 mini PC, it is 10x faster!
Fresh college grads are competing with foreign visa holders who have years of experience.
> We use two different approaches for measuring occupational exposure to AI. The first uses exposure measures from Eloundou et al. (2024). Eloundou et al. (2024) estimate AI exposure by ONET task using ChatGPT validated with human labeling. They then construct occupational exposure measures by aggregating the task data to the 2018 SOC code level. We focus on the GPT-4 based β exposure measures from their paper.
> The second primary approach we take uses data on generative AI usage from the Anthropic Economic Index (Handa et al., 2025). This index reports the estimated share of queries pertaining to each ONET task based on a sample of several million conversations with Claude, Anthropic’s generative AI model. It then aggregates the data to the occupational level based on these task shares. One feature of the Anthropic Economic Index is that for each task it also reports estimates of the share of queries pertaining to that task that are “automative,” “augmentative,” or none of the above. We use this information as an estimate of whether usage of AI for an occupation is primarily complementary or substitutable with labor.
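Both quoted approaches end with the same aggregation step: roll task-level exposure scores up to an occupation-level score, weighted by each task's share of the occupation's work. A rough sketch of that step, using hypothetical tasks and scores (none of these numbers are from the cited papers):

```python
# Hypothetical ONET-style records for one occupation:
# (task, share of the occupation's work, AI exposure score in [0, 1]).
tasks = [
    ("write code",        0.40, 0.9),
    ("review designs",    0.30, 0.6),
    ("meet stakeholders", 0.30, 0.2),
]

def occupation_exposure(tasks):
    """Share-weighted mean of task-level exposure scores,
    mirroring the task -> SOC-code aggregation described above."""
    total_share = sum(share for _, share, _ in tasks)
    return sum(share * score for _, share, score in tasks) / total_share

print(round(occupation_exposure(tasks), 2))  # 0.6
```

The papers' measures differ in where the task scores come from (GPT-4 ratings vs. observed Claude query shares), but the rollup to occupations is this kind of weighted average.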
Are there more H1Bs per recent college graduate today than previously?
https://www.hindustantimes.com/world-news/us-news/walmart-h-...
"In January 2025, 53.3 million immigrants lived in the United States – the largest number ever recorded. In the ensuing months, however, more immigrants left the country or were deported than arrived. By June, the country’s foreign-born population had shrunk by more than a million people, marking its first decline since the 1960s" [1].
[1] https://www.pewresearch.org/short-reads/2025/08/21/key-findi...
The absolute level remains historically, though not unprecedentedly, high. But that's part of a 50-year trend that I am sceptical explains a <5-year change specific to software development.
You're moving the goalposts. There is no paradox: the job market is down and so is immigration [1].
[1] https://www.pewresearch.org/short-reads/2025/08/21/key-findi...
Sure, maybe. I don't know. I don't think that explains why "since the widespread adoption of generative AI, early-career workers (ages 22-25) in the most AI-exposed occupations have experienced a 13 percent relative decline in employment even after controlling for firm-level shocks. In contrast, employment for workers in less exposed fields and more experienced workers in the same occupations has remained stable or continued to grow" [1].
Unless the ratio of H1Bs in these fields to recent-college graduates has exploded in the last 5 years, immigration is not a sufficient explanation for the effect.
[1] https://digitaleconomy.stanford.edu/wp-content/uploads/2025/...
And that'll be interesting for humanity, as we derive at least some identity from the work we're doing.
Luckily for the regime, the killbots are already here, AI-powered, and under their control.