Republican governors oppose 10-year moratorium on state AI laws in GOP tax bill

54 points · MilnerRoute · 26 comments · 6/28/2025, 4:55:31 PM · politico.com ↗

Comments (26)

chisleu · 2h ago
The way I feel about this: 1. It's going to pass. 2. It's not going to get overturned by this Supreme Court. 3. It's going to have a bigger impact on the world than any other law.

Right now, the US and China are in an AI war. The US is doing everything it can to stop China from making progress on AI, as if it were a nuclear bomb. And it just might be that consequential in 10 years.

Where I am now is past the "three sleepless nights" of 'Co-Intelligence' fame.

If you haven't seen a properly contexted LLM (50k-100k tokens, depending on the size of the project(s)) work in a code repo, then you have no idea why so many of us are terrified. LLMs are already taking jobs. My company laid off 7% of the workforce directly because of LLMs' impact. I say that not because the CEO said it, but because I see it in my day-to-day. I'm a Principal Engineer and I just don't have need of juniors anymore. They used to be super useful because they were teachable: after some training you could offload more and more work to them and free up your time to work on harder problems.

With MCPs, LLMs aren't limited to the editor window anymore. My models update my JIRA tickets for me and rip content from the wiki into a markdown memory bank that is kept in the repo and accelerates everyone's work. They connect to databases to find out schemas and example column data. Shit, as I'm typing this, one of them is deploying a new version of a container to ECR/ECS/Fargate with Terraform for a little project I'm working on.
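
If that sounds exotic: the server side of one of these tools is tiny. Here's a minimal sketch using the official MCP Python SDK -- the tool name, URL scheme, and memory-bank layout are my own invention, and real wiki auth / HTML-to-markdown conversion are omitted:

```python
# Minimal MCP server exposing one tool: pull a wiki page into the repo's
# markdown memory bank. Names and paths are illustrative, not a standard.
from pathlib import Path

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("memory-bank")

@mcp.tool()
def sync_wiki_page(page_url: str, slug: str) -> str:
    """Fetch a wiki page and save it under memory-bank/<slug>.md."""
    resp = httpx.get(page_url)  # a real wiki will need auth headers
    resp.raise_for_status()
    dest = Path("memory-bank") / f"{slug}.md"
    dest.parent.mkdir(exist_ok=True)
    dest.write_text(resp.text)  # and likely HTML-to-markdown conversion
    return f"Saved {page_url} to {dest}"

if __name__ == "__main__":
    mcp.run()  # stdio transport by default; the editor's agent connects here
```

Point the agent at that server and the memory bank stays current without anyone copy-pasting.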

I believe we are in the very early days of this technology. I believe that over the next ten years we are going to be inundated with new potential for LLMs. It's impossible to foresee all the changes this will bring to society. The use cases are going to explode as each tiny new feature or new mode evolves.

My advice is to get off the sidelines and level up your skills to include LLM integrations. Understand how they work, how to use them effectively, how to program system integrations for them... agents especially! Agents can be highly effective at many use cases. For instance: an agent that watches a JIRA board for new tickets containing prompts to be executed in certain repos, then executes the prompt and creates a PR for the changes, all in a context that is fully aware of your environment, deployment, CI/CD, secrets management, etc.
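
To make that concrete, the skeleton of such a watcher is maybe thirty lines. A rough sketch, not production code -- the JQL, the custom "target repo" field, and the run_prompt/open_pr helpers are placeholders for whatever agent harness and git hosting you use:

```python
# Rough sketch of a JIRA-watching agent loop. Poll for labeled tickets,
# run the ticket body as a prompt in the target repo, open a PR.
import time

import requests

JIRA_SEARCH = "https://yourcompany.atlassian.net/rest/api/2/search"
JQL = 'project = ENG AND labels = "agent-task" AND status = "To Do"'

def run_prompt(repo: str, prompt: str) -> None:
    """Hand the prompt to your coding agent inside a checkout of `repo`."""
    raise NotImplementedError  # e.g. shell out to an agent CLI

def open_pr(repo: str, branch: str, title: str) -> str:
    """Push the agent's branch and open a PR; return the PR URL."""
    raise NotImplementedError

def poll_once(auth) -> None:
    resp = requests.get(JIRA_SEARCH, params={"jql": JQL}, auth=auth)
    resp.raise_for_status()
    for issue in resp.json().get("issues", []):
        fields = issue["fields"]
        repo = fields["customfield_12345"]  # hypothetical "target repo" field
        run_prompt(repo, fields["description"])  # the ticket body is the prompt
        pr_url = open_pr(repo, f"agent/{issue['key'].lower()}", fields["summary"])
        print(f"{issue['key']}: opened {pr_url}")
        # also transition the ticket here so it isn't picked up again

if __name__ == "__main__":
    auth = ("bot@yourcompany.com", "API_TOKEN")  # use real secrets management
    while True:
        poll_once(auth)
        time.sleep(60)
```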

All of this will be possible sooner than we expect. It's going to impact the poorest people the most. A really cyberpunk reality could be upon us faster than we expect, including starving masses struggling to get enough to even survive.

norir · 1h ago
I would love to see a reverse Atlas Shrugged, where all the programmers just stop working and we see how much the executive class could do without them through the magic of AI. As it stands, I feel most workers are increasingly facilitating their own dispossession.
greybox · 2h ago
You're a principal engineer who doesn't see the value in training juniors ...
chisleu · 1h ago
I did not say that I don't see the value in training juniors. I said that I don't have a need for them anymore. I can teach Claude in one API call what takes a day to walk a junior through.
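
To be concrete about what that one API call looks like: the day of walking-through gets compressed into a system prompt that rides along with every request. A sketch with the Anthropic Python SDK (the conventions file, model choice, and task are illustrative):

```python
# "Teaching" as a system prompt: team conventions ride along with every
# request instead of being explained to a new hire over a day.
import anthropic

conventions = open("CONVENTIONS.md").read()  # style guide, review checklist, etc.

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment
message = client.messages.create(
    model="claude-sonnet-4-20250514",  # any current Claude model
    max_tokens=2048,
    system=f"You are a developer on our team. Follow these conventions:\n{conventions}",
    messages=[{"role": "user", "content": "Add retry logic to fetch_orders() in orders.py"}],
)
print(message.content[0].text)
```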

Furthermore, I think we are going to find less and less work for Juniors to do because Seniors are blasting through code at a faster and faster pace now.

I'm not the only one saying that the entry-level market is already getting trashed...

robomartin · 32m ago
I don't think that's what OP is saying at all.

There's a reality to contend with here. We all know that software developers have been coming out of school with decidedly substandard skills (and I am being very kind). In that context, the value they add to an organization has almost always been negative. Meaning that, without substantial training and coaching --which cost time, money and market opportunity-- they can be detrimental to a business.

Before LLMs, you had no options available. With the advent of capable AI coding tools, the contrast between hiring a person who needs hand-holding and significant training and just using AI is already significant, and it will be nothing less than massive with the passage of time.

Simply put, software development teams who do not embrace a workflow that integrates AI will not be able to compete with those who do. This is a business forcing function. It has nothing to do with not being able to or not wanting to train newcomers (or not seeing value in their training).

People wanting to enter the software development field in the future (which is here now) will likely have to demonstrate a solid software development baseline and equally solid AI co-working capabilities. In other words, everyone will need to be a 5x or 10x developer. AI alone cannot make you that today; you have to know what you are doing.

I mean, I have seen fresh university CS graduates who cannot design a class hierarchy if their lives depended on it. One candidate told me that the only data structure he learned in school was linked lists (I don't know how that's possible). Pointers? In a world dominated by Python and the like, newbies have no clue what's going on in the machine. Etc.

My conclusion is that schools are finally going to be forced to do a better job. It is amazing to see just how many CS programs are horrible. Sure, the modules/classes they take have the correct titles; what and how they teach is a different matter.

Here's an example:

I'll omit the school name because I just don't want to be the source of (well-deserved, I might add) hatred. When I interviewed someone who graduated from this school, I came to learn that a massive portion of their curriculum is taught using Javascript and the P5js library. This guy had ZERO Linux skills --he never saw it in school. His OOP class devoted the entire semester to learning the JUCE library... and nobody walked out of that class knowing how to design object hierarchies, inheritance, polymorphism, etc.

Again, in the context of what education is producing as computer scientists, yes, without a doubt, AI will replace them in a microsecond. No doubt about it at all.

Going back to the business argument. There is a parallel:

Companies A, B and C manufacture products in, say, Europe. A long time ago, Company A decides they are brilliant and moves production to China. They can lower their list price, make more money and grab market share from their competitors.

Company B, a year later, having lost 25% of their market share to company A due to pricing pressure, decides to move production to China. To gain market share, they undercut Company A. They have no choice on the matter; they are not competitive.

A year later A and B, having engaged in a price war for market share, are now selling their products at half the original list price (before A went to China). They are also making far less money per unit sold.

Company C now has a decision to make. They lost a significant portion of market share to A and B. Either they exit the market and close the company or follow suit and move production to China.

At this point the only company one could suggest acted based on greed was A during the initial outsourcing push. All decisions after that moment in time were about market survival in an environment caused by the original move.

Company C decides to move production to China. And, of course, wanting to regain market share, they drop their prices. Now A, B and C are in a price war until some form of equilibrium is reached. The market price for the products they sell is now one quarter what it was before A moved to China. They are making money, but it is a lot tighter than it used to be. All three organizations went through serious reorganizations and reductions in the labor force.

The AI transition will follow exactly this mechanism. Some companies will be first movers and reap short-term benefits of using AI to various extents. Others will be forced into adoption just to remain competitive. At the limit, companies will integrate AI into every segment of the organization. It will be a do or die scenario.

Universities will have to graduate candidates who will be able to add value in this reality.

Job seekers will have to be excellent candidates in this context, not the status quo ante context.

hyperliner · 2h ago
This sounds like “You are a manager who doesn’t see the value in training typists” or “You are a refrigerator seller who doesn’t see the value in training icemen.”
lazyeye · 2h ago
It's more than this.

You may think your job's not at risk because you're a plumber. But you're not realising that you will be competing with millions of new plumbers fleeing AI-decimated industries, pushing wages down dramatically.

And what if China wins on AI, and now Huawei can produce tech gear that is dramatically superior and cheaper compared to global competitors? Then Chinese tech dominates the globe, giving enormous power and control to the CCP.

chisleu · 1h ago
Absolutely right.
leptons · 3h ago
So "states rights" doesn't really matter to these people like they've been saying it does.
siliconc0w · 3h ago
There are about a zillion examples of them citing a principle like 'states' rights' and then immediately abandoning it when it suits them, for things like gun control, abortion access, seizing control of a state's national guard, gender-affirming care, etc.

The problem is that they are directionally correct in that it would be bad to have a patchwork of laws around AI, but the alternative is leaving it to Congress, which has consistently shown an inability to thoughtfully regulate or reform anything - just pass mega spending bills and increase the debt limit.

mikem170 · 2h ago
> it would be bad to have a patchwork of laws around AI

Why would that be bad? And for whom?

Wouldn't it be better to have a variety of laws around something new, and figure out over time what is optimal? Wouldn't this be better than having one set of laws that can be more easily compromised via regulatory capture? Why the common assumption that bigger and more uniform is better? Is that to encourage bigger companies and bigger profits? Has that been a good thing?

siliconc0w · 2h ago
Because if you want to sell an AI product, you now need to hire an army of lawyers to do state-by-state compliance. This dramatically increases costs and slows down critical innovation. Another common argument is that any regulation will allow China to 'win the AI race', but I don't entirely agree with that premise - it's not a 'race', and if China 'wins', it'll be because they largely use their debt to finance effective, high-ROI industrial policy rather than mega tax breaks.
peterhadlaw · 2h ago
"care"
baby_souffle · 3h ago
> So "states rights" doesn't really matter to these people like they've been saying it does.

Nor does the deficit (and at least a dozen other big issues)

The term "performative bad faith" comes to mind...

frogperson · 2h ago
Just call it what it is. Fascism and authoritarianism.
FranzFerdiNaN · 2h ago
Yep. Conservatism only cares about one thing: protecting its own in-group while hurting the rest.
thrance · 3h ago
They have no values. The only thing one can find them consistently advocating for is their own selfish interests.
shortrounddev2 · 3h ago
Republicans do not have principles, only an unceasing desire for power. Any time they quote some principle at you, they are lying. They are trying to manipulate your sense of fairness to cynically get what they want. They will stab you in the back at the first opportunity. Republicans cannot be trusted under any circumstances.
techpineapple · 4h ago
It’s wild, the dichotomy between libertarians and trad conservatives in the Republican Party. You’ve got both the people who want to automate all the jobs away, and Tucker Carlson saying that if FSD (full self-driving) eliminates 2 million trucking jobs then we shouldn’t do it.
ETH_start · 3h ago
Libertarians don't believe that automation leads to fewer jobs being available. They look at the past 200 years of automation and see that as more tasks are automated, labor productivity simply increases.
techpineapple · 2h ago
I think this has changed. Historically, yes, you're right, but I think modern thinkers either believe productivity will accelerate so we can have UBI (Sam Altman) or, in some cases, have a very utilitarian perspective that if we need fewer people, we need fewer people (Peter Thiel).
apwell23 · 2h ago
They simply don't care about the jobs numbers. Chips fall where they may.
GuinansEyebrows · 3h ago
'Dark Money' [0] describes how this ideological situation came to be. Pretty interesting stuff.

[0] https://en.wikipedia.org/wiki/Dark_Money_(book)

hereme888 · 2h ago
_
russdill · 2h ago
? Republicans are also the ones trying to prevent states from regulating AI
Finnucane · 2h ago
All those guys lined up behind [expletive deleted] on inauguration day? What do you think they expected to get for the money they were paying out?