What if humanity forgot how to make CPUs?

66 points by Tabular-Iceberg · 5/12/2025, 8:38:06 PM · twitter.com ↗

Comments (88)

palmotea · 8h ago
This has a ton of holes:

> Z-Day + 15Yrs

> The “Internet” no longer exists as a single fabric. The privileged fall back to private peering or Sat links.

If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

> Z-Day + 30Yrs

> Long-term storage has shifted completely to optical media. Only vintage compute survives at the consumer level.

You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

> The large node sizes of old hardware make them extremely resistant to electromigration, Motorola 68000s have modeled gate wear beyond 10k years! Gameboys, Macintosh SEs, Commodore 64s resist the no new silicon future the best.

Some quick Googling shows the first IC was created in 1960 and the 68000 was released in 1979. That's 19 years. The first transistor was created in 1947; that's a 32-year span to the 68k. If people have the capacity and the need to jump through hoops keeping old computers running to maintain a semblance of current-day technology, they're definitely f-ing going to be able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroyed all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

lauriewired · 5h ago
> If you can't make CPUs and you can't keep the internet up, where are you going to get the equipment for enough "private peering or Sat links" for the privileged?

Storage. You only need a few hundred working systems to keep a backbone alive. Electromigration doesn't kill transistors if they are off and in a closet.

> You need CPUs to build optical media drives! If you can't build CPUs you're not using optical media in 30 years.

You don't need to make new drives; there are already millions of DVD/Blu-ray devices available. The small microcontrollers on optical drives are built on wide node sizes, which also makes them more resilient to degradation.

> they're definitely f-ing going to have been able to repeat all the R&D to build a 68k CPU in 30 years (and that's assuming you've destroy all the literature and mind-wiped everyone with any knowledge of semiconductor manufacturing).

If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

pointlessone · 18m ago
OK, no silicon. But we might be just fine after all. Just yesterday we had a story about bismuth transistors that are better in every way than silicon ones. Maybe a tad more expensive. There are plenty of other semiconductors out there too. We'll have to adjust manufacturing, but it would probably cost us just one skipped upgrade cycle. Even with a complete mind wipe it's still not that bad if only silicon is out.
kadoban · 3h ago
> If you read the post, the scenario clearly states “no further silicon designs ever get manufactured”. It’s a thought experiment, nothing more.

This kind of just breaks the thought experiment: without the "why?" being even vaguely answered, it makes no sense. How do you game out a thought experiment that starts with the assumption that humanity just randomly stops being humanity in this one particular way? What other weird assumptions are we meant to make?

esseph · 3h ago
If you don't like the rules of the game, you don't have to play it.
3eb7988a1663 · 6h ago
Surely knowing something is possible would speed up the process. Transistors had to go from a neat lab idea through more and more incremental use cases, eventually snowballing into modern chips. If you know from the beginning that computers are a neat idea, surely that would warrant more focused R&D.
CuriousRose · 6h ago
If humans forgot how to make new CPUs, it might finally be the incentive we need to make more efficient software. No more relying on faster chips to bail out lazy coding and make apps run lean. Picture programmers sweating over every byte like it's 1980 again.
burnt-resistor · 5h ago
Probably not. Devices would run out within a generation.

It ain't ever going to happen, because people can write these things called books, and computer organization and architecture books already exist in many tens of thousands of copies. What modern computer organization books should also capture is the applied-science side of the history up to now, like the tricks that made Apple's ARM series so excellent. The other thing is that TSMC needs to document its fab process engineering. Without capturing such niche, essential knowledge, these become strategic single points of failure. Leadership and logic dictate not allowing this kind of vulnerability to fester too deep or too long.

saulpw · 2h ago
The essential tacit knowledge can't be captured in books. It has to be learned by experience, participating in (and/or developing) a functioning organization that's creating the technology.
immibis · 1h ago
Programmers haven't been able to rely on CPUs getting faster for the last decade. Speeds used to double every 1.5 years or so. Now they increase 50% per core and double the number of cores... every 10 years. GPU performance has increased at a faster pace, but ultimately also stagnated, except for the addition of tensor cores.
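To put rough numbers on that (a back-of-envelope sketch; the growth figures are the ones claimed above, not benchmark data):

```python
# Convert "x-fold improvement over N years" into a per-year growth rate.
def annual_rate(total_factor: float, years: float) -> float:
    return total_factor ** (1 / years) - 1

print(f"double every 1.5 years : {annual_rate(2.0, 1.5):.0%}/yr")   # ~59%/yr
print(f"+50% per core in 10 yrs: {annual_rate(1.5, 10):.0%}/yr")    # ~4%/yr
```

The per-core regime is more than an order of magnitude slower, which is the point.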
djmips · 4h ago
That's already happening
geysersam · 55m ago
Ah, the good old days again, what a beautiful vision. Decadence and laziness begone! Good luck running your bloated CI pipelines and test suites on megahertz hardware! /s
mahirsaid · 6h ago
It would be a great tragedy if that ever became reality in the near future. The bigger question is: what if you forgot how to make the machines that make the CPUs? That is the bigger challenge to overcome in this crisis. Only one company specializes in the field that gives big companies like TSMC their ability to manufacture great CPUs. The trick is to recreate the machines that make them and go from there: 10nm down to 2nm capabilities.
PaulKeeble · 9h ago
There is a bit of an issue that almost all the know-how exists within a couple of private companies. If the industry downturned, say from a crash of an AI bubble causing a many-year lull, those giant companies could fail and take that knowledge and scale with them. Some other business would presumably buy the facilities and hire the people, but maybe not. It's one of the problems of so much of science and engineering happening privately: we can't easily replicate the results.
bsder · 9h ago
This isn't unique to semiconductors.

If you turn off any manufacturing line, your company forgets really quickly how to make what that line made. GE discovered this when they tried to restart a water heater line in Appliance Park.

to11mtm · 9h ago
Heck, the US had this problem when they needed to renew/refurbish nuclear weapons due to more or less 'forgetting' how to make Fogbank.
AlotOfReading · 6h ago
FOGBANK was a little more complicated. The process that was written down just didn't work as expected. Part of that was lost institutional knowledge that had never been recorded, but mostly the original manufacturers didn't fully understand their own process: it had contaminants that improved the final product without anyone being aware of them. When the process was restarted, those contaminants were missing, and the product didn't work until that was investigated.
silisili · 6h ago
Yup.

Remington apparently has no idea what bluing formula they used on their original 1911s.

Colt lost the ability to hand-fit revolvers.

bitwize · 1h ago
We as a global civilization are close to forgetting how to make CRTs. There's something like one company that still makes them, but only for military or major industrial applications (fighter jet HUDs and the like), at call-for-pricing prices. The major manufacturers, like Sony, all shut down their production lines, never to be restarted, because the knowledge of how to make them dissipated with those lines. If you're an enthusiast who wants to experience retro video games as they appeared back in the day, your only option is to scavenge an old TV from somewhere.
trollbridge · 8h ago
I’m a little puzzled how “forgot how to make CPUs” also included “forgot how to make the mechanical part of hard drives, how to make flash memory, and how to make other chips”. I guess I don’t think of a 74xx series chip as a “CPU”?
geor9e · 7h ago
I read it as: we have millions of hard drives and flash drives with a dead controller chip, so we harvest their other parts as spares. We still know how to make the spare parts from scratch, but we have so many for free.
vardump · 9h ago
We're toast should we ever lose ability to make CPUs.

Perhaps there should be more research into how to make small runs of chips cheaply and from simple inputs. That'd also be useful if we manage to colonize other planets.

spencerflem · 9h ago
We as in civilization? We made it at least a few thousand years without it.

Or do you mean that the circumstances that would lead to this (nuclear war, perhaps) would make us toast?

throw0101d · 9h ago
> We as in civilization? We made it at least a few thousand years without it.

Civilization is a continuity of discrete points of time.

We were able to enter (so-called) Dark Ages where things were forgotten (e.g., concrete) and still continue, because things were often not very 'advanced': with the decline of Rome there were other stores of knowledge, and during the Black Death society wasn't much beyond blacksmithing, so it was able to keep those basic skills.

But we're beyond that.

First off, modern society is highly dependent on low-cost energy, which was kicked off by the Industrial Revolution and easily accessible coal. Coal is much depleted (often needing deeper mines). The next phase came with oil, and many of the easy deposits have been used up (it used to bubble up out of the ground in the US).

So depending on how bad any collapse is, getting things back up without easily accessible fossil fuels may be more of a challenge.

AlienRobot · 9h ago
I'm not sure we can actually support 8 billion people's food production and distribution logistics without CPUs anymore.
spencerflem · 9h ago
Whatever makes us forget CPUs will also leave fewer than 8 billion people, I'm sure.
cranky908canuck · 7h ago
Yep. The hard way.
squigz · 9h ago
No, but that's hardly the same as suggesting humanity would die off. We'd adapt, just at a much smaller scale.
vardump · 9h ago
The civilization as it is.
spencerflem · 9h ago
I mean, the chips they're talking about didn't exist until like 40 years ago; I think we could manage.

But tbh I don't see it as at all likely short of something like nuclear war, and that would be the much bigger problem.

conductr · 9h ago
It would be chaos at first, but if it's at all physically survivable our species likely will survive, and then we're only a few years away from "humans exist, but most of their knowledge has been lost". Only a few years after that, no human alive has ever interacted with a CPU-using device, and the whole notion of a CPU disappears before too long.

We've had this happen before of course. There's a ton of things ancient civilizations were doing that we are clueless about. So clueless, that one of the leading theories is that they must have been aided by aliens.

vardump · 9h ago
Do we really still have the society-wide institutional knowledge to do things how they were done 50 years ago? I wouldn't be so sure.
spencerflem · 9h ago
I don't think this is likely, but say we, the whole world, goes back to using telephones and writing paperwork on paper.

I don't think it'd be the end of life as we know it.

alabastervlog · 6h ago
The important chips aren’t the ones on our desks and in our hands. I think all that shit’s of dubious value to begin with.

It’s the ones in factories, power systems, and transportation equipment, among other things.

squigz · 8h ago
Mate, 50 years wasn't that long ago. We had computers and everything else we have now. We still did a lot of things fundamentally the same way. Everything was just slower and smaller (in scale, not physically).

I think you also should realize much of the world continues on without bleeding-edge technology - homes are still built, crops are still harvested, and the world goes on.

Legend2440 · 9h ago
Be more concerned about whatever nuclear war or social breakdown led to that point. Massive industrial manufacturing systems don’t shut down for nothing.
kimixa · 6h ago
Zero effort at reproducing even 70s-era silicon technology for 30 years implies some real bad stuff; if the entire chain has been knocked out to that level, I doubt "silicon chip fabrication" would really be a worry for anyone during that time.
vardump · 9h ago
It could also happen as natural decay over centuries. There's no guarantee we'll get more advanced over time.
spencerflem · 9h ago
That would be a pity, but I don't see why we'd be toast.
kimixa · 6h ago
Eh, there are plenty of small fabs globally that do smaller-run, nowhere-near-cutting-edge work (180nm or so); you can make a pretty decent processor on that sort of tech.

It would be a pretty solid intermediate step to bootstrap automation and expansion in cases where the supply of the "best" fabs is removed (like in a disaster, or where the framework to support that level of manufacturing isn't available, such as your colony example).

0xTJ · 8h ago
A fun read, but I do find it a bit odd that the author doesn't think that within 30 years we would have reverse-engineered making CPUs, or at least gotten as far as the mid-70s in terms of CPU production capabilities.

Also, the 10k-year lifespan for MC68000 processors seems suspect. As far as I can see, the 10,000 figure is a general statement on the modelled failure of ICs from the 60s and 70s, not one specific to the MC68000 (which is at the tail end of that period). There are also plenty of ICs with known-poor lifespans (some MOS (the company, not the transistor structure) chips come to mind), though that doesn't reflect on the MC68000.

therealpygon · 6h ago
Agreed. It is a whole lot easier to recreate something you know is possible than to create something you don’t know is possible.
asciimov · 6h ago
> … no further silicon designs ever get manufactured

The problem wouldn't be missing CPUs but infrastructure. Power would be the big one: generators, substations, those sorts of things. Then manufacturing, where a lot of chips go. Then there is all of healthcare.

Lots of important chips everywhere that aren’t CPUs.

roxolotl · 8h ago
So, taking this as the thought experiment it is, what strikes me is that seemingly most things will completely deteriorate in the first 10-15 years. Is that accurate? Would switches mostly fail by the 10-year mark if not replaced? I've been looking at buying a switch for my house; should I expect it to not last more than 10 years? I have a 10-year-old TV; should I expect it to start failing soon?
__d · 8h ago
My experience with retro computers is that things start to fail from around the 10-15 year mark, yes. Some things are still good after 30 years, maybe more, but capacitors leak, resistors go out of spec, etc., and that means voltages drift, and soon enough you burn something out.

You can replace known likely culprits preemptively, assuming you can get parts. But dendritic growths aren’t yet a problem for most old stuff because the feature sizes are still large enough. No one really knows what the lifetime of modern 5/4/3nm chips is going to be.
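For what it's worth, the usual first-order model behind numbers like that 10k-year figure is Black's equation for electromigration (strictly, it models interconnect wear rather than the gates themselves). A minimal sketch with made-up illustrative parameters, just to show why the wide wires of old nodes fare so much better:

```python
import math

K_BOLTZ_EV = 8.617e-5  # Boltzmann constant in eV/K

def em_mttf(j: float, temp_k: float, a: float = 1e3,
            n: float = 2.0, ea_ev: float = 0.7) -> float:
    """Black's equation: MTTF = A * J^(-n) * exp(Ea / (k*T)).
    A, n, and Ea are empirical fitting parameters; the defaults here
    are illustrative placeholders, not real foundry data."""
    return a * j ** (-n) * math.exp(ea_ev / (K_BOLTZ_EV * temp_k))

# Only the ratio is meaningful with a made-up A: wide legacy interconnects
# carry a far lower current density J than modern narrow ones.
old = em_mttf(j=1e5, temp_k=330)  # wide wires -> low current density
new = em_mttf(j=1e7, temp_k=330)  # narrow wires -> high current density
print(f"lifetime ratio (old/new): {old / new:.0f}x")  # 10000x with n = 2
```

With a quadratic current-density exponent, two orders of magnitude less current density buys four orders of magnitude more lifetime, which is part of why nobody can say much about 5/4/3nm until the field data exists.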

protocolture · 7h ago
There's a rule of thumb that hardware that has survived past its half-life often lives on for an excessively long time.

It really depends on brand and purpose, but consumer-grade switches do die pretty frequently.

But if you bought something like a C2960 fanless switch, I would expect it to outlive me.

floating-io · 6h ago
I have a 10+ year old Cisco 2960G and a pair of 10+ year old Dell R620s in my homelab, still humming happily along.

So, no.

datadrivenangel · 9h ago
It would be a bad decade, but someone would figure out how to get older microcontroller-class chip production going pretty fast, because $$$.
myth_drannon · 9h ago
We would go back to the 6502; it would be fine. Just more time spent optimizing the code.
Suppafly · 9h ago
The stuff people write for old consoles and computers is pretty amazing. Computers definitely evolved faster than they needed to for the general public. All of these industries were built around taking advantage of Moore's Law instead of getting the most bang out of existing limitations.
to11mtm · 9h ago
A 6502 can easily power a robot to bend metal and other objects. You can bootstrap everything else from there.
trollbridge · 8h ago
So can a machine built entirely from discrete components. A VAX 11/780 had no microprocessor at all. Just a CPU built out of components.
FrankWilhoit · 7h ago
The larger point is that we are going to forget a lot of things.
Vilian · 5h ago
Hopefully they forget about JavaScript too; that would be a good thing to forget.
waynesonfire · 8h ago
We would have to git-revert the removal of 486 support from Linux.
andsoitis · 4h ago
What do you have to believe to be true in order for humanity to forget how to make CPUs?
cadamsdotcom · 6h ago
It's a bit like trying to censor an LLM: to delete a piece of information as interconnected as "everything about making CPUs", you have to alter the LLM so significantly that you lobotomize it.

CPUs exist at the center of such a deeply connected mesh of other technologies that the knowledge could be recreated (if needed) from the surrounding tech: all the compiled code out there as sequences of instructions; all the documentation of what instructions do and of pipelining; all the lithography guides and die shots on rando blogs; info in books still sitting on shelves in public libraries... I mean, come on.
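As a toy example of how much of this survives in binaries alone: the open-source Capstone disassembler (my choice here; any similar tool works) can recover readable instruction semantics from raw machine-code bytes pulled out of any compiled program:

```python
# pip install capstone
from capstone import Cs, CS_ARCH_X86, CS_MODE_64

code = b"\x48\x31\xc0\x48\xff\xc0\xc3"  # xor rax, rax; inc rax; ret
md = Cs(CS_ARCH_X86, CS_MODE_64)
for insn in md.disasm(code, 0x1000):  # 0x1000 is an arbitrary base address
    print(f"0x{insn.address:x}: {insn.mnemonic} {insn.op_str}")
```

Multiply that by every binary on every surviving disk, and the ISA, at least, is never really lost.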

Each to their own!

spencerflem · 9h ago
This doesn't make a ton of sense to me. In what situation would everyone lose the ability to make any CPU, worldwide, and we don't have a much, much bigger problem than how to run AWS?
MostlyStable · 9h ago
This reads to me as mostly just an interesting way to teach about expected hardware lifetime assuming we were trying as hard as possible to keep things going. There is an entire genre of speculative SF that posits one major change and tries to think through the repercussions of that change. Often, the change itself is not very sensible, but it's also not the point.
spencerflem · 9h ago
I do think it's interesting that digital records may not survive any sort of truly world war.

It's so easy to think of them as lasting forever.

stevenwoo · 8h ago
We haven't had digital data long enough to know, but it seems extremely doubtful anything digital would survive as long as the Herculaneum scrolls, buried in mud, that were on the front page last week. That's longer than almost any civilization has continuously existed (the only exception being ancient Egypt?), though maybe humans will turn it around in the near future and obliterate that record.
protocolture · 7h ago
My guess:

Something would need to happen to stop or prevent production for about 30-60 years.

That's roughly equivalent to the Saturn V engine and codename FOGBANK, the two examples of technologies that had to be reverse-engineered after the fact.

Hypothetically we might choose to stop making new ones if demand dried up significantly.

bell-cot · 6h ago
Short of asteroid Dino-Doom v2.0 hitting the earth, how could CPU demand fall so low that we don't make any new ones?
protocolture · 4h ago
Some kind of demand reduction.

It could be the case that we finally hit a solid wall in CPU progress, cloud providers demand something they don't have to replace every few years, and the result is some kind of everlasting-gobstopper CPU.

Then as failures fall off, so does demand, and then follows production.

A pretty large drop in global population might see the same result. Labor needs to be apportioned to basic needs before manufacturing.

NooneAtAll3 · 6h ago
I don't think it's that bizarre

we already had a sci-fi story where humanity forgot all the Beatles' songs: https://www.youtube.com/watch?v=-JlxuQ7tPgQ

Animats · 9h ago
A war in Taiwan?
bgnn · 9h ago
Well, we still have enough know-how outside Taiwan to produce any semiconductor. A bigger world war is most likely what it would take to bring the supply chain to a halt. Even then, nobody magically forgets these things.
TimorousBestie · 9h ago
I kinda doubt it. The theoretical knowledge is there, but there’s a huge gulf between that and all the practical knowledge/trade secrets held by TSMC.

Another view on this topic is https://gwern.net/slowing-moores-law

rcxdude · 8h ago
If you wanna make something that's competitive with the latest and greatest, sure. But there are literally thousands of fabs that can make _a_ CPU, and hundreds that can make something usable in a PC, even if not very fast. There's a huge span of semiconductor fabrication beyond the bleeding edge of digital logic.
AnotherGoodName · 5h ago
One thing the post did have, though, was a mention that the high end would quickly be worth its weight in gold.

The NVIDIA DGX B200 is already selling for half a million. The nearest non-TSMC-produced competitor doesn't come close. Imagine no more supply!

bgnn · 8h ago
I might be biased, being an insider in the semiconductor industry, but I think the gulf isn't that huge. Virtually everything is known down to, what, 28nm or so? That's still a fairly good process and pretty impossible to forget.
alabastervlog · 5h ago
The stuff that really matters is mostly on microcontrollers.

The few industries that push computing out of need would suffer: certain kinds of research, 3D modeling.

But most of what we use computers for in offices and our day-to-day would work about as well on slightly beefed-up (say, dual- or quad-CPU) typical early-'90s gear.

We're using 30 years of hardware advancements to run JavaScript instead of doing new, helpful stuff: 30 years of hardware development letting businesses save a little on software development while pushing a cost several times larger onto users.

voidspark · 7h ago
TSMC's factories use ASML hardware (designed and built in the Netherlands) to actually produce the chips.

https://www.asml.com/en

TSMC is running a successful business, but they're not ASML's only customer.

chasil · 8h ago
Intel still knows 14nm quite well, and would likely sell access to the line if asked.

If Taiwan ceased to exist, that would put us a decade back.

bgnn · 8h ago
Samsung is just a tiny bit behind TSMC.

The gap isn't a decade; it's more like 12-18 months.

Also, TSMC has 5nm production in the US. There are actual people with know-how of this process in the US.

voidspark · 6h ago
All of their photolithography equipment is manufactured in the Netherlands by ASML.

https://www.asml.com/en

Other companies (Samsung and Intel) use the same equipment, but TSMC has deeper expertise and has gotten more out of it so far.

squigz · 9h ago
There are other semiconductor manufacturers, right? Certainly it would be catastrophic for the industry, and would likely set progress back a while, but it would hardly be insurmountable. This discussion also assumes TSMC wouldn't sell or trade its knowledge and processes, or that they wouldn't be stolen, which wouldn't be crazy given the hypothetical war in the region.
NoMoreNicksLeft · 9h ago
Not magically; they forget naturally. No one human knows the whole sequence from start to finish, and no one can really write it down (or shoot a how-to video). Distributed, institutional knowledge is extraordinarily brittle.
bgnn · 8h ago
I agree, forgetting happens naturally. For example, it would be pretty difficult to produce vacuum tubes now. But I doubt this is applicable to CMOS technologies. Most of the steps down to FinFETs (TSMC 16nm) are rather well known. Yes, we don't know the exact recipes of TSMC, Samsung, or Intel, but it's not like alien technology. I read technical papers from all these fabs regularly, and it's more open than people would expect. Of course they keep their secrets too, for the cutting edge. There's so much know-how out there that it's quite probable we could get there again in a short time if TSMC vanished from Earth all of a sudden.
insaneirish · 6h ago
> For example, it would be pretty difficult to produce vacuum tubes now.

Vacuum tubes are still made. They’re used extensively in instrument amplification.

But I think this bolsters your point!

NoMoreNicksLeft · 6h ago
It's not just about secrets... it's about how many techniques and processes simply aren't documented. There's no need (someone knows how, and is in the business of training new hires), no capacity (they're not exactly idle), and no perception that any of this is important (things have kept working so far).

Could they eventually replicate a CMOS technology? No one doubts this, but the latest litho process took how many years to develop, and only one company in the world makes those machines? Nearly microscopic molten tin droplets being flattened mid-air so that they radiate a particular wavelength of UV?

That's not something they'll have up and running again in 6 months, and if it were lost, regression to other technologies would be difficult or impossible too. We might have to start from scratch, so to speak.

spencerflem · 9h ago
Chips are made elsewhere too. At worst, we (civilization) would lose the cutting edge if that happened.

It would be a sad thing but not as sad as everything else that would happen in a war.

PlunderBunny · 9h ago
We would go back to brick phones, with gears stuck on the side. Awesome!
johnea · 8h ago
Wouldn't it be better if the world totally forgot the twitverse?
charcircuit · 8h ago
Even if humanity forgot, most of the process is automated, so it shouldn't be too hard to figure out how to keep a factory running.
daft_pink · 9h ago
Thankfully, the way capitalism works, we would quickly reinvent and remake them, and the companies that did so would make a decent profit.

Generally, the true problems in life aren't forgetting how to manufacture the products that are key to human life.