Why Bell Labs Worked

265 areoform 184 5/11/2025, 8:47:20 PM 1517.substack.com ↗


pmb · 6h ago
It was forcibly funded as part of a consent decree from the US government that allowed AT&T to continue as a monopoly as long as they invested a percent of their yearly revenue (or profit? I forget) in research. AT&T, having no interest in changing their incredibly profitable phone network, then proceeded to do fundamental research, as required as a condition of their monopoly.

Decades later, AT&T was broken up into the baby bells and the consent decree was removed at that time. Bell Labs' fate was then sealed - it no longer had a required legal minimum funding level, and the baby bells were MBA-run monstrosities that were only interested in "research" that paid dividends in the next 6 months in a predictable fashion.

The funding model is an integral part of the story.

dsjoerg · 1h ago
Citation needed. What I'm seeing: No evidence of a legal requirement to spend a % of revenue on research: There was no line-item mandate in the consent decree forcing AT&T to invest a specific percentage into Bell Labs. The support for research was strategic and reputational: AT&T used Bell Labs to fend off antitrust pressure and maintain regulatory goodwill.
kqr · 5h ago
That sounds plausible, but is not how it is told in The Idea Factory, where the authors explain that both AT&T (running the phone system) and Western Electric (manufacturing equipment for the phone system) had separate research divisions even before this. They then discovered that they were duplicating a lot of research, so they set up one entity to perform research for both the harder and the softer sides of the communication system.
speleding · 2h ago
The baby bells actually took with them part of Bell Labs, renamed to Bellcore, that survived for another decade or so. I interned there whilst doing my MSc, it was still a great place for a while, with serious research.

Wikipedia tells me it still exists in some form, albeit under a different name https://en.wikipedia.org/wiki/Iconectiv

YouWhy · 8h ago
Bell Labs grew to be a dominant player in an age characterized by an oversupply of (a manageable number of) highly capable scientists who did not all have a chance of getting anything resembling funding.

Today we have a huge oversupply of scientists, but there are too many of them to allow judging for potential, and many are not actually capable of dramatic impact.

More generally, a standard critique of "reproducing a golden age" narratives is that the golden age existed within a vastly different ecosystem and indeed stopped working for systemic reasons, many of which still apply.

In particular, just blaming 'MBA management' does little to explain why MBAs appeared in the first place, why they were a preferable alternative to other types of large-scale management, and indeed how to avoid relapsing into it over a few years and personnel shifts.

Overall, I'm afraid this post, while evocative, did not convince me of what makes 1517 specifically so different.

sabas123 · 5h ago
> Today we have a huge oversupply of scientists, but there are too many of them to allow judging for potential, and many are not actually capable of dramatic impact.

Realistically speaking, it's also much harder now to achieve the same level of impact as back then, since most fruits, not just the low-hanging ones, have been plucked.

jve · 29m ago
They look like high-hanging fruits when you haven't yet reached them.

They look like low-hanging fruits when you have risen above them.

CaptainOfCoit · 3h ago
> most fruits, not just the low-hanging ones, have been plucked.

Won't this always be the case?

I mean, if you look back 50 years, they look like low-hanging fruits, but at the time they weren't; only with the benefit of hindsight do they look low-hanging.

Similarly, people in 50 years will say we had all the low-hanging fruits available today in subject/area/topic X, though we don't see them, as we don't have hindsight yet.

anonymousDan · 4h ago
> Today we have a huge oversupply of scientists, but there are too many of them to allow judging for potential, and many are not actually capable of dramatic impact.

This is a fairly sweeping anti-science statement without any supporting evidence to back it up. Fairly typical of the HN anti-public research hive mind sadly.

pydry · 7h ago
>In particular, just blaming 'MBA Management' does little to explain why MBAs appeared in the first place

Whatever the reason it is definitely not because they are effective managers.

I suspect it's more of a social class phenomenon - they speak the language of the ownership class and that engenders trust.

Gibbon1 · 7h ago
My theory is that when women and lower-class men started working as bookkeepers and accountants in post-war America, a way was needed to keep the plum jobs reserved for the failsons of the privileged classes.

I could be wrong, but while 'business schools' existed before then, the MBA as an upper-class Ivy League thing dates exactly to that time.

mlrtime · 2h ago
Do you have anything to back this up? Or are you looking for a story where none exists?

I ask because this comment seems completely backwards and mostly impossible to implement.

actionfromafar · 5h ago
Wasn't it also a time of larger projects being started, which required more coordination? Not that your theory needs to be wrong, it could be amplified.
beowulfey · 3h ago
judging potential is an easy thing to do. anyone who works with a trainee for a month can be a fantastic judge of their potential. the problem is, as you allude to, a lack of funding and of jobs
IAmBroom · 5m ago
You presume "works with ___ for a month" is easy.

It involves a month commitment of (trivially, overall) the temp's salary, and a month partial commitment of someone senior enough to make hiring decisions (even as a primary advisor).

I absolutely agree what you describe is both viable and useful, but it's easier to hire good-looking resumes and hope for the best.

piokoch · 6h ago
Plus there was a lot of "low-hanging fruit" from wartime that had yet to be "productified", as we say today.

Radars, computers (Enigma crackers), lasers, and many less visible inventions that had a great impact in, say, materials science - gun barrels had to be durable, planes lighter and faster, etc. All of this allowed us to build fancier stuff. Plus the whole nuclear industry and everything surrounding it.

Another factor: the Cold War. There was an incentive to spend money if there was some chance of winning an advantage.

gsf_emergency · 8h ago
>there's too many of them to allow judging

Agree with this in particular as a good symptom of the "tectonic shifts". I usually blame the Baumol effect, i.e., the increasing difficulty of the inherently human task: keeping science education/science educators up-to-date. Especially when faced with the returns on optimizing more "mechanical" processes (including the impressive short term returns on "merely" improving bean-counting or, in Bell Lab's/IBM's later eras, shifting info-processing away from paper)

I doubt AI or VCs* have any significant role to play in reducing the friction in the college-to-self-selling-breakthrough pipeline, but they should certainly channel most of their efforts to improving the "ecosystem" first

TFA has right ideas such as

>Make sure people talk to each other every day.

Which already happens here on HN! (Although it's mostly different daily sets of people but.. the same old sets of ideas?)

*Not if the main marketable usecase for college students is to game existing metrics. And I don't see no Edtech in the RFS either

n4r9 · 4h ago
> why MBAs appeared in the first place, why they were a preferable alternative to other types of large scale management

This is a whole topic of its own, intertwining with the rise of neoliberalism and short-termist managerial capitalism. I don't think we have to get into that every single time we point out a case where short-termist managerialism fails.

mschuster91 · 4h ago
> More generally, a standard critique for "reproducing a golden age" narratives are that the golden age existed within a vastly different ecosystem and indeed - stopped working due to systemic reasons, many of which still apply.

The lack of money to fund research isn't God-given; it's a feature of capitalism. You get literally trillions of dollars' worth of pension funds sloshing around in need of returns - but fundamental research isn't going to directly and imminently make billions of dollars; the timescale for ROI in basic research is usually measured in decades. So all that money went towards social networks, cryptocurrencies, AI, or whatever else was promising the greatest ROI - and it can be argued that all of this is not to the betterment of society but actually to its detriment.

It used to be the case that the government, mostly via the proxy of the "military-industrial complex", provided that money to counteract this - out of it we got GPS, radar, microwaves, lasers, even the Internet itself. Unfortunately, Republican (or, in other countries, their respective Conservative equivalent) governments have only cut and cut public R&D spending. And now here we are, with the entirety of the Western economy seeking to become rent-seekers with as little effort as possible...

And billionaires aren't the rescue either. Some of them prefer to spend their money compensating for ED by engaging in a rat race of "who has the bigger yacht/sports team", some of them (like Musk) invest in R&D (SpaceX, Tesla) but with a clear focus on personal goals and not necessarily society as a whole, some of them just love to hoard money like dragons in LoTR, and some try to make up for government cuts in social services and foreign development aid (Gates, Soros). But when was the last time you heard of a billionaire devoting even a small sliver of their wealth to basic R&D that doesn't come with expectations attached (like the ones described in TFA's "patronage" section)?

szundi · 4h ago
Golden age was a time when basic tinkering could bring you to groundbraking discovery. This is of course over now.
IAmBroom · 3m ago
Groundbreaking, not "groundbraking".

And as someone else noted: the golden age is ALWAYS in the past, and often in the not-recent but not-ancient, poorly-remembered past.

We probably exist in some future's golden age.

scrlk · 14h ago
As an interesting counterpoint to the idea of "just hire smart people and give them a lab", Ralph Gomory, head of IBM Research (a peer of Bell Labs in its day) from 1970-86 said:

> There was a mistaken view that if you just put a lab somewhere, hired a lot of good people, somehow something magical would come out of it for the company, and I didn't believe it. That didn't work. Just doing science in isolation will not, in the end, work. [...] It wasn't a good idea just to work on radical things. You can't win on breakthroughs - they're too rare. It just took me years to develop this simple thought: we're always going to work on the in-place technology and make it better, and on the breakthrough technology. [0]

[0] https://youtu.be/VQ0PBve6Alk?t=1480

kqr · 5h ago
I recently read about Atari attempting the same thing in the 1980s (headed by Alan Kay, no less!). Market changes forced them to shut down the attempt a few years later due to lack of funding.
bluGill · 12h ago
Every breakthrough needs many 'man years' of effort to bring to market. Research is good, but for every researcher we need several thousand people doing all the hard work of getting the useful things to market in volume.
leoc · 10h ago
Speaking of which https://substack.com/home/post/p-115930233 :

> John Pierce once said in an interview, asserting the massive importance of development at Bell:

>> You see, out of fourteen people in the Bell Laboratories…only one is in the Research Department, and that’s because pursuing an idea takes, I presume, fourteen times as much effort as having it.

kqr · 5h ago
It gets even more nuanced. As I understand it, Pierce was not great at the pursuing part and often managed to enthuse others to follow his ideas instead of doing it himself. So maybe it is only natural that someone like Pierce would notice how much effort it takes to follow up.
kmeisthax · 11h ago
RCA tried to duplicate Bell Labs' success and it arguably bankrupted the company.
Animats · 7h ago
RCA's Sarnoff Labs produced the image orthicon, NTSC color TV, an early videotape system, an early flat-panel display, and lots of vacuum-tube innovations.[1]

The big business mistakes were in the 1970s, when RCA tried to become a diversified conglomerate. At one point they owned Hertz Rent-A-Car and Banquet TV dinners.[2]

[1] https://eyesofageneration.com/rca-sarnoff-library-photo-arch...

[2] https://en.wikipedia.org/wiki/RCA

whiplash451 · 7h ago
Out of curiosity: is there a write up about this?
MrRadar · 7m ago
The Technology Connections video series on RCA's SelectaVision CED home video system touches on this quite a lot (it was a horribly mismanaged project which took more than a decade to commercialize, by which time it had already been superseded by VHS/Betamax and LaserDisc)[1]. His main source for the information on the development of the CED system was the book "The Business of Research: RCA and the VideoDisc" by Margaret B. W. Graham.

[1] https://www.youtube.com/watch?v=PnpX8d8zRIA&list=PLv0jwu7G_D...

OutOfHere · 9h ago
I suspect the MBAs contributed in no small part to the bankruptcy.
PaulRobinson · 2h ago
I can list a lot of things that Bell Labs did that changed the world, yet I'm struggling to name even one that came out of IBM Research. I'm not saying it didn't happen; I'm saying that work didn't resonate as strongly.
scrlk · 1h ago
Dynamic random access memory, RISC, relational databases, laser eye surgery.

Less mainstream but significant: scanning tunnelling microscope, high temperature superconductivity, fractal geometry.

leoc · 11h ago
Eric Gilliam's "How did places like Bell Labs know how to ask the right questions?" https://www.freaktakes.com/p/how-did-places-like-bell-labs-k... came to a similar conclusion. (It did well here just a couple of months ago, too https://news.ycombinator.com/item?id=43295865 , so it is a little disappointing that the discussion seems to be starting from the beginning here again.) Another point which you've both made is that other big US firms had very important industrial research labs, too. (RCA Labs is one that seems to get little love these days, at least outside the pages of We Were Burning https://www.hachettebookgroup.com/titles/bob-johnstone/we-we... Also, to be fair, "Areoform" did mention Xerox PARC once in TFA.) Indeed, overstating the uniqueness of Bell Labs helps to billow up the clouds of mystique, but it's probably harmful to a clear understanding of how it actually worked.

But the ultimate problem with TFA is that it seems to be written to portray venture capitalists(?), or at least this group of VCs who totally get it, as on the side of real innovation along with ... Bell Labs researchers(?) and Bell Labs executives(?) ... against the Permanent Managerial Class which has ruined everything. Such ideas have apparently been popular for a while, but I think we can agree that after the past year or two the joke isn't as funny as it used to be.

whatever1 · 8h ago
ExxonMobil also closed its prolific NJ-based Corporate Strategic Research center this year, which, among many chemical-process breakthroughs, identified CO2 emissions risks (way before academia) and invented the lithium battery.
ab5tract · 6h ago
CO2 emissions risks were already being discussed in the 1800s.

Furthermore, Big Oil notoriously suppressed any hint of their internal climate change models from being published and hired the same marketing firms that Big Tobacco employed.

BoxOfRain · 5h ago
It really pains me that the people involved in this cover-up likely led lives of absurd luxury rather than seeing the meanest hint of justice for what they did.

People talk about climate change as though we're all equally responsible but this is false, there may be few saints on this subject but there's certainly degrees of sinner and these people are at the very highest level of blame in my opinion. How much of the world will be uninhabitable by the end of the century due to their lies delaying timely climate action?

genewitch · 41m ago
Do you know of any literature that shows it being discussed in earnest before ~1905, which is when the newspaper article I've read about it was published?
smj-edison · 8h ago
Just read the article you mentioned - I find the most interesting part to be the "system integrators", those who intentionally pay attention both to the research going on and to the on-the-ground problems. It's fascinating how they brought information back and forth, and even generated new ideas from all the connections they formed.
pydry · 7h ago
Who ever said that they should be isolated?

The key differentiator was giving them freedom and putting them in charge, not isolating them.

majormajor · 13h ago
You have to be willing to not have things guaranteed to "work." Don't just look at the best case. Investigate and discuss how many versions of Bell Labs didn't "work."

If you just look at the success stories, you could say that today's VC model works great too - see OpenAI's work with LLMs based on tech that was comparatively stagnating inside of Google's labs. Especially if nobody remembers Theranos in 50 years. Or you could say that big government-led projects are "obviously" the way to go (moon landing, internet).

On paper, after all, both the "labs" and the VC game are about trying to fund lots of ideas so that the hits pay for the (far greater) number of failures. But they both, after producing some hits, have run into copycat management optimization culture that brings rapid counter-productive risk-aversion. (The university has also done this with publish-or-perish.)

Victims of their own success.

So either: find a new frontier funding source that hasn't seen that cycle yet (it would be ironic if some crypto tycoon started funding a bunch of pure research and that whole bubble led to fundamental breakthroughs after all, hah) or figure out how to break the human desire for control and guaranteed returns.

p_v_doom · 5h ago
> If you just look at the success stories, you could say that today's VC model works great too

Well, the thing is, it's actually kind of horrible. You are basically handing the choice of what to develop where - something that is actually kind of important to society in general - into the hands of unelected rich farts looking to make more money. It doesn't help that many of those farts have specific visions and projects for a future society as a technofascist hellscape.

And you could argue that the vast majority of what the VC model has given us is scams and services that are ethically dubious, surveil everyone everywhere, and try to squeeze as much money/value out of people as possible, without actually solving many real problems. There is also the fact that the very model is antithetical to solving problems - solving problems costs money and doesn't bring anything back. It's inherently unprofitable, so any solution is doomed to become more and more enshittified.

surfingdino · 45m ago
Bell Labs had infinite money. Their owners made money every time someone picked up a phone. Not all businesses are that embedded in society, and those whose boards might like the idea of funding their own labs have to answer to a higher power - the Wall Street crowd, who will force you to optimise for maximum profit in the shortest amount of time. You get there fastest by cutting costs, especially the costs of long-term research that may not bear fruit.

BTW. Why wait 50 years to forget Theranos? https://www.nytimes.com/2025/05/10/business/elizabeth-holmes...

snickmy · 47m ago
Shameless plug: I see a few people in the comments who have worked there or have had some first-hand experience of that environment. I'd love the chance to interview them. This has been one of my main areas of interest for a very long time. Anyone up for it?
LeoPanthera · 16h ago
If you haven't already, check out the "AT&T Archives" on the AT&T Tech Channel on YouTube. It's an absolutely remarkable collection of American technology history.

https://www.youtube.com/playlist?list=PLDB8B8220DEE96FD9

teleforce · 11h ago
If you want to know the research culture and environment of Bell Labs from the author's first-hand experience, I'd highly recommend this book by Hamming [1].

[1] The Art of Doing Science and Engineering by Richard W. Hamming:

https://press.stripe.com/the-art-of-doing-science-and-engine...

nemild · 10h ago
My dad was at this talk in 1986 that PG shares on his blog:

https://paulgraham.com/hamming.html

Said it was amazing.

atakan_gurkan · 8h ago
Hamming gave that talk many many times. There are recordings of it on YouTube. It is also the final chapter of his book "The Art of Doing Science and Engineering", which, IMHO, is worth reading in its entirety.
badlibrarian · 10h ago
It's been out of stock for nearly a year. Interesting in a post talking about AT&T and Bell Labs to point out that Stripe struggles to maintain an inventory of niche printed books.
nemild · 10h ago
You're welcome to borrow my copy, feel free to ping me.
nar001 · 9h ago
Has it? It seems to be available on Amazon at least.
badlibrarian · 40m ago
Not in stock on Amazon. Just a bunch of third party sellers with varying ship dates months out. Nor Powells, other vendors, nor any used copies, nothing by ISBN search either.

Google says "in stock" in the search result. I do a search about once a month and click around. I've been doing it long enough that I can actually watch things get worse. No AI result yet, so there's still room to fall.

genewitch · 30m ago
Have you checked the art of using the library and other data services?
kevmo314 · 16h ago
> The reason why we don't have Bell Labs is because we're unwilling to do what it takes to create Bell Labs — giving smart people radical freedom and autonomy.

My observation has been that smart people don't want this anymore, at least not within the context of an organization. If you give your employees this freedom, many will take advantage of it and do nothing.

Those that are productive, the smartest who thrive in radical freedom and autonomy, instead choose to work independently. After all, why wouldn't they? If they're putting in the innovation the equity is worth way more than a paycheck.

Unfortunately, that means innovation that requires a Bell Labs isn't as common. Fortunately, one person now can accomplish way more than a 1960's engineer could and the frontier of innovation is much broader than it used to be.

I used to agree with the article's thesis but it's been nearly impossible to hire anyone who wants that freedom and autonomy (if you disagree, <username>@gmail.com). I think it's because those people have outgrown the need for an organization.

musicale · 16h ago
> If you give your employees this freedom, many will take advantage of it and do nothing

This was addressed in the article

> Most founders and executives I know balk at this idea. After all, "what's stopping someone from just slacking off?" Kelly would contend that's the wrong question to ask. The right question is, "Why would you expect information theory from someone who needs a babysitter?"

also this hilarious quote from Richard Hamming:

> "You would be surprised Hamming, how much you would know if you worked as hard as [Tukey] did that many years." I simply slunk out of the office!

kevmo314 · 16h ago
Yeah, that's the point of my next sentence. Why would someone who comes up with information theory want to give it to an employer?

I think an answer to that was a lot clearer in the 1960's when going from idea to product was much harder.

wbl · 15h ago
"The only secret worth keeping is out: the damn things work".

What products could Shannon have made knowing only information theory? Or CSIRO, knowing only that OFDM solved multipath? Did Bob Metcalfe make more money with everyone having Ethernet, or would he have made more by licensing it much more exclusively?

It's very hard for a single fundamental result to be a durable competitive advantage compared to wider licensing on nicer terms. That's particularly true when much else goes into the product.

kevmo314 · 15h ago
Shannon did a lot more than just information theory. In fact, anyone who fits the autonomy persona does because that was part of the definition.

Sure, licensing information theory is a bit of a stretch, but Shannon literally built one of the first artificial intelligence machines [1]. 2025 Shannon would've been totally fine building his own company.

If you see these idols through their singular achievements, then yes of course it's hard to imagine them outside the context of a lab, but rarely are these innovators one trick ponies.

By the way, Bob Metcalfe did indeed start his own company and became pretty successful in doing so.

[1] https://en.wikipedia.org/wiki/Claude_Shannon#Artificial_Inte...

foobarian · 14h ago
Maybe the 2025 Bell Labs is the wider ecosystem of VCs and free floating innovators who end up starting startups instead of doing things in house.

I do think there is a lot less low hanging fruit which makes the comparison apples and oranges. Google is like Bell Labs today, and what did they invent? LLMs? Compare that to information theory, the transistor, Unix, etc.

kevmo314 · 14h ago
> Maybe the 2025 Bell Labs is the wider ecosystem of VCs and free floating innovators who end up starting startups instead of doing things in house.

Yep, agree with this statement. That's exactly what I think happened.

codr7 · 11h ago
I have no idea what Bell Labs was like on the inside, but the startups I've been involved in didn't leave a lot of room for experimentation, trying and failing.

Quite the opposite, always a mad rush towards profit at any cost.

lll-o-lll · 5h ago
> Why would someone who comes up with information theory want to give it to an employer?

Why would someone who is not motivated by financial gain care?

> I was motivated more by curiosity. I was never motivated by the desire for money, financial gain. I wasn't trying to do something big so that I could get a bigger salary.

— Claude Shannon

fuzzfactor · 9h ago
>Why would someone who comes up with information theory want to give it to an employer?

When an employer or occupation provides a fully respectable career for life, that's your job, and it's fully respectable to have that be your life's work from that point onward. Plus, information theory doesn't represent even 1% of what Shannon had to offer anyway :)

tikhonj · 12h ago
This has not been my experience at all. I worked on a team with substantial autonomy and agency for a few years, and most people—not everyone, sure, but almost—naturally rose to the occasion.

People want to do good work and people want to feel like they're doing good work. If you create an environment where they feel trusted and safe, they will rise to your expectations.

I had way more trouble with people working too hard but with misaligned ideas of what "good" meant—and stepping on each other's toes—than with anyone slacking off. It's easy to work around somebody who is merely ineffectual!

And, sure, a bunch of stuff people tried did not work out. But the things that did more than made up for it. Programming and quantitative modeling are inherently high-leverage activities; unless leadership manages out all the leverage in the name of predictability, the hits are going to more than make up for the flubs.

kevmo314 · 12h ago
Doing work on a team isn't really what the article is discussing though. I'm referring to the very research-y skunkworks-style autonomy.

I am well aware that people in companies can work effectively on teams and that people rise to the occasion in that context. If it didn't work, companies wouldn't hire. But that's not what the article is about.

smj-edison · 15h ago
> it's been nearly impossible to hire anyone who wants that freedom and autonomy

Interesting, this is something that I'd love to do! I'm already planning on pursuing custom chip design for molecular simulation, but I don't really want to handle the business side of things. I'd much rather work in a paid lab than get rich and sell it off. Plus, you can do so much more with a team vs being independent.

I was also homeschooled, though (unschooling and TJEd philosophy), so I've always been picking my own projects. Sometimes I wonder if the lack of generalist researchers comes down to education (another thing I'd love to pursue).

Zorass · 11h ago
“Smart people don’t need organizations anymore.” I get it—going solo is more appealing now than ever. But I can’t help thinking: some things really only happen in a kind of shared magnetic field. Not because you can’t do it alone, but because that moment when another smart person lights you up— that doesn’t happen in solo mode.
kevmo314 · 10h ago
Yeah I completely agree. I see it more like the benefits of going solo have eclipsed the benefits of a team in an organization.

I don't think it's a strictly better environment but in many dimensions going solo is now better than any company. I do often long for that shared magnetic field though.


OutOfHere · 8h ago
Eh. AI can empower the solo worker more than anyone else.
chrsw · 14h ago
Hypothetical slackers didn't stop great work from coming out of the lab. I'm not sure why today would be any different.
jltsiren · 8h ago
Hiring smart people who want freedom and autonomy is easy. Just give them freedom, autonomy, stability, and a good enough salary. The hard part is getting them to contribute to your business. Maybe they will contribute if they find it interesting. But if you expect them to contribute, you are clearly not giving them autonomy.

Many of the smartest people I know are good at ignoring bureaucratic requirements, or at least handling them with the minimum effort necessary. And that often includes business, which many of them see as a subcategory of bureaucracy.

ghaff · 15h ago
I think it's complicated.

A lot of large US tech corporations do have sizable research arms.

Bell Labs is certainly celebrated as part of a telephone monopoly at the time, though AT&T actually pulled out of the operating system development related to Multics, and Unix was pretty much a semi-off-hours project by Ritchie and Thompson.

It's true that you tend not to have such dominant firms as in the past. But companies like Microsoft still have significant research organizations. Maybe head-turning research advancements are harder than they used to be. Don't know. But some large tech firms are still putting lots of money into longer-term advances.

coolcase · 8h ago
Yeah, F# and TypeScript are very impressive. We just got used to tonnes of innovation. It ain't Unix, but I'd say TypeScript is as impressive: an exoskeleton for JS that rivals Haskell.

See also VSCode and WSL.

And if we ain't impressed with LLMs then wtf! I mean, maybe it is just nostalgia for the old times.

Lots of great stuff is coming out. Quantum computing. Open source revolution producing Tor, Bitcoin, Redis, Linux.

I think we are in the Golden age!

And it is not all from one place. Which is better.

pjmlp · 6h ago
The way SQL Server was ported to Linux, for example, makes use of Drawbridge.

.NET and Java also started as research projects, as did GraalVM, Maxime, LLVM, many GHC features, OCaml improvements,....

noosphr · 5h ago
>And while a VC fund is limited in what it can do in providing open-ended freedom. It can try to provide a meaningful simulacrum of that space and community, which is why I’m so excited about programs like 1517’s Flux that invests $100k in people, no questions asked and lets them explore for a few months without demanding KPIs or instantaneous progress.

>>You can move to the United States. (We will help with visas.)

This is no longer viable for anyone who isn't already a US citizen. I'm not sure how serious that VC is about investing in individuals, but from talking to 16 to 22 year olds, _none_ of them want to move to the US with ICE deporting students for saying the wrong thing online - or the perception that it does. US universities and businesses are suffering a brain drain that, unless reversed in the next 3 years, will be a drag on the US economy for decades.

nine_k · 15h ago
In a way, it's similar to the connection between "boredom" and creativity. When you don't have much to do, you can do anything, including novel and awesome things. It, of course, takes the right kind of person, or the right group of people. Give such people a way to not think about the daily bread, and allow them to build what they want to build, study what they want to study, think about what they want to think about.

It feels anti-efficient. It looks wasteful. It requires faith in the power of reason and the creative spirit. All these things are hard to pull off in a public corporation, unless it's swimming in excess cash, like AT&T and Google did back in the day.

Notably, a lot of European science in the 16th-19th centuries was advanced by well-off people who did not need to earn their upkeep: the useless, idle class, as some said. Truth be told, not all of them advanced the sciences and arts, though.

OTOH the rational, orderly living, when every minute is filled with some predefined meaning, pre-assigned task, allows very little room for creativity, and gives relatively little incentive to invent new things. Some see it as a noble ideal, and, understandably, a fiscal ideal, too.

Maybe a society needs excess sometimes, needs to burn billions on weird stuff, because it gives a chance for something genuinely new and revolutionary to be born and grow to a viable stage. In a funny way, the same monopolies that gouge prices for the common person also collect the resources necessary for such advances, which benefit that same common person (but not necessarily that same monopoly). It's an unsettling thought to have.

conception · 14h ago
This is what I think is the biggest benefit of having a significant UBI. Sure, lots of folks who currently are in “bullshit jobs” would sit around and watch one screen or another, but! A lot, probably more than we imagine, would get bored and… do something. Often that something would be amazing.

But lizard brains gotta keep folks under their thumb and hoard resources. Alas.

godelski · 10h ago

  > but! A lot, probably more than we imagine, would get bored and… do something.
I'm of the same belief. We're too antsy of creatures. I know in any long vacation I'll spend the first week, maybe even two (!), vegging out doing nothing. But after that I'm itching to do work. I spent 3 months unemployed before heading to college (laid off from work) and in that time taught myself programming, Linux, and other things that are critical to my career today. This seems like a fairly universal experience too! Maybe not the exact tasks, but people needing time to recover and then want to do things.

I'm not sure why we think everyone would just veg out WALL-E style and why the idea is so pervasive. Everyone says "well I wouldn't, but /they/ would". I think there's strong evidence that people would do things too. You only have to look at people who retire or the billionaire class. If the people with the greatest ability to check out and do nothing don't, why do we think so many would? People are people after all. And if there's a secret to why some still work, maybe we should really figure that out. Especially as we're now envisioning a future where robots do all the labor.

BoxOfRain · 5h ago
Music is always something that comes to mind for me. In the UK there's a long history of excellent music with strong working-class roots, but as the economy becomes more precarious (housing costs are insane here), music has increasingly turned into the province of people who are more well-off, because they have to worry less about their daily bread. As a result a lot of it gets a bit homogenised and predictable, in my opinion.

I think people are drawn to labour but not drudgery, and a lot of jobs don't really do much to differentiate between the two. I reckon if fewer people had to worry about putting bread on the table, what we'd see is a massive cultural revival, a shot in the arm to music and the arts.

razakel · 3h ago
>Firstly, you must be skint and on the dole. Anybody with a proper job or tied up with full time education will not have the time to devote to see it through. Also, being on the dole gives you a clearer perspective on how much of society is run. If you are already a musician stop playing your instrument. Even better, sell the junk. It will become clearer later on but just take our word for it for the time being.

- The Manual, by the KLF

mjevans · 14h ago
UBI isn't going to get us there. Give everyone more cash and the rent-seeking _WILL_ suck harder. Same problem with blindly raising the minimum wage and not instead addressing the root issue.

Basic econ 101: inelastic demand means the price of a limited supply rises to whatever the lucky few who manage to get it can afford.

Bell Labs, and think tanks generally, work by paying _enough_ to raise someone to the capitalist-society equivalent of a noble.

Want to fix the problem for everyone in society, not just an 'intellectual elite'? Gotta regulate the market, put enough supply into it that the price is forced to drop and the average __PURCHASING POWER__ rises even without otherwise raising wages.

nine_k · 13h ago
This has been tried, very honestly, and it mostly sucked, then crashed. The calculation argument [1] kills it. The optimization problem which the market solves in a chaotic and decentralized way through price discovery and trading is intractable otherwise, not with all the computing power of the planet. It also requires prediction of people's needs (ignoring desires), and it's a problem more ill-posed than prediction of weather.

The market of course needs regulation, or, rather, stewardship: from protection of property rights all the way to limiting monopolies, dumping, etc. The market must remain free and varied in order to do its economic work for the benefit of society. No better mechanism has been invented in the last few millennia.

Redistribution to provide a safety net for those in trouble is usually a good thing to have, but it does not require dismantling the market. It mostly requires an agreement within the society.

[1]: https://en.m.wikipedia.org/wiki/Economic_calculation_problem

Retric · 12h ago
That’s the advantage to UBI.

A revenue-neutral UBI check at some subsistence level, combined with killing all other government assistance including lower tax brackets, would in the short term significantly lower the standard of living for many low-income Americans and boost others. However, people would try to maximize their lifestyle, and for most that would mean working. Others would opt out and try to make being really poor work for them.

Essentially you remove central planning around poverty as the government stops requiring rent-stabilized apartments, etc. In the short term this pushes a lot of poor people out of major cities, but it simultaneously puts upward pressure on wages to retain those workers and pushes down rents via those suddenly available apartments. It doesn't actually create or destroy wealth directly; you just get a more efficient allocation of resources.

nine_k · 10h ago
There's a catch. If enough people opt for not working, the level of UBI may go below the level of survival for some time. This will push those who can work and don't want to tolerate it to go find work. But those who cannot work much, or at all, like disabled people, would be facing hunger, and would be unable to afford the special stuff they need to survive (like medicine or home aid). They might just die from that.

This returns us back to the problem of some guaranteed payments to those we don't want to let die, and maybe want to live not entirely miserably, and the administration thereof.

Another danger is the contraction of the economy: businesses close, unable to find workers, the level of UBI goes down, people's income (UBI + salary) also goes down, and they can afford fewer goods, more businesses close, etc. When people try to find work because UBI is not enough, there may be not enough vacancies, until the economy spins up again sufficiently. It's not unlike a business cycle, but the incentive for a contraction may be stronger.

Retric · 8h ago
As long as a fixed percentage of the economy goes to UBI, there are natural feedback loops. Fewer people work, UBI goes down, and incentives to work increase. However, long-term efficiency gains keep pushing up the standard of living for people on UBI, which then makes not working more appealing. The specific numbers only matter in the short term: if in 500 years 1% of the economy went to UBI, people would likely be very well off by modern standards, but still be tempted to work for even more.

There’s a long way between uncomfortable and death here; entitlement spending is already over $10k/person/year, and that’s excluding the impact of progressive taxation. A revenue-neutral flat tax and a $20k+ UBI isn’t unrealistic. A reasonable argument can be made for universal healthcare being part of UBI, but that’s a separate and quite nuanced discussion.

Not that I think there’s any chance of a UBI in the US, but it’s an interesting intellectual exercise.
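To see why a flat tax plus a UBI can still be effectively progressive, here is a toy calculation. The 30% rate and $20k UBI are illustrative assumptions loosely inspired by the figures above, not a worked-out proposal.

```python
# Toy sketch: flat tax + universal transfer. The 30% rate and $20k UBI
# are assumed for illustration only.
def net_income(gross, flat_rate=0.30, ubi=20_000):
    """After-tax income under a flat tax with a universal transfer."""
    return gross * (1 - flat_rate) + ubi

def effective_rate(gross, flat_rate=0.30, ubi=20_000):
    """Average tax rate net of the transfer (negative = net recipient)."""
    return (gross * flat_rate - ubi) / gross

for gross in (20_000, 50_000, 100_000, 500_000):
    print(gross, round(net_income(gross)), round(effective_rate(gross), 3))
```

Even though the marginal rate is flat, the effective average rate climbs from -70% at $20k of gross income toward (but never reaching) 30% at high incomes, which is the progressivity Retric's comment alludes to.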

01HNNWZ0MV43FF · 11h ago
We should still keep the progressive income tax. UBI can even be implemented as an NIT (negative income tax).

Adding a land tax too, now that would be, that would really, that would fix some things
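A quick sketch of the NIT equivalence mentioned above: a universal payment G financed by a flat tax at rate t produces exactly the same net transfers as a negative income tax with guarantee G and phase-out rate t. The guarantee and rate below are illustrative assumptions, not figures from the thread.

```python
# UBI-vs-NIT equivalence sketch. Guarantee ($12k) and rate (25%) are
# assumptions for illustration only.
def ubi_net_transfer(income, guarantee=12_000, rate=0.25):
    # Everyone receives the guarantee; everyone pays rate * income.
    return guarantee - rate * income

def nit_net_transfer(income, guarantee=12_000, rate=0.25):
    # Below the breakeven point (guarantee / rate) you receive a top-up;
    # above it you pay tax only on income past the breakeven.
    breakeven = guarantee / rate
    if income < breakeven:
        return guarantee - rate * income          # net recipient
    return -rate * (income - breakeven)           # net payer

# The two schemes give identical net transfers at every income level.
for income in (0, 24_000, 48_000, 96_000):
    assert ubi_net_transfer(income) == nit_net_transfer(income)
```

The difference is purely administrative: the UBI framing cuts everyone a check and taxes it back, while the NIT framing nets the two flows out inside the tax code.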

marcus_holmes · 9h ago
Land tax would cause people to sell land. We need a wealth tax. Income tax is too easy to game.
thuanao · 11h ago
> and it mostly sucked

Citation needed. If you're referring to the USSR, please pick an economic measure that you think would have been better, and show why the calculation problem was the cause of its deficiency. The USSR was incredibly successful economically, whether measured by GDP growth, technological advancement, labor productivity, raw output, etc. Keep in mind all of this occurred under extremely adverse conditions of war and political strife, and starting with an uneducated agrarian population and basically no capital stock or industry.

The Austrian economist Hans-Hermann Hoppe writes of Hayek's calculation problem:

> [T]his is surely an absurd thesis. First, if the centralized use of knowledge is the problem, then it is difficult to explain why there are families, clubs, and firms, or why they do not face the very same problems as socialism. Families and firms also involve central planning. The family head and the owner of the firm also make plans which bind the use other people can make of their private knowledge […] Every human organization, composed as it is of distinct individuals, constantly and unavoidably makes use of decentralized knowledge. In socialism, decentralized knowledge is utilized no less than in private firms or households. As in a firm, a central plan exists under socialism; and within the constraints of this plan, the socialist workers and the firm’s employees utilize their own decentralized knowledge of circumstances of time and place to implement and execute the plan […] within Hayek’s analytical framework, no difference between socialism and a private corporation exists. Hence, there can also be no more wrong with the former than with the latter.

nine_k · 10h ago
A family is small enough to allow for reasonable planning. (Imperfect still, as you know if you ever tried to run a family.)

Indeed, a private company usually operates in a way a centralized monarchy / oligarchy would operate: the bosses determine a plan, the subordinates work on implementing it, with some wiggle room but with limited autonomy.

Larger companies do suffer from inefficiencies of centralization, they do suffer waste, slowdowns, bureaucracy, and skewed incentives. This is well-documented, and happens right now, as we facepalm seeing a huge corp doing a terrible, wasteful move after wasteful move, according to some directives from the top. This is why some efficient corporations are internally split into semi-independent units that effectively trade with each other, and even have an internal market of sorts. (See the whole idea of keiretsu.)

But even the most giant centralized corporations, like Google, Apple, or the AT&T of 1950s, exist in a much, much larger economy, still driven mostly by market forces, so the whole economy does not go haywire under universal central planning, as did the economy of the late USSR, or the economy of China under Mao, to take a couple of really large-scale examples.

fuzzfactor · 8h ago
Concepts like this would definitely be in play, and misguided UBI could result more in preservation of the status quo than in allowing abundance to spread.

That's why experiments need to be made.

Now, with research pay, Bell was right up there with other prestigious institutions: elite, but not like the nobility of old.

I would say very much more like a "Gentleman" scientist of antiquity, whether they were patrons or patronized in some way, they could focus daily on the tasks at hand even when they are some of the most unlikely actions to yield miracles.

Simply because the breakthroughs that are needed are the same as it ever was, and almost no focused tasks lead in that direction ever, so you're going to have to do a lot of "seemingly pointless" stuff to even come up with one good thing. You better get started right away and don't lift your nose from the grindstone either ;)

evidencetamper · 11h ago
> Basic econ 101: inelastic demand means supply can be as expensive as the limited number who are lucky enough to get it are able to afford.

In the same basic econ 101, you learn that real estate demand is localized. UBI allows folks to move to middle of nowhere Montana.

mjevans · 8h ago
To do what? People want to live near three sorts of things:

Social connections like family / friends / potential mates

Livelihood needs like education / jobs / foods (1st world, the food they like is fresh / better; historic / other food exists!)

General QoL climate / beauty / recreational opportunities

Many big cities cost more because it's where the opportunity is, or where their family that previously/currently prospered from that opportunity resides. For many of us on HN it's where the sort of jobs we'd be good at contributing to society exist. Even if some corp opened an office in the middle of Montana there wouldn't be anything else there as other opportunities. Heck given UBI, I'd rather join Star Fleet with awesome healthcare for all, cool technical challenges, and anything other than Starbase 80.

billy99k · 11h ago
UBI might work in the short term, but as more and more people have kids (who learn from parents on UBI to also get UBI), we would run out of people actually working and paying the taxes to support it.
marcus_holmes · 9h ago
Which is exactly the thing they tested multiple times and found to be wrong.

People get bored doing nothing, and enjoy contributing to their community.

No, they're not going to go get shitty factory jobs. But that's OK, because all those jobs are now automated and done by robots.

But they are going to go and do something useful, because that's what people do. The anti-UBI trope that "given basic income, everyone will just sit around on their arses watching TikTok videos" has been proven wrong in every study that measured it.

porridgeraisin · 7h ago
> No, they're not going to go get shitty factory jobs. But that's OK, because all those jobs are now automated and done by robots.

Nonsense. There are tens of millions of jobs that cannot be automated in the near future, which people would certainly never do if they had UBI. America just outsources them to poorer countries, so you're clueless.
p_v_doom · 26m ago
The key to universal basic income is that it is basic. There are many jobs that cannot be automated, but with the right incentive, even under UBI, people will do them.
conception · 10h ago
This assumes that most people would be satisfied with UBI and not attempt to make more money.
90s_dev · 10h ago
It's not about excess.

Look at some of the most famous success stories in comedy, art, music, theatre, film, etc.

A good number of them did their best work when they were poor.

"Community" is a great example. Best show ever made, hands down. Yet they were all relatively broke and overworked during the whole thing.

It's because they believed in the vision.

nine_k · 9h ago
Art is materially different from science and technology. Great art is known to emerge from limitations. Art is full of limitations that are self-imposed for that purpose, like the meter and rhyme in poetry, geometry and color in painting, etc. Art is primarily about processing and evoking emotions.

Science requires much more concentration on abstract thinking, loading a much larger context, if you will. It's counterproductive to do it while busy with something else. It overworks you all right, and it demands much more rigor than art.

All revolutionary new technology is initially inefficient, and requires spending a lot of time and money on finding efficient solutions. First electronic computers were terribly unwieldy, expensive, and unreliable. This equally applies to first printing presses, first steam engines, first aircraft, first jet engines, first lasers, first LLMs (arguably still applies). It's really hard to advance technology without spending large amounts of resources without any profit, or a guarantee thereof, for years and years. This requires a large cache of such resources, prepared to be burnt on R&D.

It's investment into far future vs predictable present, VC vs day trading.

90s_dev · 9h ago
Tell that to a professional in the arts.
brightball · 9h ago
That was a great show. The best show ever made, hands down though…is “Chuck”
90s_dev · 9h ago
Of all shows I might dare concede to, Chuck is not in the top 50.
phinnaeus · 8h ago
West Wing, Chernobyl, Westworld Season 1, Breaking Bad
90s_dev · 1h ago
I wonder why "best show ever made" always has contenders made in the last 30 years. You Bet Your Life with Groucho Marx was such a fascinating show, and had some of the most intriguing, raw conversations with ordinary people, ever. Not to mention Groucho's natural instinct for wordplay that just never failed him.
throwaway2037 · 7h ago

    > Notably, a lot of European science in 16-19 centuries was advanced by well-off people who did not need to earn their upkeep, the useless, idle class, as some said.
I heard a recent interview with John Carmack (of DOOM fame) who described his current style of work as "citizen scientist": he has enough money, but wants to do independent research on AI/ML. I am always surprised that we don't see more former/retired hackers (many of whom got rich from the DotCom era) decide to "return to the cave" to do something exciting with open source software. Good counterexamples: (1) Mitchell Hashimoto and his Ghostty, (2) Philip Hazel and his PCRE (Perl Compatible Regular Expressions) library. When I retire (early, if all things go well), the only way that I can possibly stave off a certain early death from intellectual inactivity would be to do something similar. (Laughably: I don't have 1% of the talents that John Carmack has... but a person can try!)
analog31 · 10h ago
To some extent, the NSF did that. My graduate education was funded by the NSF, and my research didn't have an obvious practical purpose, except to enable further research.

Today, I'm in a corporate research role, and I'm still given a lot of freedom. I'm also genuinely interested in practical applications and I like developing things that people want to buy, but my ability to do those things owes a lot to the relatively freewheeling days of NSF funding 30+ years ago.

r14c · 14h ago
I know this is going to be an unpopular take, but isn't the idea of socialism that you make a unitary democratic government fill the role of Huge Monopoly Foundation so you can do stuff like fund research labs and be accountable to the public?
nine_k · 13h ago
It's the statist idea. Socialism in practice usually involves regulating the market heavily, or into oblivion altogether, and giving the State a huge redistribution power. See my comment nearby on why such a setup fails to work.

A socialism where the only way to work is to own a part of an enterprise (so no "exploitation" is possible) would likely work much better, and not even require a huge state. It would be rather inflexible, though, or would mutate back into capitalism as some workers accumulated larger shares of enterprises.

r14c · 12h ago
Having some kind of default steward for market developments that get so competitive and fundamental that they reach full market saturation is helpful. Under a market system, at that scale, the need for growth starts to motivate companies to cut corners or squeeze their customer base to keep the numbers going up. You either end up pricing everyone out (fixed supply case) or the profit margins get so slim that only a massive conglomerate can break even (insatiable demand case). This is why making fundamental needs and infrastructure into market commodities doesn't work either.

The problem with social democracy is that it still gives capitalists a seat at the table and doesn't address the fundamental issues of empowering market radicalism. Some balance would be nice, but I don't really see that happening.

sien · 10h ago
Across the OECD average government spending is 46% of GDP.

https://www.oecd.org/en/topics/policy-issues/public-finance-...

How is that 'market radicalism' ?

How is government spending ~25 trillion USD a year somehow not considered?

r14c · 8h ago
"market radicalism" as in, the notion that ~all aspects of life should be subject to market forces. I think markets are good for a lot of things, but competing for basic necessities like food and shelter puts people in a really unhealthy headspace. The green revolution made a lot of our concepts about food scarcity obsolete.

A good example of market radicalism at play in the US is the healthcare system. "Everyone knows" that the market is better at allocating resources, but we actually have terrible outcomes and there is no political will to change an obviously broken system. Despite having real-world examples of single-payer (and hybrid) healthcare systems out there that are more cost effective and have better outcomes.

nine_k · 6h ago
Because it's really not a market at all.

On what market the cost of services to be rendered would be held secret, or at least obscure? Try shopping for a particular kind of surgery across hospitals. How many plans does your employer offer? Would you rather buy a plan independently? Why?

The whole "insurance" system is not insurance mostly, but a payment scheme, of a quite bizarre, complicated, and opaque kind. The regulatory forces that gave birth to it are at least as strong as any market forces remaining in the space, and likely stronger.

OTOH direct markets are a poor way to distribute goods that have a very inelastic and time-sensitive demand, like firefighting, urgent healthcare, or law enforcement. Real insurance is a better way, and a permanent force sustained by tax money usually demonstrates the best outcomes. The less urgent healthcare is, the better a market works, all the way to fitness clubs which are 100% market-based.

Apocryphon · 11h ago
Sounds like distributism.
greyw · 13h ago
Hardly. Socialism is about workers/communities owning the means of production. Research labs these days are mostly funded by the public. That's just about allocation of government resources.
godelski · 11h ago
This is what I wished academia would be. I'm finishing my PhD and despite loving teaching and research (I've been told I'd make a good professor, including from students) I just don't see the system doing what it should. Truthfully, I'm not aware of any such environment other than maybe a handful of small groups (both in academia and industry).

I think we've become overly metricized. In an effort to reduce waste we created more. Some things are incredibly hard to measure and I'm not sure why anyone would be surprised that one of those things is research. Especially low level research. You're pushing the bounds of human knowledge. Creating things that did not previously exist! Not only are there lots of "failures", but how do you measure something that doesn't exist?

I write "failure" in quotes because I don't see it that way, and feel like the common framing of failure is even anti scientific. In science we don't often (or ever) directly prove some result but instead disprove other things and narrow down our options. In the same way every unsuccessful result decreases your search space for understanding where the truth is. But the problem is that the solution space is so large and in such a high dimension that you can't effectively measure this. You're exactly right, it looks like waste. But in an effort to "save money" we created a publish or perish paradigm, which has obviously led to many perverse incentives.

I think the biggest crime is that it severely limits creativity. You can't take on risky or even unpopular ideas because you need to publish, and that means passing "peer review". This process is relatively new to science, though. It didn't exist in the days of the old scientists you reference[0]. The peer review process has always been the open conversation about publications, not the publications themselves, nor a few random people reading them who have no interest and every reason to dismiss. Those are just a means to communicate, something that is trivial with today's technologies. We should obviously reject works with plagiarism and obvious factual errors, but there's no reason not to publish the rest. There's no reason we shouldn't be more open than ever[1]. But we can't do this in a world where we're in competition with one another. It only works in a world where we're united by the shared pursuit of more knowledge. Otherwise you "lose credit" or some "edge".

And we're really bad at figuring out what's impactful. Critically, the system makes it hard to make paradigm shifts. A paradigm shift requires a significant rethinking of the current process. It's hard to challenge what we know. It's even harder to convince others. Every major shift we've seen first receives major pushback and that makes it extremely difficult to publish in the current environment. I've heard many times "good luck publishing, even if you can prove it". I've also seen many ideas be put on the infinite back burner because despite being confident in the idea and confident in impact it's known that in the time it'd take to get the necessary results you could have several other works published, which matters far more to your career.

Ironically, I think removing these systems will save more money and create more efficient work (you're exactly right!). We have people dedicating their lives to studying certain topics in depth. The truth is that their curiosity highly aligns with what are critical problems. Sometimes you just know and can't articulate it well until you get a bit more into the problem. I'm sure this is something a lot of people here have experienced when writing programs or elsewhere. There's many things that no one gets why you'd do until after it's done, and frequently many will say it's so obvious after seeing it.

I can tell you that I (and a large number of people) would take massive pay cuts if I could just be paid to do unconditional research. I don't care about money, I care about learning more and solving these hard puzzles.

I'd also make a large wager that this would generate a lot of wealth for a company big enough to do such a program and a lot of value to the world if academia supported this.

(I also do not think the core ideas here are unique to academia. I think we've done similar things in industry. But given the specific topic it makes more sense to discuss the academic side)

[0] I know someone is going to google the oldest journal and find an example. The thing is that this was not the normal procedure. Many journals, even in the 20th century, would publish anything free of obvious error.

[1] Put it on open review. Include code, data, and anything else. Make comments public. Show revisions. Don't let those that plagiarize just silently get rejected and try their luck elsewhere (a surprisingly common problem).

sien · 9h ago
Currently the OECD average spending on R&D is ~2%. Let's say half of that is government spending.

The OECD's total GDP per year is ~50 trillion. So 1 percent is roughly 500 Bn on research.

So there clearly has to be some accountability. But no doubt it could be improved. As you say publishing everything these days makes more sense with platforms like arXiv.

On taking pay cuts to do research: have you ever seen places offer part-time work and then allow people to research what they want in the rest of their time?

Or researchers just doing this alongside other jobs?

Ha. Hmm. I just realised I have a cousin who does this.

ziofill · 10h ago
I left a tenured position after getting fed up with several things, among which the same grant proposal getting a “it’s too visionary” from a reviewer and “it’s trivial” from another. If it’s such a coin toss, F off will ya?
fuzzfactor · 9h ago
See the picture of Bell Labs, 1966?

Each person, "principal investigator", has a lab which they built.

They only have so much space, and so much budget, but they get a clean slate.

And they're all different. But they all have brilliant ideas they need to work out.

Advantages of doing it like this were proven earlier, and it was still going strong like this in the 1970's.

It was still kind of an academic model.

In these decades there was sometimes special schooling where students were groomed to take their place in this exact lab, starting as pre-teens. Nobody imagined it would ever dwindle in any way.

This is what places like Exxon and DuPont still looked like in 1979 too.

Without being quite an actual monopoly, one thing that's in common is that anything you invent that could be the least bit useful to an employer that size, they can surely afford to make the most of it like few others can.

So the scientists could go wild to a certain extent as long as they were guided by the same "north star" that the org as a whole recognized. Whether it feels like you're getting closer or not, that's the direction you must be drawn to.

You should have seen some of the stuff they built.

Oil companies can have an amazing budget sometimes.

When somebody had a breakthrough, or OTOH scuttled a project or transferred to a different technical center, their lab would be cleared out so a completely different project could get underway with a new investigator building from the ground up. This could take a while, but eventually some very well-equipped labs having outstanding capabilities can develop.

As an entrepreneur, I liked it down in the basement where they would auction off the used gear from about a dozen of those labs at once, after it had sat in storage for a period of time.

After critical mass was achieved, I had way more fairly current equipment at my immediate disposal than any dozen of my institutional counterparts could benefit from in the mainstream way. It turns out I really could get more accomplished and make more progress my own way than if I had actually been at a well-funded institution instead, using equipment they once owned and were successful with to a certain extent; I usually became only a little bit more advanced, and only some of the time.

Most things are truly the "least bit useful" anyway ;)

detourdog · 15h ago
They birthed an industry based on electrical properties that were barely understood. They also ended up needing a very dynamic metering and accounting system. Apple can get away with a more unified workforce because their needs are known and not unique.
nine_k · 14h ago
If your needs are known, they are also known to competitors.

I know that Peter Thiel is not a popular figure nowadays, but he very correctly stated that competition is for losers, and the real innovators build things that competitors are just unable to copy for a long time, let alone exceed. SpaceX did that. Google, arguably, did that, too, both with their search and their (piecemeal acquired) ad network. Apple did that with iTunes.

Strive to explore the unknown when you can, it may contain yet-unknown lucrative markets.

(There is, of course, an opposite play, the IBM PC play, when you create a market explosion by making a thing open, and enjoy a segment of it, which is larger than the whole market would be if you kept it closed.)

detourdog · 2h ago
Part of what makes an industry is a shared problem statement. This statement would be based on needs and constraints. I don't believe that the needs are a secret sauce for a company. The approach at solving the shared needs is the secret sauce.

I think a better approach to competition is to make it irrelevant. One should be developing a solution that solves the problem statement.

musicale · 16h ago
> I’m so excited about programs like 1517’s Flux that invests $100k in people, no questions asked and lets them explore for a few months without demanding KPIs or instantaneous progress.

If Bell Labs let people explore for multiple years, a few months probably isn't enough time.

areoform · 15h ago
That's absolutely true! But we aren't a multi-billion dollar corporation with a war chest in the billions so sadly this is the best we can do. :(
userbinator · 14h ago
The focus on investing in what actually matters instead of being distracted by virtue signaling for ideological culture wars no doubt also had a huge influence.
WalterBright · 15h ago
A related program was Lockheed's Skunkworks.

There have been many attempts to replicate the success of the Skunkworks, but they've all failed because the organizers thought they could improve on it.

dingaling · 9h ago
McDonnell's Phantom Works is still running, as a division of Boeing now, and seems to be doing innovative work despite the lack of innovation in the name.

They were responsible for the tailless X-36, the X-37 space plane and, allegedly, much of the groundwork for the winning NGAD design.

musicale · 16h ago
> The freedom to waste time. The freedom to waste resources. And the autonomy to decide how.

As the article notes, several companies (Apple, Google, etc.) could (currently) afford to fund such a lab, but there is no way their management and shareholders would approve.

There's a reason for this: research labs seem to benefit competitors as much as (or more than) the companies that fund them. This wasn't an issue for AT&T when it was a monopoly, but it is now. Personally I don't see it as a problem (since one home run innovation could pay for the entire lab) but company managers and shareholders do.

On the other hand, Apple does seem to have a de facto AI lab with a good deal of resource waste, so maybe that's good.

querez · 15h ago
>> The freedom to waste time. The freedom to waste resources. And the autonomy to decide how.

> As the article notes, several companies (Apple, Google, etc.) could (currently) afford to fund such a lab, but there is no way their management and shareholders would approve.

Google did set up such a lab. The mission of Google Brain was literally to hire smart people and let them do work on whatever they want. ("Google Brain team members set their own research agenda, with the team as a whole maintaining a portfolio of projects across different time horizons and levels of risk." -- https://research.google.com/teams/brain/). Unsurprisingly, Google Brain is the place that originated the Transformer that powers the current AI craze (and many, many, many other AI innovations).

areoform · 15h ago
And they shut it down. In 2023.

The current tech giants spend a lot of money on "research," where research means optimizing parts of the product line to the 10^nth order of magnitude.

Arguably, Google Brain was one such lab. Albeit with more freedom than normal.

Which is fine, it's their money. But then they (and the broader public) shouldn't bemoan the lack of fundamental advances and a slowdown in the pace of discovery and change.

sidibe · 15h ago
"And they shut it down. In 2023"

You mean they renamed it/merged it with another group that has similar freedom and focus on research.

querez · 1h ago
As someone who worked at both Brain and DeepMind: their cultures were very different. Brain was bottom-up, open-field research whatever you want/care for. DeepMind was much more narrowly focused and massively more top-down.
zipy124 · 4h ago
A decent number of people seem to be unhappy with or leaving DeepMind due to the forced focus on AI and lack of freedom currently, though. Therefore the point above still stands.
nullhole · 14h ago
What’s the name of the other group?
esafak · 12h ago
Deepmind. I'd say it's actually more reputable than Google Brain.
zy0n911 · 14h ago
Deepmind
sandkoan · 14h ago
DeepMind.
linguae · 14h ago
Interestingly enough, even non-monopoly large corporations once had labs where researchers had a good deal of freedom and where the projects were not required to be directly tied to business objectives. Hewlett-Packard, Digital Equipment Corporation, Sun Microsystems, Fujitsu, Sony, NEC, Toshiba, and Hitachi, just to name a few, had labs back in the 80s, 90s, and 2000s. As late as the early 2010s, a PhD graduate in computer science had options in industry to do research that wasn’t tied to short-term business priorities.

Unfortunately these opportunities have dried up as companies either got rid of their research labs or shifted the focus of their research labs to be more tied to immediate business needs. Many of my former classmates and colleagues who were industrial researchers are now software engineers, and not due to intentionally changing careers. Academia has become the last bastion of research with fewer commercialization pressures, but academia has its "publish or perish" and fundraising pressures, and it is under attack in America right now.

I once worked as a researcher in an industrial lab, but the focus shifted toward more immediate productization rather than exploration. I ended up changing careers; I now teach freshman- and sophomore-level CS courses at a community college. It’s a lot of work during the school year, but I have roughly four months of the year when I could do whatever I want. Looking forward to starting my summer research project once the semester ends in a few weeks!

jjtheblunt · 12h ago
> As the article notes, several companies (Apple, Google, etc.) could (currently) afford to fund such a lab, but there is no way their management and shareholders would approve.

When I was at Apple for several years, there were definitely at least two such groups.

tellarin · 11h ago
Google has DeepMind, Microsoft has Microsoft Research, Meta has FAIR.

It’s not trivial to foster such environments, but they do still exist in different forms.

refurb · 11h ago
Facebook dumped $60B into an AI universe and HN made fun of them for it.
sweeter · 12h ago
This is why I think we need publicly funded open source projects with paid leads. There are so many basic things we've failed to do for ourselves and our fellow human beings.

For example, the best non-AI TTS system is still Ivona TTS, which won the Blizzard Challenge back in like 2007. The best open source solution is espeak and it's permanently stuck in 1980... Ivona was bought up by Amazon and now they don't even use the original software, but do charge money per word to use the voice via Amazon Polly. They could open source it, but they don't.

We don't even have something as basic as text to speech freely available, whether you are disabled or not. That is a problem. You have this amazing innovation that still holds to this day, squandered away for nothing.

Why can't we just have an institute that develops these things in the open, for all to use? We clearly all recognize the benefit as SysV tools are still used today! We could have so many amazing things but we don't. It's embarrassing

WalterBright · 15h ago
If I was a billionaire, that would be a fun thing to do:

1. find 100 highly motivated scientists and engineers

2. pay them each $1m/year

3. put them in a building

4. see what happens!

Temporary_31337 · 15h ago
To be fair you only need to pay subsistence or opportunity cost, so closer to $100k per person should be fine, especially outside of the USA.
WalterBright · 15h ago
I figured the $1m would also fund the equipment and supplies they'd need.
fuzzfactor · 9h ago
Here's what works for me:

1. Already found X highly motivated scientists and engineers.

- in my case people that must like chemicals, electronics, software, stuff like that

2. $1Mil funding x X but it's got to be paid back in 5 years so a viable business model needs to be initiated right away even if the technology is as much as a year out from possible release or commercialization.

- each person needs to be worth a million over 5 years, that's hard enough to find, it would be more of a needle in a haystack to find experimentalists where it's good to sink a million per year for a decent length of time, but that can be worked up to. If serious research is involved, stealth must be an option

3. Put them in X number of buildings.

- works better than you think, and "nobody's" doing it

4. Some of these are profit centers from day 1, so you could even put franchising on the table ;)

- you'd be surprised what people who've already invented a lifetime of stuff could do with the kind of resources that can enable a motivated creator who has yet to make very remarkable progress, so leverage both

whywhywhywhy · 3h ago
> but it's got to be paid back in 5 years

What happens if they don't though?

xqcgrek2 · 15h ago
My guess is not much unless you give them a specific problem, and create a hierarchy.

Otherwise things will just fragment into cliques and fights, like any university department.

WalterBright · 15h ago
What would there be to fight over? They each have a $1m budget.
Thrymr · 12h ago
Have you ever seen an academic department?

Surely the lab scientists and engineers would assert that they need a bigger budget than the mathematicians, and so on.

akomtu · 10h ago
In the Manhattan Project the scientists were unified by a noble goal. Money doesn't buy that. Without such a goal, you'll get a hundred scientists each pulling in his own direction in order to get rich.
ted_dunning · 15h ago
The hard part is picking the right people.
WalterBright · 15h ago
The way to do it is the way top engineers and scientists were recruited for the Manhattan Project. You go around to universities and talk to the professors, who will know who the highly motivated people are.
whywhywhywhy · 3h ago
That would have worked back then but post-internet you should be looking for the people who had drive to achieve or build something without the academic structure there to force them to do it.

Universities are the place for low-agency people in today's world.

api · 15h ago
What I look for in engineers would be what I’d look for here: “I have no idea how to do that, let me get started.”
PicassoCTs · 6h ago
So MBA culture would have to go away. Which only happens in wartime, do-or-die times. The MBAs get sent to the front, the employees get sent to the front, the madmen get a chance at being mad, and after that it's back to building little paper forts, with the madmen back in garden sheds or asylums. What a world. If the MBAs would at least not don the clothes and manners of the madmen, parading themselves in skinsuits as "innovative", that would be something.
alganet · 15h ago
What if it's a trick?

You start by creating a myth: "this place breeds innovation". Then, ambitious smart people wanting to innovate are drawn to it.

Once there, there are two ways of seeing it: "it was just a myth, I'll slack off and forget about it" or "the myth is worthwhile, I'll make it real".

One mistake could end it all. For example, letting those who don't believe outnumber or outwit those who "believe the myth".

So, small pieces: a good founding myth (half real, half exaggerated), people willing to make it more real than myth, and pruning off whoever drags the ship down.

Let's take that "productivity" from this myth perspective. Some people will try to game it to slack off, some people will try to make the myth of measuring it into reality (fully knowing it's doomed from the start).

A sustainable power of belief is quite hard to put into a formula. You don't create it, you find it, feed it, prune it, etc. I suspect many proto Bell Labs analogues exist today. Whenever there are one or two people who believe and work hard, there is a chance of making it work. However, the starting seed is not enough on its own.

If you ask me, the free software movement has a plentiful supply of it. So many companies have realized this already, but can't sequester the myth into another thing (that makes money), even though free software already creates tons of (non-monetary) value.

porridgeraisin · 15h ago
A close family member worked at Bell labs during the cold war era. According to them,

<paraphrase>

The reason is very simple. There was a big picture motivation: the war, followed by the cold war. Once the big picture motivation wasn't there anymore, that sort of organizational structure(or lack of it) does not work the same way. What ends up happening is what a sibling comment has noted:

> My observation has been that smart people don't want this anymore, at least not within the context of an organization. If you give your employees this freedom, many will take advantage of it and do nothing.

</paraphrase>

You might say, but `grep` wasn't used for war! Correct, but it came up as a side effect of working on much larger endeavours that tied into that bigger picture.

This has been true for most of recent human history. You might know this already, but Fourier accompanied Napoleon's Egyptian expedition, and his work on decomposing waveforms arose out of his work on the "big picture" of the day: the propagation of heat.

kevinventullo · 14h ago
So what you’re saying is it will take a war fought by autonomous robots on behalf of two massively wealthy adversarial nations in order to finally get us a robot that can do the dishes?
ForOldHack · 12h ago
They are called 'Dishwashers'
porridgeraisin · 7h ago
Any big picture motivation works, war is just the most timeless human activity. But, the space programs of various countries are also good examples. However, one could of course argue that the space program arose out of competing in the cold war.
fsckboy · 15h ago
>During WW2, Bell Labs reversed engineered and improved on the British Magnetron within 2 months.

um... the UK sent the magnetron they had recently invented (1940) to the US in a spirit of wartime cooperation and because their own research and industrial base was already maxed out at the time. pretty sure they sent an owners manual and schematics too. probably even some people?

(magnetrons, for generating microwaves, were the essential component for radar)

areoform · 15h ago
I'm quoting their research summary. By reverse engineering, it means that they figured out why the magnetron worked and then optimized it. They X-Rayed it, found a deviation from plans, then developed a model to understand why there was a deviation in performance.

    However examples No. 11 and 12 had the number of resonators increased to 8 in order to maximise the efficiency of the valve with the magnetic field provided by the then available permanent magnet, E1189 also incorporated cooling fins to enable the device to be air rather than water cooled. 
    
    Sample No.12 was taken to the USA by E. Bowen with the Tizard mission and upon testing at Bell Labs produced 10 times the power at 5 times the frequency of the best performing American triodes. A certain amount of confusion arose as the drawings taken by Bowen still showed the 6 resonator anode but an X-Ray picture taken at Bell Labs revealed the presence of 8 resonators.
    
    The E1189 or its Navy equivalent NT98 was used in the Naval radar type 271 which was the Allies first operational centimetric radar. The early RCM’s like the E1189 were prone to mode jumping (frequency instability) under pulse conditions and the problem was solved in by means of strapping together alternate segments a process invented by Sayers in 1942. Strapping also considerably increased the magnetron’s efficiency. 

via, https://www.armms.org/media/uploads/06_armms_nov12_rburman.p...

and another account, https://westviewnews.org/2013/08/01/bell-labs-the-war-years/...

fsckboy · 14h ago
>the problem was solved in by means of strapping together alternate segments a process invented by Sayers in 1942

UK physicist James Sayers was part of the original team that developed the magnetron in the UK. He did join the Manhattan Project in 1943, so perhaps before that he came over to the US (to Bell Labs) as part of the radar effort: in that case strengthening Bell Labs' contributions, weakening any claim to reverse engineering :)

When Lee de Forest "invented" the triode tube amplifier, he had no idea how it worked. When Shockley "invented" the transistor, his team grumbled that he had stolen their work (similar to Steve Jobs, the boss, taking over the Macintosh project when his own Lisa project failed), but in any case, it was not actually understood yet how transistors worked. "How the First Transistor Worked: Even its inventors didn't fully understand the point-contact transistor" https://spectrum.ieee.org/transistor-history

In these cases, the bleeding edge of R and the bleeding edge of D were the same thing. A certain amount of "reverse engineering" would have been mandatory, but it's really "reverse sciencing", "why did my experiment turn out so well", rather than "reverse engineering a competitor's product to understand how did they make it work so well."

https://en.wikipedia.org/wiki/MIT_Radiation_Laboratory

In early 1940, Winston Churchill organized what became the Tizard Mission to introduce U.S. researchers to several new technologies the UK had been developing. Among these was the cavity magnetron, a leap forward in the creation of microwaves that made them practical for use in aircraft for the first time. GEC made 12 prototype cavity magnetrons at Wembley in August 1940, and No 12 was sent to America with Bowen via the Tizard Mission, where it was shown on 19 September 1940 in Alfred Loomis’ apartment. The American NDRC Microwave Committee was stunned at the power level produced. However Bell Labs director Mervin Kelly was upset when it was X-rayed and had eight holes rather than the six holes shown on the GEC plans. After contacting (via the transatlantic cable) Dr Eric Megaw, GEC’s vacuum tube expert, Megaw recalled that when he had asked for 12 prototypes he said make 10 with 6 holes, one with 7 and one with 8; and there was no time to amend the drawings. No 12 with 8 holes was chosen for the Tizard Mission. So Bell Labs chose to copy the sample; and while early British magnetrons had six cavities American ones had eight cavities... By 1943 the [Rad Lab] began to deliver a stream of ever-improved devices, which could be produced in huge numbers by the U.S.'s industrial base. At its peak, the Rad Lab employed 4,000 at MIT and several other labs around the world, and designed half of all the radar systems used during the war.

that seems to be the source of the reverse engineering idea, and I think Bell Labs' role (which is quite important) was more toward perfecting the devices for manufacture at scale, as it was an arm of a giant leading edge industrial company.

I'm not diminishing Bell Labs nor anybody there, it was a lot of smart people.

pests · 11h ago
> as part of the radar effort

Something I've been curious about and thought I'd ask the room here since it was mentioned.

It seems to me that "the radar effort" was very significant, almost Manhattan Project levels itself. In every book about scientists in WW2 or the atomic bomb that I've read, it seemed everyone had a friend "working on radar" or various scientist weren't available to work on the bomb because they were, again, "working on radar."

Was this true or just something I'm overanalyzing?

teleforce · 11h ago
It's very true.

Guess who pioneered the venerable Silicon Valley: it's HP (whose test-and-measurement arm later became Agilent, now Keysight). Their first killer product was the audio oscillator (the HP 200A). HP was basically the Levi's of the radar era, making tools for the radar/transistor/circuit technology gold rush.

One of the best academic engineering research labs in the world for many decades now is MIT Lincoln Lab, and guess what, it's a radar research lab [1].

I can go on but you probably get the idea now.

[1] MIT Lincoln Laboratory:

https://www.ll.mit.edu/

fsckboy · 10h ago
part of why "radar" does not have the mystique of other superweapons is that it was first of all not exclusive/unknown/secret technology, all of the combatants on both sides knew about it and were working on it before the war started, while paradoxically at the same time during the war it was ultra top secret, because small details of the technology were quite significant in deployment; and "good defense" is never as exciting as good offense. (modern radar today is more fully offensive)
rjsw · 4h ago
My grandfather did a PhD in Germany in the early 30s, learned German well enough to defend his thesis. After working on radar through to the end of the war he was part of the control commission that went around German industry to find out what it had been doing, he discovered that he already knew all the senior radar scientists from when they had been students together.
antihipocrat · 11h ago
In the spirit of wartime cooperation is putting it nicely.

The magnetron was one of several technologies that the UK transferred to the USA in order to secure assistance in the war effort.

https://en.m.wikipedia.org/wiki/Tizard_Mission

fsckboy · 10h ago
>...putting it nicely. The magnetron was one of several technologies that the UK transferred to the USA in order to secure assistance in the war effort.

how is that not in the spirit of wartime cooperation? with spirited cooperation, each side contributes in order to get what they want from cooperation

if you want more nuance: the American administration, completely upper class https://oldlifemagazine.com/look-magazine-april-12-1949-fran..., was 100% behind helping the UK and got the job done, but we have a political system that has to respond to the common people, and just as the British Labour party has never thought "oh, what can we do to help the US?", neither has the American populace in reverse, on top of the traditional American individualism and revulsion toward European monarchical and imperial wars.

Difference is, we don't bitch about it.

Britain is completely entitled to be proud of its absolute grit, prowess, and determination wrt the second world war, but the US did right by them too. America was already on the rise, but not entirely self-confident (that had begun wrt WWI but had not become a birthright till after WWII.) We didn't have a 19th century empire that collapsed (although we were in certain respects a self-contained 19th century western empire), and we were perfectly positioned (geography, population, GDP, English Common Law legal system plus bill of rights, but lacking other tired old ideas about class) to assume the mantles not only of British hegemony, but also French, German, Dutch, Belgian and the other "imperial thrones" that were now unoccupied. it was to our benefit but it was not "our fault" or even "our doing"

antihipocrat · 6h ago
It was business, that's all. Just like the demand that the UK dismantle its worldwide trade networks was business, and plenty of other examples that set the US up to become the global power.

There's no problem with that at all, it's what every power has had to do in order to reach that status throughout history. I was just calling out that it was primarily a transaction.

Apocryphon · 10h ago
What was greater, Bell Labs or Xerox PARC?
ArthurStacks · 10h ago
Because it's easy to be 1st
DonHopkins · 6h ago
So what's the name of your company that you like to brag about so much, or are you still too embarrassed to admit it, and have all your ignorant racist posts on Hacker News associated with it?

You certainly have expressed a lot of contempt and disrespect for your own employees, as much as you hate all the black people you refuse to hire. Do you not want your all white employees to know what you think of them either?

It must not be a very successful company, and you must be lying through your teeth about it, if you can't say its name.

ArthurStacks · 4h ago
Perhaps 'racist' in your culture, not in mine, and despite you not liking that, my country doesn't care.

And i'll put my personal details on here the day Hackernews makes it mandatory. Not when some very strange stalker demands it.

motohagiography · 11h ago
really interested in 1517 now. if i could build a new bell labs i would use a hedge fund structure instead of vc. the lab would attract talent and some of it might get involved in the fund, but the lab would stay afloat on fund profits. the idea is to just let smart people cook for a tryout year, with great incentives. (not too unlike 1517's program as it turns out)

the difference with this lab idea and a vc like YC is that vc portfolio companies need products and roadmaps to raise investment and for driving revenue. whereas an asset manager is just investing the money and using the profits to fund engineering research and spinoff product development.

firms like this must already exist, maybe i just never hear about their spinoffs or inventions? if not, maybe a small fund could be acquired to build a research division onto it

potamic · 8h ago
How large a fund would you need to sustain employing like a hundred scientists full time?
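As a back-of-envelope sketch of the question above (using the $1M/person/year figure from elsewhere in the thread; the 4% withdrawal rate is an assumed safe-draw figure, not something anyone here stated):

```python
# Back-of-envelope: endowment size needed so that an annual draw
# covers a lab's burn rate indefinitely.

def required_endowment(scientists: int, cost_per_scientist: float,
                       withdrawal_rate: float) -> float:
    """Fund size whose annual withdrawal covers the lab's yearly burn."""
    annual_burn = scientists * cost_per_scientist
    return annual_burn / withdrawal_rate

# 100 scientists at $1M/year each, drawing 4% of the fund annually:
print(required_endowment(100, 1_000_000, 0.04))  # → 2500000000.0, i.e. a $2.5B fund
```

So under those assumptions the fund's profits alone could carry roughly a hundred scientists only once assets reach the low billions; a smaller fund would need the spinoffs themselves to contribute.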
saboot · 12h ago
ctrl+f "national lab" = 0 results

Hello? We have 17(!) federally funded national labs, full of scientists doing the work this article waxes nostalgic about. Through the Laboratory Directed Research and Development (LDRD) program they afford employee scientists the ability to pursue breakthrough research. However, they are facing major reductions in funding now due to the recent CR and the upcoming congressional budget!

mepian · 12h ago
From this blog's About page: "In 2010, our team cofounded the Thiel Fellowship with Peter Thiel..."

Make of that what you will.

raziel2701 · 9h ago
I was a postdoc and applied to an LDRD. It's not a free-for-all: the institution already knows what topics it wants to fund, and you write your proposal to cater to that. Very similar to tacking "AI" onto your pitch these days.

It's an illusion that no-strings-attached funding exists. The government has an agenda and you're free to research anything you want, as long as it is one of the pre-determined topics. It's a very political process.

johnea · 15h ago
Bell Labs wasn't the only loss. HP Labs was another victim of LBO cannibalism.
bradchris · 8h ago
Do we think that Bell Labs, if it had avoided being broken up, would exist in anything like the same form today?

Cynically, it likely would have ended up a zombie shell of itself, like IBM

Topically, assuming it avoided such a fate and was held in high regard by the industry and employees, this current administration would likely be meddling with every single grant, contract, or project that did not align with the administration's priorities (see: colleges and research grants)

ForOldHack · 12h ago
And Xerox, and IBM went through fits and bumps, IBM Boca Raton, IBM Tully Road. Fairchild. Columbia Physics. The Manhattan Project. Princeton. Cornell. Bletchley Park.
ForOldHack · 12h ago
MIT Media lab.