Building supercomputers for autocrats probably isn't good for democracy

71 rbanffy 37 6/8/2025, 9:11:18 PM helentoner.substack.com ↗

Comments (37)

Exoristos · 55m ago
I would have thought it obvious that LLMs' primary usefulness is as force-multipliers of the messaging sent out into a society. Each member of hoi polloi will be absolutely cocooned in thick blankets of near-duplicative communications and interactions most of which are not human. The only way to control the internet, you see, proved to be to drown it out.
bgwalter · 32m ago
I missed that the article is talking about Gulf monarchy autocrats instead of U.S. autocrats.

That is very simple: First, dumping graphics cards on trusting Saudi investors seems like a great idea for Nvidia. Second, the Gulf monarchies depend on the U.S. and want to avoid Islamic revolutions. Third, they hopefully use solar cells to power the data centers.

Will they track users? Of course, and GCHQ and the NSA can have intelligence-sharing agreements that circumvent their local laws. There is nothing new here. Just don't trust your thoughts to any SaaS service.

zelphirkalt · 17m ago
But at the end of the day HN is a small bubble and many people out there are not well informed and even more will trade privacy for convenience sooner or later. Making it so that the temptations do not even come into existence would be preferable from a certain point of view.
prpl · 1h ago
Not clear to me anyone “in charge” cares in any case. In fact, that may be the point.
jeffbee · 8m ago
I have a question. In what sense is OpenAI going to assist the UAE in building large-scale data centers suitable for machine-learning workloads? Do they have experience and expertise doing that?
martin-t · 1h ago
The biggest danger of AI isn't that it will revolt but that it'll allow dictators and other totalitarians complete control over the population.

And I mean total. A sufficiently advanced algorithm will be able to find everything a person has ever posted online (by cross referencing, writing style, etc.) and determine their views and opinions with high accuracy. It'll be able to extrapolate the evolution of a person's opinions.

The government will be able to target dissidents even before they realize they are dissidents, let alone before they have time to organize.

noident · 1h ago
> A sufficiently advanced algorithm will be able to find everything a person has ever posted online (by cross referencing, writing style, etc.)

Is this like a sufficiently smart compiler? :)

Stylometry is well-studied. You'll be happy to know that it is only practical when there are few suspect authors for a post and each author has a significant amount of text to sample. So, tying a pseudonymous post back to an author where anyone and everybody is a potential suspect is totally infeasible in the vast majority of cases. In the few cases where it is practical, it only creates a weak signal for further investigation at best.

You might enjoy the paper Adversarial Stylometry: Circumventing Authorship Recognition to Preserve Privacy and Anonymity by Greenstadt et al.
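The feasibility tradeoff described above can be seen in a toy version of one classic stylometric signal: character n-gram frequency profiles compared by cosine similarity. This is a minimal sketch, not a real attribution system; the text samples are invented for illustration, and real systems use far richer feature sets.

```python
from collections import Counter
from math import sqrt

def ngram_profile(text: str, n: int = 3) -> Counter:
    """Character n-gram frequency profile of a text."""
    text = " ".join(text.lower().split())  # normalize case and whitespace
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse frequency profiles."""
    dot = sum(a[k] * b[k] for k in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Invented samples: two paraphrases "by the same author" vs. unrelated text.
sample_a = "I would have thought it obvious that the only way to win is not to play."
sample_b = "I would have thought it clear that the only path to winning is not playing."
sample_c = "Quarterly revenue grew on strong cloud demand, the company reported."

same = cosine_similarity(ngram_profile(sample_a), ngram_profile(sample_b))
diff = cosine_similarity(ngram_profile(sample_a), ngram_profile(sample_c))
```

With short samples like these, the similarity gap between same-author and different-author pairs is a weak signal at best, which is exactly why stylometry needs a small candidate pool and lots of text per candidate to be useful.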

fooker · 17m ago
> that it is only practical

You're missing the point: it doesn't have to be practical; the illusion that it works is good enough.

And if authoritarian governments believe it works well enough, they are happy to tolerate a decent fraction of false positives.

See, for example, polygraph tests being used in court.

sitkack · 57m ago
Someone ran a stylometry attack against HN a while ago; it would unmask alt accounts on this site with very high confidence. It worked. There is zero reason to believe it couldn't be applied on a grand scale.
noident · 45m ago
That sounds considerably narrower than what the GP described.

What if I don't have an alternate HN account? Or what if I do have one, but it has barely any posts? How can you tie this account back to my identity?

Stylometry.net is down now, so it's hard to make any arguments about its effectiveness. There are fundamental limitations in the amount of information your writing style reveals.

AStonesThrow · 56m ago
How do y’all prove it worked, O Privacy Wonks?

How do y’all establish ye Theory Of Stylometry, O Phrenology Majors?

O, @dang confirms it on Mastodon or something??

dfxm12 · 1h ago
> determine their views and opinions with high accuracy

Truth and accuracy don't matter to authoritarians. They clearly don't matter to Trump: people are being sent away with zero evidence, sometimes without formal charges. That's the point of authoritarianism; the leader just does as he wishes. AI is not enabling Trump, the feckless system of checks and balances is. Similarly, W lied about WMDs to get us into an endless war. It doesn't matter that this reason wasn't truthful. He got away with it and enriched himself and his defense-contractor buddies at the expense of the American people.

barbazoo · 1h ago
throwanem · 1h ago
tabarnacle · 1h ago
Sure, for folks who don't worry about anonymity when sharing online. For those who prioritize anonymity, I'm doubtful.
fooker · 14m ago
You can identify with a decent amount of confidence whether two paragraphs of text were written by the same person.
throwanem · 1h ago
So am I. They would be among the first and most quietly vanished in this scenario, being trivially identifiable from a God's-eye view.
exiguus · 1h ago
I'm not entirely convinced that nations will play as significant a role in the coming decades as they have historically. Currently, we observe a trend where affluent individuals are increasingly consolidating power, a phenomenon that is becoming more apparent in the public sphere. Notably, these individuals are also at the forefront of owning and controlling advancements in artificial intelligence. Not coincidentally, this trend is often referred to as 'tech fascism,' bringing us back to the dictator schema.
throwanem · 1h ago
States haven't always been a major feature of power. But we've never seen the interaction of personal power with modern weaponry, by which I do not mean nukes. When it was just a question of which debauched noble could afford more thugs or better assassins, sure. But 'how many Abrams has the Doge?'
exiguus · 56m ago
>But 'how many Abrams has the Doge?'

As many as you can control with signal chat.

Besides, I'm not sure if tanks like the Abrams are as important anymore. Nowadays, things like food and water really matter. For example, exporting corn is crucial. Also, having the raw materials needed to make modern tech, like chips and batteries, is super important. Hence the interest in Greenland.

nerdsniper · 58m ago
Across history, often the “state” is/was really just a kind of collective umbrella organization to help manage the interests of the powerful.
exiguus · 52m ago
I agree. Initially, this power was embodied by monarchs who claimed divine right, such as god-given kings. Over time, the influence shifted towards corporations that wielded significant economic and political control. Today, it is often the super-rich individuals who hold substantial sway over both economic and political landscapes.
arcanus · 1h ago
I do not find her critique of argument #2 compelling [1]. Monetization of AI is key to economic growth. She's focused on the democratic aspects of AI, which frankly aren't pertinent. The real "race" in AI is between economic and financial forces, with huge infrastructure investments requiring a massive return on investment to justify the expense. From this perspective, increasing the customer base and revenue of the company is the objective. Without this success, investment in AI will drop, and with it, company valuations.

The essay attempted to mitigate this by noting OAI is nominally a non-profit. But it's clear the actions of the leadership are firmly aligned with traditional capitalism. That's perhaps the only interesting subtlety of the issue, but the essay missed it entirely. The omission could hardly have been intentional, because it provides a complete motivation for item #2.

[1] #2 is 'The US is a democracy and China isn’t, so anything that helps the US “win” the AI “race” is good for democracy.'

bgwalter · 44m ago
The U.S. may be a nominal democracy, but the governed have no influence over the oligarchy. For example, they will not be able to stop "AI" even though large corporations steal their output and try to make their jobs obsolete or more boring.

Real improvements are achieved in the real world, and building more houses or high speed trains does not require "AI". "AI" will just ruin the last remaining attractive jobs, and China can win that race if they want to, which isn't clear yet at all. They might be more prudent and let the West reduce its collective IQ by taking instructions from computers hosted by mega corporations.

antithesizer · 37m ago
If democracy builds supercomputers (and bombs, propaganda, prisons) for autocrats, of what good is democracy? The evidence points strongly to democracy and autocracy being friends, even playing "good cop, bad cop".
zelphirkalt · 13m ago
Or is it rather that there are few well-functioning democracies, and most are infiltrated by autocrats at least to some degree?
Kapura · 16m ago
the ultra-wealthy in western democracies understand they have much more in common with the ruling autocrats than the average citizen of a democracy (the motherfuckers keep voting for taxes!)
credit_guy · 1h ago
Maybe.

I would not do business with Kim Jong Un. He is murdering a lot of his own people. Or with Putin. He is murdering a lot of Ukrainians.

But guess what: both North Korea and Russia are under sanctions. You can't do business with them anyway.

But the UAE is not under sanctions. Which means that in the opinion of the US Government it is OK to do business with them. Then who is OpenAI to say otherwise? Why should it be any of their concern to determine who is a good guy or a bad guy in the world? Shouldn't there be a division of responsibilities? Let the Department of State determine who is good and who is bad, and let companies do business with those who are not on the sanctions list.

anthonymartinez · 1h ago
For a while the Pinochet regime was our perfectly acceptable ally in Chile, even though we knew he was a mass murderer. It's silly to throw up your hands just because the State Department (itself not exactly a bastion of morality) says that it's not illegal to do business with someone.
credit_guy · 1h ago
Let me guess: you were against Bush's war in Iraq to take down Saddam. Why? Wasn't it moral to try to eliminate a known mass murderer?

Either it is our duty to be the moral arbiters of the world or it isn't. Which one is it?

jfengel · 46m ago
We tolerate quite a few mass murderers in charge of countries. We attacked that one because, supposedly, he had the tools and intent to attack the United States with chemical weapons.

Many were opposed to that war, not because they didn't feel it was right to eliminate a mass murderer, but because that was not the stated reason. The stated reason in fact turned out to be false, and was arguably an abject lie.

In other words ... it's not a great example of what you're trying to claim.

bdangubic · 30m ago
> We tolerate quite a few mass murderers in charge of countries

including our own…

jamroom · 50m ago
We didn't go into Iraq because Saddam was a mass murderer; we went in because Bush lied to America that Saddam was trying to get yellowcake uranium to build a bomb. A lot of Americans were against the war because we knew Saddam was not involved in 9/11, but Bush Jr. wanted to finish what his father couldn't in the first Gulf War. Honestly, I would love it if we cared enough about mass murderers to actually go in and help, but I just don't see that being a reason.
Spooky23 · 44m ago
Obviously that's not true. Pinochet and Saddam were both direct products of US policy and intervention.

In the end, Saddam pulled too hard on the leash and miscalculated his power. Murder, mass or otherwise, and morality have little bearing on matters of empire.

Thinking otherwise is naive.

sitkack · 53m ago
One can be against a war and at the same time be against the government that war would remove. We killed hundreds of thousands of Iraqis, many of those troops were conscripts who didn't want to be there, yet we bulldozed them into the sand to suffocate, or burned them alive on the highway while they retreated.

There are more than two answers to everything.

> Wasn't it moral to try to eliminate a known mass murderer?

Given the context and the means? No.

ImPostingOnHN · 54m ago
> Why should it be any of their concern to determine who is a good guy or a bad guy in the world?

Because helping someone do something bad is itself bad.

> Shouldn't there be a division of responsibilities?

It sounds like you mean an abdication of responsibility? We are already responsible for our own choices and actions, as well as their effects.

credit_guy · 16m ago
No, I'm not talking about abdication of responsibility. I'm talking about modesty. It is very appealing to think you know better than other people. That we know how a society should be governed, and we are able to label another country as totalitarian, or undemocratic, or illiberal, or such. But looking around the world, you can see that a lot of evil is perpetrated exactly by people who think they know better than everyone else. Osama bin Laden himself thought that what he was doing was for the advancement of good over evil, and a lot of his followers thought the same.

A lot of the people reading Hacker News right now think they have a better solution for the societal problems of the UAE. I personally have no idea about what's going on over there. But let's say that I'm in charge of the business decisions at OpenAI. Should I start thinking that I know a way to solve their problems, and that part of that way is for my company to apply some form of AI embargo on them? Or should I simply know my limitations, and restrict my judgment to the matters I am familiar with?

"Abdication of responsibility". What grand words. Why exactly has Open AI a responsibility to guide the UAE towards a better future? And, more importantly, why should Open AI feel confident that they know what is better for the UAE?