The NO FAKES Act has changed, and it's worse

64 miles 21 6/24/2025, 5:34:24 AM eff.org ↗

Comments (21)

rootlocus · 1h ago
> The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly “replicated.”

Sounds like the kind of system small companies can't implement and large companies won't care to implement.
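
The "keep down" requirement in (b) implies some form of content-fingerprint matching. A minimal sketch (class and names hypothetical, not from any real system) of why an exact-hash keep-down filter is both trivial to evade and still overbroad:

```python
import hashlib

class KeepDownFilter:
    """Naive 'keep down' filter: blocks any upload whose bytes exactly
    match previously taken-down content. Illustrative sketch only."""

    def __init__(self):
        self.blocked_hashes = set()

    def take_down(self, content: bytes) -> None:
        # Record a fingerprint of the removed upload.
        self.blocked_hashes.add(hashlib.sha256(content).hexdigest())

    def allows(self, content: bytes) -> bool:
        return hashlib.sha256(content).hexdigest() not in self.blocked_hashes

f = KeepDownFilter()
clip = b"...replica video bytes..."
f.take_down(clip)

# Any exact re-upload is blocked -- even if the new context is fair use
# (commentary, news reporting); the filter cannot tell the difference.
assert not f.allows(clip)

# Meanwhile a single changed byte evades the filter entirely, which pushes
# real systems toward fuzzy perceptual matching -- and with it, false
# positives on merely similar content.
assert f.allows(clip + b"\x00")
```

The evasion problem is what drives platforms toward fuzzy matching, and fuzzy matching is where the "inevitably overbroad" part comes in.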

dspillett · 1h ago
> Sounds like the kind of system small companies can't implement and large companies won't care to implement.

Or the sort of thing bigger companies lobby for to raise the entry barriers for smaller competitors. Regulatory capture like this is why companies above a certain level of size/profit tend to swing in favour of regulation, having opposed it while they were still "disrupting".

stodor89 · 1h ago
15 years ago that would've been outrageous. But at this point they're just beating a dead horse.

mschuster91 · 2h ago
> The new version of NO FAKES requires almost every internet gatekeeper to create a system that will a) take down speech upon receipt of a notice; b) keep down any recurring instance—meaning, adopt inevitably overbroad replica filters on top of the already deeply flawed copyright filters; c) take down and filter tools that might have been used to make the image; and d) unmask the user who uploaded the material based on nothing more than the say so of person who was allegedly “replicated.”

You already need point a) to be in place to comply with EU laws and directives (the DSA, anti-terrorism rules [1]) anyway; I think the UK has anti-terrorism laws with similar wording, as does the US with its CSAM laws.

Point b) is already required if you operate in Germany; there have been a number of court rulings holding that platforms have to take down repeat uploads of banned content [2].

Point c) is something that makes sense; it's time to crack down hard on "nudifiers" and similar apps.

Point d) is the one I have the most issues with, although that's nothing new either; unmasking users via barely fleshed-out subpoenas or dragnet orders has been a thing for many, many years now.

This thing impacts gatekeepers, so not your small mom-and-pop startup but billion-dollar companies. They can afford to hire proper moderation staff to handle such complaints; they just don't want to because it impacts their bottom line, at the cost of everyone affected by AI slop.

[1] https://eucrim.eu/news/rules-on-removing-terrorist-content-o...

[2] https://www.lto.de/recht/nachrichten/n/vizr6424-bgh-renate-k...

pjc50 · 1h ago
This is one of those cases where the need to "do something" is strong, but that doesn't excuse terrible implementations.

Especially at a time when the US is becoming increasingly authoritarian.

marcus_holmes · 1h ago
The EU has a different approach to this kind of regulation than the USA [0]. EU regulations are more about principles and outcomes, while US regulation is more about strict rules and compliance with procedures. The EU tends to only impose fines if the regulations are deliberately being ignored, while the US imposes fines for any non-compliance with the regs.

So while you can compare the two, it's not an apples-to-apples comparison. You need to squint a bit.

The DMCA has proven to be way too broad, but there's no appetite to change that because it's very useful for copyright holders, and only hurts small content producers/owners. This looks like it's heading the same way.

> This thing impacts gatekeepers, so not your small mom-and-pop startup but billion dollar companies.

I don't see any exemptions for small businesses, so how do you conclude this?

[0] https://www.grcworldforums.com/risk/bridging-global-business... mentions this but I couldn't find a better article specifically addressing the differences in approach.

johngladtj · 2h ago
None of which is acceptable

mschuster91 · 2h ago
That's the inevitable consequence of the big platforms not doing anything on their own to curb abuse, whether terrorism, defamation, fake news (some of it leading to riots and deadly lynch mobs), AI slop, you name it.

What did you expect governments to do in the face of rising public pressure and inaction?

The EU in particular had more than enough patience. I can't count how often individual countries and then the EU itself told Meta, Twitter, Google/YouTube et al. to clean up shop or else; they decided to ignore it or do less than the bare minimum in response... and now they cry as the EU has finally shown some fangs and the US is following suit (although, I'll admit, for entirely the wrong reasons).

When industries want to self-regulate, they can, but they actually have to make an effort, because when things go south, the regulation that results will inevitably be much harsher.

AnthonyMouse · 34m ago
The fallacy is in expecting corporations to play the role of the government.

Suppose someone posts a YouTube video that you claim is defamatory. How is Google supposed to know if it is or not? It could be entirely factual information that you're claiming is false because you don't want to be embarrassed by the truth. Google is not a reasonable forum for third parties to adjudicate legal disputes because they have no capacity to ascertain who is lying.

What the government is supposed to be doing in these cases is investigating crimes and bringing charges against the perpetrators. Only then do they have to incur the costs of investigating the things they want to pass laws against, and take the blame for charges brought against people who turn out to be innocent, etc.

So instead the politicians want to pass the buck and pretend that it's an outrage when corporations with neither the obligation nor the capacity to be the police predictably fail in the role that was never theirs.

johngladtj · 2h ago
Sorry, but your argument is even less acceptable

benchly · 1h ago
You need to expound on why as your replies are not only unacceptable but remarkably useless.

Try dialogue.

mschuster91 · 1h ago
Is it? Why should the big tech giants be exempted from the laws and regulations that apply to everyone else?

ricardobeat · 1h ago
We don’t punish telecoms, ISPs or the mail company for “facilitating terrorism”. Where do you draw the line?

These rules have serious consequences for privacy, potential for abuse, and also raise the barriers immensely for new companies to start up.

The problem is quite obvious when you consider that Trump supporters label anything they dislike as fake news, even when the facts are known and available to everyone. These rules would allow any opposition to be easily silenced. Restricting the measures to terrorism, illegal pornography, and other serious crimes would be more acceptable.

Your question is like asking “why don’t we have metal detectors and body scanners on every school and public building”. Just because you can, and it would absolutely increase safety, does not mean it’s a good idea.

IMO legislation should focus on how individuals can be made responsible, and prosecuted when they break the law – not mandating tech companies to become arms of a nanny state.

anon0502 · 1h ago
As I understand it, it proposes broad filters, so more content that should fall under "fair use" will now be taken down faster.

> not your small mom-and-pop startup

Not sure why you said this; it's the artists / content makers who suffer.

privatelypublic · 2h ago
Slippery slope. See how far we've fallen.

ls612 · 2h ago
Unfortunately this is the inevitable outcome of information and computation (and therefore control) becoming cheap. Liberal political systems can no longer survive in equilibrium. The 21st century will be a story either of ruling with an iron fist or being crushed beneath one :(

dathinab · 19m ago
There is a huge difference between having strict laws and enforcing them and "ruling with an iron fist".

The latter inherently implies using violence to suppress people.

speeder · 1m ago
That is always the same thing.

All laws that you want strictly enforced require violence. This is why people should ALWAYS remember when making laws: "Is this worth killing for?"

I remember some years ago on HN people discussing a guy who got killed because he bought a single fake cigarette. It goes like this:

You make a law where "x" is forbidden; the penalty is a simple fine. The person refuses to pay the fine. So you summon that person to court and threaten a bigger fine. The person ends up with a bigger fine and still refuses to pay anything. So you summon that person again and say they will go to jail if they don't pay. They again don't pay, AND flee from the police that went to get them. So the cops are in pursuit of the guy; say he is a good distance away from them. Then they have the following choice: let him go, and he won and broke the law successfully... or shoot him, and the law won, and he is dead.

This chain ALWAYS applies, because otherwise laws are useless. You can't enforce laws without the threat of killing people if they refuse all other punishments.

I don't know if the guy you were discussing with is right or not, whether the digital era results in the need for "ruling with the iron fist", but make no mistake: there is no "strict law enforcement" that doesn't involve killing people in the end.

Thus you always need to think when making laws: "Is this law worth making someone die because of it?"

rightbyte · 1h ago
Information is as expensive as always; it is just copying it that is cheap.

Xelbair · 1h ago
Is it? It was always easier and faster to spew bullshit than to refute it, and now we can automate it.

rootlocus · 1h ago
Information, disinformation, what's the difference?