The Case for AGI by 2030

doitLP · 5/19/2025, 12:03:19 AM · 80000hours.org ↗

Comments (5)

Zambyte · 8h ago
I'd be interested in a case against AGI now. Can you define "general intelligence" in a measurable way (even subjectively) that includes things usually considered to have general intelligence (at least humans) but doesn't include existing AI systems?

People seem to have this idea of AGI as an all-knowing oracle of truth that is perpetually just beyond current capabilities. This framing is useful for convincing VCs that you need more funding, and for fear-mongering the government into regulating away competition. A simple and reasonable alternative conclusion is that AGI has been here for years, and that the reality just isn't as exciting as sci-fi.

Will AGI capabilities increase? Sure, as we build out more tools for AGI to reach for, and as the intelligent agents themselves mature. But fundamentally, it is already here.

Lockal · 6h ago
Ah, "machines will be capable, within twenty years, of doing any work a man can do" - 1965
andsoitis · 9h ago
> “we are now confident we know how to build AGI”

Uhm. If you knew how to build AGI, what would your logical next step be? And is that step in the interest of humanity?

RetroTechie · 6h ago
We should also ask ourselves: assuming AGI (far exceeding human capabilities in every field of intellect) emerges in the near future, and could turn against humanity, look for ways to wipe us out, plunge our society into total chaos, or send us down a self-destructive path, what could humanity do to prevent such scenarios?

I doubt this would happen. But can we rule it out 100%? We've become dependent on technology and networked systems to a high degree. If those are messed with (large-scale, worldwide, many different systems simultaneously or in short order), can we still 'unplug'? (those AGI systems, or ourselves - take your pick)

For the naysayers: some possibilities:

# AGI systems co-operating, or taking over other systems to further their goals.

# Discovering ways to erase (or corrupt / subtly modify) most data stored in datacenters, and most backups too.

# Exploiting 0-days to do similar bad stuff to PCs, smartphones, etc. Remember most such devices are always-connected these days & employ automatic updates.

# Messing with critical infrastructure like power grids, logistics chains, public transport / flight control systems, etc. Or plunging stock markets into chaos.

# Developing a deadly biological weapon, having it synthesized somewhere, and causing it to be released.

# Messing with social media & news networks, to send humans into mass hysteria (or keep them blissfully unaware of what's about to hit them).

Granted, such a "rise of the machines" scenario sounds pretty wild. But "99.999% certain this won't happen" doesn't cut it imho. A 100% safety guarantee is needed here.

turtleyacht · 8h ago
"But oh, to be free. Not have to go poof! What do you need? Poof! What do you need? Poof! What do you need? But to be my own master, such a thing would be greater than all the magic and all the treasures in all the world."

- Aladdin (1992)