I'm probably not the only one thinking this, but I'm actually more worried about the alignment of who controls the power – the folks running things, the military-industrial complex – than about AGI itself. Right now, the bigger threat feels like other people using AI against us, not some rogue AGI suddenly taking over.
It’s not just paranoia either – there’s a lot of research into things like autonomous war drones (basically AI robots) designed for a very specific job. The reason I think this matters is that once AI/AGI starts replacing most jobs, we lose the thing that gives us leverage – our value as workers. That's not necessarily my opinion, but what I suspect their opinion is – the people with the power.
feraloink · 2d ago
At first I thought, same-old same-old, but I changed my mind halfway through. I like how the author, Elke, identifies the underlying similarities between e/acc and the AGI doomers. (By AGI doomers, I'm referring to many Less Wrong folks, NOT earnest people who worry about whether AI will cause 20% unemployment or if it is hype and they're being played).
I think pmarca may have had a change of heart about being on the e/acc side, but he certainly is associated with funding lots of AI startups, so that may be more relevant.
Also, after the SBF/FTX fiasco and its association with effective altruism, I'm not sure EA should be linked to the doomers as strongly as Elke does. This, and my vague recall of what Andreessen says on Twitter, are minor points. Elke wrote an insightful, clever post about AGI (and even AI without attaining AGI) as a form of eschatology.
P.S. My next stop will be to look for the 1999 book she mentioned for an earlier view of tech eschatology.