> But here's the truly unsettling part: even the optimists are getting nervous. Geoffrey Hinton, the "Godfather of AI," revised his estimate of the risk that AI causes human extinction within the next three decades from basically zero to 10-20%. When the guy who invented the technology starts sounding like a doomsday prepper, maybe it's time to pay attention.
FrankWilhoit · 7h ago
The risk of human extinction does not come from AI but from human devolution, which is also why we (some of us ;) overestimate AI. It is the technology that is sufficiently advanced to seem like magic; it is our threshold for magic that is going down, as we go down.
Anyone who actually thinks that genAI poses such a risk, but continues to work on it anyway, is unambiguously doing evil things.