Ask HN: Even with AGI, it wouldn't know what you know. Can we preserve that?
consumer451 · 1 point · 4 comments · 7/23/2025, 11:07:36 PM
Even if we had true AGI today, and even if that AGI were offered as a product with top 0.001% human-level intelligence, that product would still not have your experiences and knowledge of a given project or domain. It could not replace you.
That AGI didn't experience what you have experienced. How could it without recording all of your audio and visual input?
Clearly, I am not the first person to think of this concept. My question: is it important for us to fight the inevitable attempt at capture of all our audio/visual existence? Or, is this a Luddite concept in 2025?
Is it some elaboration on "It's useless, humans are better, therefore stop trying to collect data"?
If it is, it is more of an assertion than a question. Those are rather common, and there isn't much to say about them.
If it's not, you should express your doubts better. Explain in detail where you're coming from.
My original thought here was: chill out, everyone! Even if we had AGI, even ASI, today... neither of those systems could know everything that you have seen, heard, dreamed, or thought. They cannot truly replace you... unless all audio, video, and eventually neural signals in your lifetime are recorded. This is really important, as all of those things influence the decisions that you make. This is what makes you unique and important.
Then the next thought was... is that something worth fighting against? Or, should I welcome my individually trained technological replacement?
When I was talking about this with a co-worker earlier today, the thought occurred: this is not only the final frontier of privacy, but of individuality.
If all of your work and knowledge is exchanged via text like Slack or email, or via transcribed online meetings, then capturing it seems relatively easy in the long term. That problem is not that hard, conceptually. However, the system would still not know some anecdote that you heard over beers from someone else, which is how multiple AI concepts were discovered.
If we extend that idea beyond work to offloading all of our thinking, a virtual you, it would require A/V data capture from birth, right?
[0] The reason this came to mind is that today I presented a new LLM-enabled feature in my SaaS product to my beta users. One user quickly and unironically responded, "This is awesome! I don't have to use my brain for this anymore!" That scared the crap out of me, and my post here is the generalization of this idea.