I appreciate Lanier’s reminder that AI is ultimately a collective artifact, not an independent mind. Calling it "large language social collaboration" might be clunky, but the point is fair: what we label "AI" is just our accumulated digital patterns, recombined.
I do wonder, though. At what point does the remixing and generalization become more than just a mirror of us? Even if it is still trained on human work, its behavior can surprise us in ways that feel alien.
Curious if others here agree: is the term "AI" just marketing hype, or is it still useful shorthand for systems whose emergent properties we do not fully control?
cratermoon · 4h ago
My objection to calling it collaboration is that the contributors were not given a choice of whether or not to participate.
To collaborate is to voluntarily join with others and contribute willingly.
The current crop of LLMs was created through large-scale appropriation of other people's work without their consent.
squircle · 3h ago
This is a poignant observation. What if there was real tender love and care put into gathering these training materials? (Y'know, truly free and open source, with a deep understanding of what it means to be human.) If you are working on this, we should talk.