I'm wondering how these changes in behavior compare to the "video games don't cause actual violence" result, seeing as the output of games is far more lifelike than a mere chat log.
Maybe it's because fictional media up to this point has never been deeply personalized to the level AI permits. A story where you play as a ruthless gangster that millions of other people also play as is divorced from one's personal affairs through the boundary of made-up characters. Now imagine a rival gangster bangs on the door and one realizes it's an AI-generated likeness of their actual long-incarcerated brother. That would be more invasive and affecting to the psyche than anything an AAA studio could come up with in a vacuum.
When it gets personal, people have more reason to get up and take action. See: the Baby Reindeer lawsuit, the taboo around real person fanfiction, etc. AI has achieved never-before-seen levels of "getting personal" with the player in interactive media.
Black Mirror was prescient in this regard with the episode "Playtest" exploring the nature of AI-personalized horror.
Another thing I'd guess: if a game studio published a work of fiction that encouraged the player to commit suicide, the studio would be sued out of existence. So game studios have been incentivized throughout history to never publish games taking positions like that on such topics if they wanted any chance to keep selling copies. It was never that video games as a medium were incapable of inducing suicide or psychosis; rather, the nature of game publishing and being an artist meant no game developer who wanted to keep their reputation was going to be such a sociopathic asshole as to write a game framing suicide as a positive thing, much less a game singling out one specific person with fine-tuned personal anecdotes twisting the knife in every possible emotional vulnerability. AI is able to bypass that unspoken social contract with enough personalized context.