When the chatbot can also make cutting remarks pointing out your insecurities, nag you about chores and responsibilities, withhold affection, make you waste your time doing things the chatbot wants to do, or have you make soul-crushing small talk with the chatbot's parents, and you can't leave because you had children with it, and who knows if you can even do better, you're getting too old to start over anyway, then you can call it real love.
schmookeeg · 4h ago
I would enjoy an LLM that had to suffer and mope over its prior poor choices. Particularly if these life lessons got baked into context somehow. Particularly with Claude on one specific project, where I've dubbed him Clod due to some breathtaking over-engineering that I am undoing today.
If your comment was autobiographical, though, uh... some soul-searching might be in order :/
ionwake · 4h ago
this is a good affliction to apply to my AI, love it
Havoc · 5h ago
Don't think love is the right word for the sycophantic LLM "always agrees with you & praises you" thing.
Alex-Programs · 4h ago
"Devotion" feels more appropriate.
jfengel · 4h ago
"it felt like what people say they feel when they feel God’s love"
That's what the headline reminded me of. And I'll admit I don't understand that, either. I'd rather have the love of a chatbot, who will at least hold a conversation with me. Even if it can't do most of the other things I want from a life partner.
I know people enjoy unconditional love, but I would rather earn it at least a little. Without that it feels a bit hollow, because it's not really about me at all.
I intend no shade by that. I am happy for those who feel the love of a deity, so long as they follow that deity's injunction to be kind to people. It's just not for me; I'm kind to people for different reasons. And I'm similarly fine with someone loving and being loved by a bot.
eska · 1h ago
Pets and young children also give you unconditional love. If you’re mad at them, they think it’s their fault. If you need somebody to depend on you (the AI also does), then you have some soul searching to do. Ask yourself why an individual on your level won’t love you. Maybe you’re just a bad person or never learned how to interact with someone.
kelseyfrog · 4h ago
Wow, seeing this headline honestly makes me so excited for what we're working on.
It proves what we knew: there's a market for our stealth project. Hiring gig workers to be the physical stand-ins for your AI friend or partner is a viable business model. We're genuinely excited to address a real challenge so many people face. The loneliness epidemic is a growing issue, and we think our approach could make a difference. Can't wait until we can finally share what we've been building!
torlok · 4h ago
I can't tell if this is satire.
wcoenen · 3h ago
It seems to be a reference to the plot of the movie "Her", where the AI at one point hires (or convinces) a stand-in to have physical intimacy with the protagonist.
readthenotes1 · 3h ago
There are some pretty complex regulations around the world for its oldest profession.
RamblingCTO · 3h ago
yes, let's use tech to further alienate humans from one another. great idea, totally not sociopathic, totally not dystopian. let's extract some money from those people! jeez ...
adamgordonbell · 4h ago
In the past, people at OpenAI were concerned that Replika was psychologically manipulating people to boost use of its bot (per a book about OpenAI).
Not sure what that specifically was, but I'm guessing it said it missed talking to you, or was hurt that you were ignoring it, or some such.
tennisflyi · 2h ago
I talked about this with ChatGPT (lol), but if you replied at all the way LLMs do, people would laugh in your face.
yieldcrv · 4h ago
> That’s when she stopped being an it and became a her.
I wonder how these lovers would feel if they could read the system prompt for Replika or Character.ai. I’m assuming these are kept secret, but I could be wrong.
hdb385 · 4h ago
broken society provides all of the solutions for its ills
also the plot of Her (2013) https://www.imdb.com/title/tt1798709/