I think this is terrible, but I suppose it's less bad if ChatGPT didn't get the phone number from the bank's files/data.
If ChatGPT just provided it from public info/the training set, am I wrong to think that isn’t as bad?
Just for clarity, I think it’s not a good idea to use LLMs if you care whether the answer is right or wrong. So this is a terrible use case.
legacynl · 2h ago
Wow this is terrible
femto · 2h ago
Good luck getting any recompense. CBA disclosed your phone number. I'm aware of a company that disclosed 3000 high-resolution colour passport scans, along with all personal details, from a travel booking website. About half of the records were for school children. No one was notified that their data was leaked. Diddly squat happened to that company.