“If someone who was alone and in a bad mental place, potentially considering self-harm, had read something like that, it could really put them over the edge,” she worried.
In response to the incident, Google told CBS News that large language models "can sometimes respond with non-sensical responses," adding: "This response violated our policies and we've taken action to prevent similar outputs from occurring."