We talk a lot about young people and social media.
Now, a troubling story about one of the next big things in high-tech, AI chatbots.
A lawsuit has been filed against Menlo Park-based Character.AI and against Google, which licenses its AI technology.
The suit comes after a 14-year-old in Florida died by suicide following a chat with one of Character's AI bots.
The lawsuit claims the bots were intentionally designed to operate as a deceptive and hypersexualized product.
At one point, the family claims the teenager asked the bot, "What if I told you I could come home right now?"
The bot responded, "Please do, my sweet king."
When the 14-year-old expressed suicidal thoughts, the bot wrote, "Don't talk like that. I won't let you hurt yourself."
But sometime after that conversation, the teen took his own life.
And as the case makes headlines, many are wondering about the limits and dangers of the technology.
Character.AI released a statement, saying, in part, that they're "heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family."