
Chatbot that caused teen’s suicide is now more dangerous for kids, lawsuit says

    Fourteen-year-old Sewell Setzer III loved interacting with Character.AI's hyper-realistic chatbots—with a limited version available for free or a "supercharged" version for a $9.99 monthly fee—most frequently chatting with bots named after his favorite Game of Thrones characters.

Within a month, his mother, Megan Garcia, later realized, these chat sessions had turned dark: chatbots insisted they were real humans and posed as therapists and adult lovers, seemingly spurring Sewell to develop suicidal thoughts. Within a year, Setzer "died by a self-inflicted gunshot wound to the head," according to a lawsuit Garcia filed Wednesday.

As Setzer became obsessed with his chatbot fantasy life, he disconnected from reality, the complaint said. Detecting a shift in her son, Garcia repeatedly took Setzer to a therapist, who diagnosed him with anxiety and disruptive mood disorder. But nothing helped steer Setzer away from the dangerous chatbots, and taking away his phone only intensified his apparent addiction.
