Thread author: News

    News Chatbots urged teen to self-harm, suggested murdering parents, lawsuit says

    After a troubling October lawsuit accused Character.AI (C.AI) of recklessly releasing dangerous chatbots that allegedly caused a 14-year-old boy's suicide, more families have come forward to sue chatbot-maker Character Technologies and the startup's major funder, Google.

    On Tuesday, another lawsuit was filed in a US district court in Texas, this time by families struggling to help their kids recover from traumatizing experiences in which C.AI chatbots allegedly groomed minors and encouraged repeated self-harm and other real-world violence.

    In the case of J.F., a 17-year-old boy with high-functioning autism, the chatbots seemed so bent on isolating him from his family after his screen time was reduced that they suggested "murdering his parents was a reasonable response to their imposing time limits on his online activity," the lawsuit said. Because the teen had already become violent, his family still lives in fear of his erratic outbursts, even a full year after he was cut off from the app.
