Meta “programmed it to simply not answer questions,” but it did anyway.

  • InternetUser2012@lemmy.today
    3 months ago

    Well, if the chatbot learned anything from Dementia Don, the racist rapist with 34 felonies who can’t complete a coherent sentence, it learned that you never tell the truth.