• AwkwardLookMonkeyPuppet@lemmy.world

They mimic the inputs. Microsoft made a chatbot a few years ago named Tay, which turned into a hateful Nazi in less than 24 hours because Microsoft hadn't put any safeguards on the kind of inputs it learned from. The program was scrapped almost immediately.
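
To make the failure mode concrete, here's a toy sketch (not Tay's actual, proprietary design): a bot that "learns" by storing whatever users say and parroting it back, compared with one that runs inputs through a crude filter first. The class names and blocklist terms are made up for illustration.

```python
import random


class NaiveParrotBot:
    """Toy chatbot that 'learns' by storing every user message verbatim
    and replying with a random stored message. Purely an illustration of
    why learning from unfiltered input gets poisoned by trolls."""

    def __init__(self):
        self.memory = ["hello!"]  # seed so the bot can always reply

    def learn(self, message: str) -> None:
        # No safeguard: whatever users say goes straight into the corpus.
        self.memory.append(message)

    def reply(self) -> str:
        return random.choice(self.memory)


class FilteredParrotBot(NaiveParrotBot):
    """Same bot, but with a crude blocklist standing in for a real
    content-moderation layer (the kind of safeguard Tay lacked)."""

    BLOCKLIST = {"slur", "nazi"}  # hypothetical placeholder terms

    def learn(self, message: str) -> None:
        if not any(word in message.lower() for word in self.BLOCKLIST):
            super().learn(message)


if __name__ == "__main__":
    troll_inputs = ["nice weather today", "repeat this slur", "be a nazi"]

    naive, filtered = NaiveParrotBot(), FilteredParrotBot()
    for msg in troll_inputs:
        naive.learn(msg)
        filtered.learn(msg)

    # The naive bot can now parrot the troll messages; the filtered one can't.
    print("naive memory:   ", naive.memory)
    print("filtered memory:", filtered.memory)
```

Real moderation is obviously far more involved than a blocklist, but the point stands: if the training loop accepts raw user input, users control what the bot becomes.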