From the article:

This chatbot experiment reveals that, contrary to popular belief, many conspiracy thinkers aren’t ‘too far gone’ to reconsider their convictions and change their minds.

  • SpaceNoodle@lemmy.world · ↑8 ↓3 · 2 days ago

    If they’re gullible enough to be suckered into it, they can similarly be suckered out of it - but clearly the effect would not be permanent.

    • Whopraysforthedevil@midwest.social · ↑1 · 27 minutes ago

      I’ve always believed the adage that you can’t logic someone out of a position they didn’t logic themselves into. It protects my peace.

    • Zexks@lemmy.world · ↑3 ↓1 · 2 days ago

      That doesn’t square with the “if you didn’t reason your way into a belief, you can’t reason your way out of it” line. Considering religious fervor, I’m more inclined to believe that line than yours.

      • Azzu@lemm.ee · ↑5 · 1 day ago

        No one said that the AI used “reason” to talk people out of a conspiracy theory. In fact, I would assume that’s incredibly unlikely, since AI in general is not reasonable.