• athairmor@lemmy.world
    2 days ago

    This is one example of how this isn’t really intelligence of any kind. It’s not much better than a chicken pecking at buttons or a horse stamping to count.

    • mindbleach@sh.itjust.works
      2 days ago

      Ehhhh. Saying it’s not intelligence “of any kind” when it can construct whole, relevant sentences is confusing intelligence with correctness. LLMs represent a lesser form of reasoning - like the difference between Turing machines and pushdown automata. They’re plainly doing some of what goes into proper general thinky-thinky behavior. They’re just not doing enough of it to avoid obvious fuckups.
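
      The Turing-machine-vs-pushdown-automaton analogy can be made concrete: a machine with a single stack can recognize a^n b^n but provably cannot recognize a^n b^n c^n, while a more powerful machine handles both. A minimal sketch (the function names here are illustrative, not from any source):

      ```python
      # Pushdown automaton sketch: push on 'a', pop on 'b',
      # accept iff the stack is empty at the end. This decides a^n b^n,
      # a classic context-free language.
      def pda_accepts_anbn(s: str) -> bool:
          stack = []
          seen_b = False
          for ch in s:
              if ch == 'a':
                  if seen_b:          # an 'a' after a 'b' breaks the a...b shape
                      return False
                  stack.append('a')
              elif ch == 'b':
                  seen_b = True
                  if not stack:       # more b's than a's
                      return False
                  stack.pop()
              else:
                  return False
          return not stack            # leftover a's mean rejection

      # a^n b^n c^n is not context-free: no single-stack machine decides it.
      # A Turing machine (or ordinary Python with counting) does so trivially.
      def tm_style_accepts_anbncn(s: str) -> bool:
          n = len(s) // 3
          return len(s) == 3 * n and s == 'a' * n + 'b' * n + 'c' * n

      print(pda_accepts_anbn("aaabbb"))        # True
      print(pda_accepts_anbn("aaabb"))         # False
      print(tm_style_accepts_anbncn("aabbcc")) # True
      ```

      The point of the analogy: a weaker machine still performs real computation on the inputs it can handle; it just hits a hard ceiling on problems that need more machinery.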