This is one more example of how this isn’t really intelligence of any kind. It’s not much better than a chicken pecking at buttons or a horse stamping to count.
Ehhhh. Saying it’s not intelligence “of any kind,” when it can construct whole relevant sentences, is mistaking intelligence for correctness. LLMs do a lesser form of reasoning - like a pushdown automaton next to a Turing machine. They’re plainly doing some of what goes into proper general thinky-thinky behavior. They’re just not doing enough of it to avoid obvious fuckups.
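To make that analogy concrete, here’s a toy Python sketch (the function names and code are illustrative, not from anyone’s actual argument): a pushdown automaton only has a stack, so it can recognize aⁿbⁿ but provably cannot recognize aⁿbⁿcⁿ, which anything Turing-complete handles trivially. Weaker machine, still genuinely computing - that’s the claimed shape of the LLM situation.

```python
# Toy illustration of the PDA-vs-Turing-machine analogy (names are mine):
# one stack is enough for a^n b^n, but no PDA can recognize a^n b^n c^n.

def pda_accepts_anbn(s: str) -> bool:
    """Stack-only recognizer for a^n b^n, n >= 0 (PDA-expressible)."""
    stack = []
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # an 'a' after a 'b' is out of order
                return False
            stack.append("a")   # push one marker per 'a'
        elif ch == "b":
            seen_b = True
            if not stack:       # more b's than a's
                return False
            stack.pop()         # pop one marker per 'b'
        else:
            return False
    return not stack            # accept iff the counts matched

def tm_accepts_anbncn(s: str) -> bool:
    """Recognizer for a^n b^n c^n - beyond any PDA, easy for a TM."""
    n = len(s) // 3
    return len(s) == 3 * n and s == "a" * n + "b" * n + "c" * n

assert pda_accepts_anbn("aabb") and not pda_accepts_anbn("aab")
assert tm_accepts_anbncn("aabbcc") and not tm_accepts_anbncn("aabbc")
```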