Lugh@futurology.today to Futurology@futurology.today · 7 months ago

Evidence is growing that LLMs will never be the route to AGI. They are consuming exponentially increasing energy, to deliver only linear improvements in performance. (arxiv.org)
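A rough way to picture the headline claim (an illustrative assumption on my part, not a result taken from the linked paper): if benchmark performance grows roughly logarithmically with the energy spent, then each fixed bump in performance costs a multiplicative increase in energy.

```python
import math

# Illustrative assumption (not from the paper): performance grows
# logarithmically with energy, score = a * log10(energy) + b.
a, b = 10.0, 0.0

def performance(energy_kwh: float) -> float:
    return a * math.log10(energy_kwh) + b

# Each 10x increase in energy buys the same fixed score bump,
# i.e. exponentially increasing energy for only linear improvement.
for energy in [1e3, 1e4, 1e5, 1e6]:
    print(f"{energy:>10.0f} kWh -> score {performance(energy):5.1f}")
```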
conciselyverbose@sh.itjust.works · 7 months ago:
A combination of unique, varied parts is a complex algorithm.
A bunch of the same part repeated is a complex model.
Model complexity is not in any way similar to algorithmic complexity. They’re only described using the same word because language is abstract.
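One way to picture the distinction (my own sketch, not the commenter's code or any real system): an algorithm's complexity comes from many structurally different, hand-designed steps, while a large model's "complexity" is mostly one block instantiated over and over, differing only in its learned parameters.

```python
# Hypothetical sketch of the distinction; names and numbers are made up.

def complex_algorithm(data):
    # "Unique, varied parts": each step does something structurally different.
    data = sorted(data)
    data = [x for x in data if x >= 0]
    total = sum(data)
    return total / len(data) if data else 0.0

def complex_model(x, layers):
    # "The same part repeated": one block applied over and over,
    # differing only in its parameter (here, one weight per layer).
    for weight in layers:
        x = max(0.0, weight * x)  # same ReLU-style block every time
    return x

print(complex_algorithm([3, -1, 4, 1, 5]))
print(complex_model(1.0, layers=[0.9, 1.1, 0.8, 1.2] * 12))  # 48 identical layers
```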