• deafboy@lemmy.world · 10 months ago

    I don’t know much, but from what I know, we still haven’t reached the point of diminishing returns, so more power = more better.
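
    From what I’ve read, the scaling-law fits people keep citing (the Chinchilla-style form, roughly L(N, D) ≈ E + A/N^α + B/D^β, where N is parameter count and D is training tokens) still show loss falling smoothly and predictably as both grow, with no hard ceiling in sight.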

    • kadu@lemmy.world · 10 months ago

      There is a lot of theoretical work on this problem, but I’m in the camp that isn’t convinced large language models are the path towards general intelligence.

      Throw 10x the computing power at it and it might learn that a maths equation is reversible, because it will probably have seen enough examples of that. But it won’t learn what an equation represents, and therefore won’t extrapolate to new situations that could be solved by an equation.
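
      As a toy illustration: given enough rearrangement examples, a model can turn d = v · t into t = d / v. What it doesn’t get for free is the step before that, recognising that a question about how long a trip takes is asking for exactly that rearrangement.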

      • Kogasa@programming.dev · 10 months ago

        You can already ask ChatGPT to model a real-life scenario with a simple math equation, so it has at least a rough model of how basic math can be used to solve problems.
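
        Something like this minimal sketch (a hypothetical word problem, and sympy is just my pick for showing the idea, not anything specific to ChatGPT):

            # Word problem: "A car covers 120 km in 2 hours. What is its speed?"
            # Modelled as one simple equation, speed * time = distance, then solved.
            from sympy import symbols, Eq, solve

            speed = symbols("speed")        # unknown speed in km/h
            trip = Eq(speed * 2, 120)       # speed * 2 h = 120 km
            print(solve(trip, speed))       # -> [60], i.e. 60 km/h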

    • fidodo@lemmy.world · 10 months ago

      Not necessarily, since you also need better techniques. A competitor could easily surpass you with less compute by being smarter about how the AI is trained.