Currently, AI models are trained on GPUs, but in the future generative AI will probably require its own specialized ASICs to achieve the best performance. The same shift happened with Bitcoin mining a few years ago, and it's also why big tech companies are making their own CPUs now.

Since only a few companies on the planet are capable of producing these chips in bulk, the government could easily place restrictions on the purchase of AI hardware. This would control who has access to the best AI.

Only the government and a few permitted parties would have access to the best AI. Everyone else would use weaker AI that, while still good enough for most people, produces output the government could detect. For example, the government could use its superior models to reliably detect whether a post is AI-generated and provide that insight as a service to citizens.
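
To sketch what that detection service might look like (just an illustration, not a real system: the model choice, threshold, and function names here are my own assumptions), one common approach is to score text with a reference language model and treat unusually low perplexity as a hint that it was machine-generated:

```python
# Hypothetical sketch only: flag likely AI-generated text by scoring it with a
# reference language model. Low perplexity under the reference model is one
# (imperfect) signal that the text was machine-generated.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    # Average negative log-likelihood of the text under the reference model,
    # exponentiated to give perplexity.
    enc = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(enc.input_ids, labels=enc.input_ids)
    return float(torch.exp(out.loss))

def looks_ai_generated(text: str, threshold: float = 25.0) -> bool:
    # The threshold is arbitrary here; a real service would calibrate it
    # against known human-written and machine-written samples.
    return perplexity(text) < threshold

print(looks_ai_generated("The quick brown fox jumps over the lazy dog."))
```

The argument above assumes the government's reference model is much stronger than anything the public can run; in practice this kind of detection is known to be unreliable, which is exactly why access to the best models would matter.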

Effectively, the government would become the sole purveyor of truth, rather than that power resting with whoever can afford the biggest computer.

  • nodsocket@lemmy.world (OP) · 10 months ago

    Would you rather let anyone with the money buy a nuke, or only let governments have them? At least this way there are fewer psychopaths to worry about.

    • peto (he/him)@lemm.ee · 10 months ago

      Yeah, totally the same thing. Utterly comparable; you clearly fully understand what it is capable of and the risks it poses.

      I also respect your knowledge of nuclear weapons and the reasons why every billionaire doesn’t have a home defence warhead.

    • gzrrt@kbin.social · 10 months ago

      I’d say LLMs are pretty comparable to an operating system (i.e., something anyone can buy, use and develop without any outside interference) and not comparable at all to nuclear weapons.