• dasenboy@lemm.ee · 1 day ago

    I use ChatGPT a lot; what is the best non-billionaire-funded LLM? I really need to switch to one that doesn’t worsen the world…

    • iheartneopets@lemm.ee · 13 hours ago

      That may be hard, seeing as all AIs use ungodly amounts of electricity. So I’d say they all worsen the world.

    • knatschus@discuss.tchncs.de · 1 day ago

      While DeepSeek is billionaire funded, it should still be better if run locally. I don’t think FOSS LLMs are at that level yet.

      • dasenboy@lemm.ee · 21 hours ago

        Thanks! Another person mentioned it too; I’m trying it now, and hopefully it suits my needs.

    • MajorSauce@sh.itjust.works · 1 day ago

      Try hosting DeepSeek R1 locally; for me the results are similar to ChatGPT’s, without needing to send any info over the internet.

      LM Studio is a good start.
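      For anyone curious what that looks like in practice: recent LM Studio versions ship an `lms` CLI and a local server that speaks an OpenAI-compatible API on port 1234 by default. A rough setup sketch — the model name here is just an example, swap in whatever you’ve downloaded:

```shell
# Start LM Studio's local OpenAI-compatible server (default port 1234).
lms server start

# Load a model you have already downloaded in LM Studio
# (example name, adjust to the one you actually have).
lms load deepseek-r1-distill-qwen-7b

# Query it like the OpenAI chat completions API, but fully local:
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "deepseek-r1-distill-qwen-7b",
        "messages": [{"role": "user", "content": "Hello!"}]
      }'
```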

        • shadow@lemmy.sdf.org · 11 hours ago

          Any relatively new gaming PC from the last, what, four years? It has enough power to run local LLMs. Maybe not the ginormous 70GB behemoth models, but the toned-down ones are pretty damn good, and if you don’t mind waiting a few seconds while it thinks, you can run one completely locally as much as you want, whenever you want.

        • MajorSauce@sh.itjust.works · 18 hours ago (edited)

          You would benefit from some GPU offloading, which would considerably speed up the answers. But at the bare minimum, you only need enough RAM to load the model.
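          To get a feel for what “enough RAM to load the model” means, here’s a back-of-envelope sketch: weights take parameter count × bits-per-weight, plus some runtime overhead. The overhead factor is a rough rule of thumb, not an exact figure for any specific model:

```python
# Back-of-envelope RAM estimate for loading a quantized LLM locally.
# These are rough rules of thumb, not exact numbers for any model.

def estimate_ram_gib(n_params_billion: float, bits_per_weight: int = 4,
                     overhead: float = 1.2) -> float:
    """Approximate RAM (GiB) needed to run a model.

    n_params_billion: parameter count in billions (e.g. 7 for a 7B model)
    bits_per_weight:  quantization level (16 = fp16, 4 = common Q4 quants)
    overhead:         fudge factor for KV cache and runtime buffers
    """
    weight_bytes = n_params_billion * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

for size in (7, 14, 70):
    print(f"{size}B model @ 4-bit: ~{estimate_ram_gib(size):.1f} GiB")
```

          So a 4-bit 7B model fits comfortably in ~4 GiB of RAM, while a 70B one wants roughly 40 GiB — which matches the “behemoth models” caveat above.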