• Rustmilian@lemmy.world · 1 year ago

      Good luck with that. FOSS is transparency on a source code level, there’s no obscurity they can hide their back door behind.

        • Rustmilian@lemmy.world · 1 year ago

          You realize nobody would know about this in the first place if it were proprietary, right?
          FOSS allows for whistleblowers, scrutiny, and audits. Proprietary ‘security via obscurity’ does not.

          • sir_reginald@lemmy.world · 1 year ago

            I’m perfectly aware of all that. But cryptography is an extremely complicated discipline: even the most experienced mathematicians have a hard time designing and scrutinizing an algorithm, so they rely heavily on peer review. If a major institution like NIST is influenced by the NSA, the NSA has a much better chance of compromising algorithms, if that is its intention.

            • Rustmilian@lemmy.world · 1 year ago

              You’d be surprised what the worldwide collective of cryptographers is capable of when they’re able to scrutinize a project in the first place. Which would you prefer: a closed, unscrutinizable encryption algorithm, or one that’s entirely open from the ground up?
              NIST could do damage if they’re biased, but it’s not like people aren’t keeping a close eye on them and scrutinizing as many mistakes as possible. Especially for an algorithm as globally important as PQC.

              • sir_reginald@lemmy.world · 1 year ago

                I’m totally against anything proprietary. That’s the first requirement for anything I use. And I’m not advocating for proprietary algorithms at all; that would very much be the demise of encryption.

                I’m just worried that a sufficiently influential actor (say, a government) could theoretically bribe these institutions into promoting weaker encryption standards. I’m not even saying they are trying to introduce backdoors, just that, as the article suggests, they might bias organizations toward supporting weaker algorithms.

                AES-128 is still considered secure in public institutions, even though modern computers can do much stronger encryption without being noticeably slower.
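                For a sense of scale on the key-size point, here is a rough back-of-the-envelope sketch in Python. The guess rate is an assumed, deliberately generous figure, not a benchmark of any real hardware:

```python
# Rough brute-force cost comparison for AES key sizes.
# The guess rate below is an assumed, wildly optimistic figure
# (10^18 keys per second), not a measurement of any real machine.

def brute_force_years(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to search half the keyspace at the given rate."""
    keyspace = 2 ** key_bits
    seconds = (keyspace / 2) / guesses_per_second
    return seconds / (365 * 24 * 3600)

RATE = 1e18  # assumed guess rate

for bits in (128, 192, 256):
    print(f"AES-{bits}: ~{brute_force_years(bits, RATE):.2e} years")
```

                Even at that assumed rate, exhausting half of the AES-128 keyspace takes on the order of 10^12 years; the point above is that moving to a larger key costs the defender almost nothing while multiplying the attacker’s work astronomically.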

                • Rustmilian@lemmy.world · 1 year ago

                  A huge number of organizations are already biased and using weaker algorithms… They just do so under the obscurity of proprietary software, so it’s much harder to scrutinize them.

    • They did this before with elliptic-curve cryptography (the Dual_EC_DRBG random-number generator), and we knew it had this problem before it was implemented as a standard.

      So if the NSA offers a standard, don’t trust it and include in your encryption software the option to use something different.
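      The “option to use something different” is usually called algorithm agility. A minimal Python sketch of the idea follows; the registry and function names are hypothetical, and the XOR “cipher” is a toy placeholder, not real encryption:

```python
# Sketch of algorithm agility: ciphers are looked up by name at runtime,
# so a distrusted standard can be swapped out by configuration alone.
# All names here are hypothetical; the XOR "cipher" is a toy placeholder.
from typing import Callable, Dict

Cipher = Callable[[bytes, bytes], bytes]
CIPHERS: Dict[str, Cipher] = {}

def register(name: str):
    """Decorator that adds a cipher function to the registry."""
    def deco(fn: Cipher) -> Cipher:
        CIPHERS[name] = fn
        return fn
    return deco

@register("xor-demo")  # toy placeholder only, NOT a real cipher
def xor_cipher(key: bytes, data: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt(algorithm: str, key: bytes, data: bytes) -> bytes:
    if algorithm not in CIPHERS:
        raise ValueError(f"unknown algorithm: {algorithm!r}")
    return CIPHERS[algorithm](key, data)

# Swapping algorithms is then a one-string configuration change.
ciphertext = encrypt("xor-demo", b"key", b"hello")
```

      Real software would register vetted primitives (e.g. from an audited library) instead of the toy above, but the structure is the same: nothing in the calling code hard-wires one standard.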

      • kraniax · 1 year ago

        do you happen to have any links? I’d like to read more about this.

        • Uriel238 [all pronouns]@lemmy.blahaj.zone · 1 year ago

          It was a big deal in the early 2010s, so it’s easy to web search. Techdirt had a lot of posts on it, so you might be able to search them for keywords like elliptic, encryption, NSA, NIST, etc.

          Also around the same time, the NSA was wooing penetration testers to sell it zero-day vulnerabilities rather than reporting them to the appropriate public forums or software developers. In that era, large companies liked to sue white hats for CFAA violations rather than pay the bounties for discovered vulnerabilities, deflecting said hats toward gray- and black-hat activities. Some would sell these vulnerabilities to other, non-NSA interests, leading to ransomware epidemics and other fun hacker shenanigans.

          It’s a good time to be a hacker without scruples, especially since the NSA is continuing its surveillance efforts rather than securing communications of the free world. (The latter is – was? – the mission of the NSA in the 20th century.)