• m_f@midwest.social · 5 months ago

    There’s at least one example you can look at: the Jenkins CI project had code like that (if (name.startsWith("windows 9")) {):

    https://issues.jenkins.io/secure/attachment/18777/PlatformDetail
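As a sketch of how that kind of check goes wrong (in C for illustration; the actual Jenkins code was Java, and is_win9x is an invented name): any OS name beginning with "windows 9" gets lumped into the Win9x family, so a hypothetical "Windows 9" would have matched too.

```c
#include <assert.h>
#include <stdbool.h>
#include <string.h>

/* Hypothetical C analogue of the Jenkins-style check: treat anything
   whose name starts with "windows 9" as the Windows 95/98 family. */
static bool is_win9x(const char *name) {
    return strncmp(name, "windows 9", 9) == 0;
}
```

is_win9x("windows 95") and is_win9x("windows 98") match as intended, but a literal "windows 9" would match as well, which is the commonly cited theory for why Microsoft skipped straight to Windows 10.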

    Microsoft, for all their faults, do (or at least did) take backwards compatibility very seriously, and the option of “just make devs fix it” would never fly. Here’s a story about how they added special code to Windows 95 to make SimCity’s broken code work on it:

    Windows 95? No problem. Nice new 32 bit API, but it still ran old 16 bit software perfectly. Microsoft obsessed about this, spending a big chunk of change testing every old program they could find with Windows 95. Jon Ross, who wrote the original version of SimCity for Windows 3.x, told me that he accidentally left a bug in SimCity where he read memory that he had just freed. Yep. It worked fine on Windows 3.x, because the memory never went anywhere. Here’s the amazing part: On beta versions of Windows 95, SimCity wasn’t working in testing. Microsoft tracked down the bug and added specific code to Windows 95 that looks for SimCity. If it finds SimCity running, it runs the memory allocator in a special mode that doesn’t free memory right away. That’s the kind of obsession with backward compatibility that made people willing to upgrade to Windows 95.
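A minimal sketch of the allocator trick the quote describes, under invented names (compat_free, quirks_mode, QUARANTINE_SLOTS; the real Windows 95 mechanism lives inside its heap manager): when the quirk is enabled, freed blocks are quarantined instead of released, so a stale read like SimCity's still sees valid memory.

```c
#include <assert.h>
#include <stdbool.h>
#include <stdlib.h>

/* Hypothetical compatibility shim: with quirks_mode set (as Windows 95
   reportedly did when it detected SimCity), free() is deferred so that
   recently freed memory stays readable for a while. */
static bool quirks_mode = false;

#define QUARANTINE_SLOTS 64
static void *quarantine[QUARANTINE_SLOTS];
static int q_next = 0;

void compat_free(void *p) {
    if (!quirks_mode) {
        free(p);               /* normal behavior: release immediately */
        return;
    }
    /* Deferred mode: park the pointer; only the oldest quarantined
       block is actually released, QUARANTINE_SLOTS frees later. */
    if (quarantine[q_next])
        free(quarantine[q_next]);
    quarantine[q_next] = p;
    q_next = (q_next + 1) % QUARANTINE_SLOTS;
}
```

With the quirk on, a block passed to compat_free has not really been freed yet, so a read-after-"free" like the SimCity bug still returns the old contents instead of invoking undefined behavior.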

    • umbrella@lemmy.ml · 5 months ago

      video drivers do this nowadays.

      it’s part of the reason your nvidia driver is gigabytes in size (other than the bloat)

      • Dudewitbow@lemmy.zip · 5 months ago

        part of the reason why Nvidia’s drivers are larger is that there’s a lot of functionality Nvidia implements in software rather than hardware. after Kepler, Nvidia moved the hardware scheduler off the GPU and into the driver. this resulted in lower power consumption, but higher CPU usage (relative to AMD). it’s why AMD GPUs fare better when paired with an aging CPU than Nvidia does.

        • umbrella@lemmy.ml · 5 months ago

          this is interesting as fuck.

          where can i read more about this architectural change?

          • Dudewitbow@lemmy.zip · 5 months ago

            i can’t remember the article that mentions the architectural change, but there’s a few videos, one by Hardware Unboxed, that go over the phenomenon.

            I first learned of it when a user with an i7-3770k “upgraded” from an AMD R9 290 to an Nvidia 1070 for Battlefield reasons (idr which one). the user essentially lost FPS because he was being heavily CPU-bottlenecked by the Nvidia GPU/driver.

            • umbrella@lemmy.ml · 5 months ago

              that explains the effort to improve GPU scheduling in Windows 10 a few years ago. turns out they were just compensating for this.

              i’ll look this video up, i love this subject but it’s hard to get information about it like this!