In a 1938 article, MIT's president, Karl Compton, argued that technical progress didn't mean fewer jobs. He's still right.

Compton drew a sharp distinction between the consequences of technological progress on “industry as a whole” and the effects, often painful, on individuals.

For “industry as a whole,” he concluded, “technological unemployment is a myth.” That’s because, he argued, technology “has created so many new industries” and has expanded the market for many items by “lowering the cost of production to make a price within reach of large masses of purchasers.” In short, technological advances had created more jobs overall. The argument—and the question of whether it is still true—remains pertinent in the age of AI.

Then Compton abruptly switched perspectives, acknowledging that for some workers and communities, “technological unemployment may be a very serious social problem, as in a town whose mill has had to shut down, or in a craft which has been superseded by a new art.”

  • BurnSquirrel@lemmy.world · 5 months ago

    From what I’ve noticed working in tech, there is pressure to be on the cutting edge at all costs, and a lot of stuff gets overhyped to sell things to MBAs. I’ve seen a few disruptive technologies come in. They are almost never wrong about what the thing is or will be, but they are almost always wrong about the timeline on which it matures.

    • AnarchistArtificer@slrpnk.net · 5 months ago

      Are there any technologies that ended up being disruptive, but at the time, you thought “wow, what a load of hot air”? What about the opposite?

      • BurnSquirrel@lemmy.world · 5 months ago

        For a fizzle, it’s a little deep in the weeds, but network automation was totally overblown. 10–15 years ago it was said all the network guys would be coders and never touch a physical network device again. Nowadays, most of the services that do this are either janky and bad, locked to a single vendor, or both, and most of the “coding” is abstracted away. Some places build their own code infrastructure for networking, but it’s usually just a few specialized devops guys supporting a full network team, making the benign changes that don’t require much thought. It just never materialized like they predicted it would. It will someday; it’s just long been delayed.

        For a flash, I’d say cloud services, but it hasn’t led to the mass layoffs of sysadmins that were originally predicted. Sysadmins traded fussing about wattages and rack space for building more complicated logical infrastructures for servers they can spin up out of nothing.