• peereboominc@lemm.ee · +37 · 1 year ago

    Also, all the tracking and analytics tools that get integrated into most e-commerce sites are huge. We built a website for a company that was really fast and snappy; then they added all their tools and it turned into a slug.

    • mostlypixels@programming.dev · +15 · 1 year ago

      “We’ll take the Facebook pixel, and TikTok and Klaviyo for the events, and oh, our chatbot, Google Analytics, this other analytics thing, and…”

      I routinely have murderous urges. Ma’am, your site crashes the browser on my six-year-old phone. Stop. I beg you.

  • variouslegumes@reddthat.com · +33/-2 · 1 year ago

    Prioritizing developer experience is not the reason we use front-end frameworks. People expect the web to work like a desktop app (no page reloads). The initial request might take a little longer, but in the end a well-written front-end app will feel faster.

    The problem is that people don’t worry about bundle size and cram every library off of npm into their website.
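
    A rough sketch of the alternative, assuming a bundler that splits dynamic import() into separate chunks (webpack, Vite, and esbuild all do); chart.js here is just a stand-in for any heavy dependency:

    ```ts
    // Hypothetical: keep a heavy charting library out of the main bundle.
    // The bundler emits it as a separate chunk, fetched only on demand.
    async function renderSalesChart(canvas: HTMLCanvasElement) {
      // Nothing is downloaded until the user actually opens the chart view,
      // so first paint doesn't pay for a library most visitors never use.
      const { Chart } = await import("chart.js/auto");
      return new Chart(canvas, {
        type: "line",
        data: { labels: ["Mon", "Tue"], datasets: [{ data: [3, 7] }] },
      });
    }
    ```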

    • o11c@programming.dev · +24/-2 · 1 year ago

      The solution is quite simple though: dogfood.

      Developers must test their website on a dial-up connection and on a computer with only 2 GB of RAM. Use remote machines for compilation-like tasks.

      • variouslegumes@reddthat.com · +17/-1 · 1 year ago

        Totally. Pretty much all browsers include a way to simulate network conditions, and Chrome can also simulate a CPU slowdown.
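
        You can go a step further and script it, so the slow-device run happens in CI instead of relying on someone remembering the DevTools toggle. A sketch with a recent Puppeteer via the Chrome DevTools Protocol (the throttling numbers are made-up approximations of a bad mobile connection):

        ```ts
        import puppeteer from "puppeteer";

        async function main() {
          const browser = await puppeteer.launch();
          const page = await browser.newPage();
          const cdp = await page.createCDPSession();

          // Same knobs the DevTools UI exposes, via raw CDP commands.
          await cdp.send("Network.enable");
          await cdp.send("Network.emulateNetworkConditions", {
            offline: false,
            latency: 400,                          // ms added per round trip
            downloadThroughput: (500 * 1024) / 8,  // ~500 kbit/s down
            uploadThroughput: (250 * 1024) / 8,    // ~250 kbit/s up
          });
          await cdp.send("Emulation.setCPUThrottlingRate", { rate: 6 });

          const start = Date.now();
          await page.goto("https://example.com", { waitUntil: "load" });
          console.log(`load took ${Date.now() - start} ms`);
          await browser.close();
        }

        main();
        ```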

        • o11c@programming.dev · +8 · 1 year ago

          And yet the very fact that you have to go out of your way to enable them means people don’t use them like they should.

      • masterspace@lemmy.ca · +2/-1 · 1 year ago

        Server-side rendering sucks ass. Why would I want to pay for an always-running server just to render a webpage when the client’s device is more than capable of doing so?

        Centralization is just pushed because it’s easier for companies to make money off servers.

        • sznio@lemmy.world · +1 · 1 year ago

          Because it’s better to deliver a page in a single request than in multiple. If you render the page on the client, you end up making a lot of requests, each one requiring a round trip and adding more and more delay.
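
          To make the waterfall concrete (hypothetical endpoints), here is the serialized chain a typical client-rendered page walks through before showing anything useful:

          ```ts
          // 1. GET /           -> near-empty HTML shell    (round trip 1)
          // 2. GET /bundle.js  -> referenced by the shell  (round trip 2)
          // 3+4. can only start once the JS has downloaded and executed:
          async function clientSideBoot() {
            const user = await (await fetch("/api/user")).json();
            // Can't start until the previous response arrives, so each
            // hop of latency is paid in full, one after another.
            const posts = await (await fetch(`/api/posts?user=${user.id}`)).json();
            return posts;
          }
          ```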

        • philm@programming.dev · +1 · 1 year ago

          You don’t have to render everything on the server; a good hybrid is usually the way to go. Think SEO and initial response time. I think lemmy-ui would also benefit from it (Google results).
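
          A minimal sketch of that hybrid, assuming React 18 and Express (App is a placeholder component; the real lemmy-ui setup differs in the details):

          ```ts
          // server.ts: the first response is real HTML, which is what
          // crawlers index and what slow devices can show immediately.
          import express from "express";
          import React from "react";
          import { renderToString } from "react-dom/server";
          import { App } from "./App"; // placeholder

          const server = express();
          server.get("*", (_req, res) => {
            const html = renderToString(React.createElement(App));
            res.send(`<!doctype html><div id="root">${html}</div>
              <script src="/client.js" defer></script>`);
          });
          server.listen(3000);

          // client.ts: the same component hydrates on top of the server
          // HTML, attaching handlers instead of re-rendering from scratch.
          import React from "react";
          import { hydrateRoot } from "react-dom/client";
          import { App } from "./App";

          hydrateRoot(document.getElementById("root")!, React.createElement(App));
          ```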

          • masterspace@lemmy.ca · +1 · 1 year ago

            Yeah, it will give you the best of both worlds, but at a fundamental level I still hate that I have to pay for an always-running server just for SEO. If I can get away with it, I’d much prefer a purely static site that has its content pages rebuilt when they change.
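
            That model is easy to sketch: with React only as a build-time template engine and a hypothetical getPages() for wherever the content lives (CMS, markdown files), the whole “server” reduces to a script that CI re-runs when content changes:

            ```ts
            import { mkdir, writeFile } from "node:fs/promises";
            import React from "react";
            import { renderToStaticMarkup } from "react-dom/server";
            import { Page, getPages } from "./site"; // hypothetical

            async function build() {
              for (const page of await getPages()) {
                const html = renderToStaticMarkup(React.createElement(Page, page));
                await mkdir(`dist/${page.slug}`, { recursive: true });
                await writeFile(`dist/${page.slug}/index.html`, `<!doctype html>${html}`);
              }
            }

            // Trigger from a CMS webhook or a git push; serve dist/ from a CDN.
            build();
            ```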

    • asyncrosaurus@programming.dev · +10 · 1 year ago · edited

      People expect the web to work like a desktop app (no page reloads).

      Do users expect it, or do product owners expect it? Because in my experience, typical users dgaf whether a site is an SPA or SSR as long as it’s functional and loads quickly. When we did user surveys, the legacy WordPress version scored just as well as the fancy-schmancy React rewrite. The only time the SPA outscored a traditional web page was (obviously) on heavily interactive components (e.g. chat, a scheduling calendar).

      • I Cast Fist@programming.dev · +6 · 1 year ago

        My personal bane is SPAs with fixed scrolling. I’m on a fucking desktop; stop treating me like a fucking monkey incapable of scrolling exactly to where I want, and stop fading text in only while it’s in focus.

  • HewlettHackard@lemmy.ca · +29 · 1 year ago

    This was more interesting than I expected, though they didn’t clarify why it costs $700,000. Given the context, I assume it’s customers on slower devices/connections leaving, rather than something like bandwidth costs?

      • I Cast Fist@programming.dev · +9 · 1 year ago

        The funny thing is that internet speeds back in 2006 were significantly lower than today. And here we are, with 10x the speed and pages somehow loading slower than they did back then!

    • Orvanis@lemm.ee · +20 · 1 year ago

      That was what I got from the article too: the $700k was lost opportunity due to a poor user experience, not money they were actually spending.

    • neil@programming.dev · +16/-1 · 1 year ago · edited

      Often, it boils down to one common problem: Too much client-side JavaScript. This is not a cost-free error. One retailer realized they were losing $700,000 a year per kilobyte of JavaScript, Russell said.

      “You may be losing all of the users who don’t have those devices because the experience is so bad,” he said.

      They just didn’t link to the one retailer’s context. But it’s “bring back old reddit” energy directed at everything SPA-ish.

      edit to give it a little personal context: I was stuck on geosat internet for a while and could not use Amazon’s site over that connection. I’m not sure if they’re the retailer mentioned, but the only way I could make it usable was to apply the uBlock rule *.images-amazon.com/*.js^ described here.

      What really stunk about it was that if you’re somewhere where geosat is/was the only option, you’re highly dependent on online retail. And knowing how to manage uBlock rules is not exactly widespread knowledge.

    • malloc@programming.dev · +4/-1 · 1 year ago

      One retailer realized they were losing $700,000 a year per kilobyte of JavaScript, Russell said.

      It’s a retailer, so it’s definitely lost sales or conversions.

  • lemillionsocks@beehaw.org · +22/-1 · 1 year ago

    It feels like this has been an issue for some time now, with the internet ballooning in how resource-heavy it is even though many websites haven’t become much more functional. It’s the reason there’s a meme of people being surprised at how much RAM a single browser tab takes up. Sure, that news website may function much like it did 10 years ago, but that tiny thumbnail is technically an autoplaying 1080p video, the background photos and thumbnails are fairly large and high-res even before you click on them, and there are countless other things running in the background that just aren’t worth it.

    There was a period in the late ’00s and early-to-mid ’10s when the rise of smartphones delayed this trend and forced developers to reconsider a more minimal global experience. Flash was killed off, things got lighter-weight, and the new media-rich features were better optimized for performance.

    I also think it’s not just that developers tend to have better devices, so much as a result of the time, energy, and resources put toward building software. It’s similar to video games: in the old days, to save on resources, a 2D game might use a single texture tile that could be mirrored, rotated, or color-swapped so that precious RAM could be spared. Once the baseline hits a certain point (or a new console generation appears), a lot of that optimization goes away because it’s no longer needed. Sometimes it’s obvious and we’re better for it, like clouds no longer having to play double duty as bushes, but other times we move on to something that looks only marginally better and absolutely leaves a good chunk of contemporary hardware in the dust.

    The most frustrating thing about websites is how little is actually different for all the under-the-hood changes we get. Google Maps is a lot slower in Firefox than it used to be, and the Android app uses more resources on mid-range hardware than it once did (I’d know; I remember using it on my HTC Dream/G1). Functionally, I’ve been able to do the same things on Google Maps for probably more than a decade. New backend technology has been introduced to make Maps “better,” but at the cost of CPU ticks and snappiness. Likewise, a lot of news and article sites don’t look much different than they did 10 years ago; layouts and aesthetics change, but the navigation is fairly steady. Yet all that JavaScript and bandwidth-sucking autoloading media create a slower experience, and even modern hardware can suffer from it.

  • bboplifa@lemmy.world · +9 · 1 year ago

    The article mentions that the new JavaScript frameworks cater more to developers, as if developers have to use the garbage sites they are developing. This article is very confused and badly written. Also, I would love to see a source for that clickbaity title other than “some guy said something.” And as a developer, I find this sentence hilarious: “Russell contended we’ve over-prioritized developer experience and the end user experience suffers as a result.” That is clearly from somebody who doesn’t understand the concept of JavaScript framework fatigue.

    • Johanno@lemmy.fmhy.ml · +3 · 1 year ago

      I would assume the problem with slow websites is inexperienced developers, combined with CEOs who don’t understand what their website is doing, plus all the analytics and ads and stuff.

  • I Cast Fist@programming.dev · +9/-1 · 1 year ago

    Nearly everything done nowadays with 500 MB+ of JavaScript libraries could already be done more than a decade ago with jQuery + AJAX.
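
    For reference, the decade-old pattern in question, as it would look today (made-up endpoint; jQuery’s $.get has worked like this since the 2000s):

    ```ts
    import $ from "jquery";

    $("#load-more").on("click", () => {
      // The server returns a ready-made HTML fragment; no client-side
      // templating, state management, or hydration involved.
      $.get("/posts?page=2", (fragment: string) => {
        $("#post-list").append(fragment);
      });
    });
    ```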

    • ThoughtGoblin@lemm.ee · +1/-1 · 1 year ago · edited

      I’m not a big subscriber to this notion. After working in both of those technologies (and more), React/Vue is a significant boost in developer productivity compared to jQuery and AJAX: more features, fewer bugs, a more app-like web experience. Not to mention things like React Native or Electron potentially saving the cost of entirely separate apps.

      Further, the resulting assets can be even smaller after minification and bundling, as long as you aren’t shipping one giant blob that changes on every minor, unrelated edit and includes all the dependencies, source maps, and assets (it’s important to remember that many bundlers include media files) in production.
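
      Most of that hygiene is a few lines of bundler configuration; a sketch in webpack terms (the same ideas exist in Vite, Rollup, and esbuild):

      ```ts
      import type { Configuration } from "webpack";

      const config: Configuration = {
        mode: "production", // enables minification and tree shaking
        devtool: false,     // don't ship source maps to production users
        optimization: {
          // Split vendor/shared code into separate, long-cacheable chunks,
          // so a one-line app change doesn't re-download every dependency.
          splitChunks: { chunks: "all" },
        },
      };

      export default config;
      ```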

      I think there are numerous opportunities for improvement (diff-based updates, semver-aware CDNs, smarter defaults, more leveraging of things like WebAssembly, and improvements to the standards), for sure, but talk of the good old days of jQuery certainly seems rose-tinted given how much of a mess it was in practice (for me, of course).

      • I Cast Fist@programming.dev · +1 · 1 year ago

        Let me quote myself:

        Nearly everything done nowadays (…) could already be done more than a decade ago with jQuery + AJAX

        I dunno where or why you thought I put on nostalgia glasses and began to reminisce about the good old days. I stated a simple fact, at least about web pages. I’m indifferent to jQuery and absolutely despise AJAX; it really was a complete mess.

        Electron is just Chromium bundled with a Node.js runtime. If the intent is to write JS once and deliver everywhere, Neutralino and Tauri are much better alternatives. The real irony is that it’s not hard to find stuff (mostly games) packaged in Electron but distributed for only one OS, completely negating the main selling point.

        WebAssembly is one of the strangest things to ever grace us, when you think about it. JavaScript is too slow, so, in order to have faster apps, let’s make browsers capable of dealing with WebAssembly. In other words, to make bloated software work better, add more bloat? Oh, sorry, a “feature.” Layers and layers and layers of extremely situational features, because we’re too lazy to come up with (or use) more efficient ways of making stuff cross-platform.

        • ThoughtGoblin@lemm.ee · +2 · 1 year ago
          1. Sorry if it was an assumption; I was speaking to the context you posted.
          2. I’m not discriminating between the specific abstraction layers. Anything that provides an HTML canvas, CSS, and JS is fine. But, at least with Electron, you can fine-tune things really well through native code and an API less constrained than the web standards. This is why VS Code is quite the snappy fella.
          3. Cross-platform is Electron’s second selling point, really. The first is the ability to create desktop apps using the fun JS web frameworks rather than learning Java, C#, or C++ and having to use the unpleasant UI frameworks they come with, like Qt. Clearly that’s the case for all the folk who only support one platform, at least.
          4. WebAssembly doesn’t seem weird to me at all. The web is a great way of distributing end-user software but can suffer from performance and control issues for heavier applications. WebAssembly is the logical conclusion: it lets us leverage the browser’s crazy powerful, optimized DOM, JS runtime, and layout engines, while adding a super fast layer with low interop cost to do the heavy work, especially as browsers move toward gaming support via WebGL. Furthermore, it provides a sandboxed runtime with privilege control that downloading binaries from Itch simply can’t match. It has a real purpose, albeit, I again agree, its execution has some issues.

          All this just to say: I think the common denigration of this tech (not specifically your comment, since you clarified) is a cynical take that ignores important economic factors. Modern web development is flawed, but the direction it has moved is still forward.

          Anyway, hope you have a good day!