• bss03@infosec.pub · 11 days ago

    The whole industry needs to be rebuilt from the foundations. GRTT, with a grading ring that tightly controls resources (including, but not limited to, RAM) as the fundamental calculus, instead of whatever JS happens to stick to the Chrome codebase and the machine code spewed by your favorite C compiler.

      • bss03@infosec.pub · 11 days ago

        If someone wants to collab, I’ve been writing various codes around it: https://gitlab.com/bss03/grtt

        Right now, it’s a bunch of crap. But, it’s published, and I occasionally try to improve it.

        Also, Granule and Gerty are actual working implementations, though I think some of the “magic” is in choosing the right grading ring for the runtime, and they are more research-oriented, allowing for fairly arbitrary grading rings.
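        For the curious, here is a tiny Python sketch of what a “grading ring” actually is: a semiring of usage counts that a graded type checker adds and multiplies while tracking resource consumption. This is my own illustration, not code from the repo or from Granule, and all the names are invented:

```python
# A toy grading semiring of usage counts, in the spirit of graded modal
# type theory: ZERO = unused, ONE = used exactly once, MANY = unrestricted.
ZERO, ONE, MANY = "0", "1", "many"

def grade_add(u, v):
    """Semiring addition: combine two uses of the same variable."""
    if u == ZERO:
        return v
    if v == ZERO:
        return u
    return MANY  # once + once (or more) = unrestricted in this toy ring

def grade_mul(u, v):
    """Semiring multiplication: nest one graded context inside another."""
    if ZERO in (u, v):
        return ZERO  # anything inside an unused context is itself unused
    if u == ONE:
        return v
    if v == ONE:
        return u
    return MANY
```

        Granule and Gerty let the grading be a fairly arbitrary semiring; the “magic” mentioned above would be picking one whose elements are actual resource budgets (RAM, time) rather than bare usage counts.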

    • Undearius@lemmy.ca · 11 days ago

      It took me a long time to figure out that “GRTT” means “Graded Modal Type Theory”. Letting others know, in case they want to look into it further.

  • merc@sh.itjust.works · 11 days ago

    You do really feel this when you’re using old hardware.

    I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most web sites are unbearably slow, and some simply don’t work: JavaScript hangs, and some elements never load. The device is too old to get OS updates, which means I can’t update some of the apps. But that’s a good thing, because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.

    • ssfckdt@lemmy.blahaj.zone · 11 days ago

      It’s the pages. It’s all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn’t exactly a computationally modest language.

      Of the 200 kB loaded on a typical Wikipedia page, about 85 kB is JS and CSS.

      Another 45 kB is a single SVG, which in complex cases is a computationally nontrivial image format.

      • 87Six@lemmy.zip · 11 days ago

        I don’t agree; it’s both. I’ve opened basic no-JS sites on old tablets to test them out, and even those pages BARELY load.

          • Passerby6497@lemmy.world · 11 days ago

            Probably just the browser itself, considering how bloated they’re getting. It’s not super surprising: considering the apps on a new phone run about as fast (on a good day) as they did 5-10 years ago, they’re gonna run like dogshit on a phone from that era.

    • NecroParagon@midwest.social · 11 days ago

      I can’t update YouTube on my iPad 2, which I got running again for the first time in years. It said it had been ~70,000 hours since its last full charge. I wanted to use it to watch videos when I’m going to bed, but I can’t actually log in to YouTube because the app is so old, and I seemingly can’t update it.

      I was using the web browser and yeah I don’t remember it being so damn slow. It’s crazy how that is.

  • fonix232@fedia.io · 11 days ago

    Uh, that’s a bit off to be fair.

    Our computers are 15x faster than they were about 15-20 years ago, sure…

    But (1) the speed is noticeable, and (2) not all of the new performance is utilised 100%.

    Sure, operating systems have started utilising the extra hardware to deliver the same 60-120 fps base experience with some extra fancy touches (transparency with blur, for example, instead of regular transparency), but overall computer UX plateaued a good decade and a half ago.

    The bottleneck is not the hardware or bad software but simply the idea of “why go faster when going this speed is fine, and we now don’t need to use 15-30% of the hardware to do it but just 1-2%”.

    Oh, and the other bottleneck is stupidity, like letting long-running tasks onto the main/UI thread and fucking up the experience. Unfortunately, you can’t help stupid.
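    The main-thread mistake is easy to sketch. Here is a minimal, toolkit-free Python illustration (the “UI” is just a counting loop; real frameworks have their own worker/executor APIs for this) of keeping a long task off the thread that services the interface:

```python
# Run the long task on a worker thread so the main/UI loop keeps
# servicing events instead of freezing until the work finishes.
import threading
import time

results = []

def long_running_task():
    time.sleep(0.1)  # stand-in for disk/network/heavy compute
    results.append("done")

worker = threading.Thread(target=long_running_task)
worker.start()

ui_frames = 0
while worker.is_alive():  # the main thread stays responsive...
    ui_frames += 1        # ...handling "events" while the work runs
    time.sleep(0.01)
worker.join()
```

    Doing the `time.sleep(0.1)` directly on the main loop instead is exactly the stupidity described above: the interface stops handling anything until the task completes.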

    • dfyx@lemmy.helios42.de · 11 days ago

      Well, until you open a browser… or five, because these days nobody wants to build native applications anymore; instead they shove webapps into Electron containers.

      Right now, my laptop doesn’t have to run much. Just a combination of KDE, browser, emails, music player, a couple of messengers and some background services. In total, that uses about 9.5 GB of RAM. 20 years ago we would have run the same workload with less than 1 GB.

  • OwOarchist@pawb.social · 11 days ago

    When you become one with the penguin, though … then you can begin to feel how much faster modern hardware is.

    Hell, I’ve got a 2016 budget-model chromebook that still feels quick and snappy that way.

    • MudMan@fedia.io · 11 days ago

      But… 2016 was a decade ago. If it feels quick and snappy that way that means the post is right.

      Which it kinda isn’t but hey.

      • Skullgrid@lemmy.world · 11 days ago

        The point is that the software is what’s wrong, not the hardware. It feels snappy because it’s Linux, not because it’s old hardware.

        • MudMan@fedia.io · 11 days ago

          Except the Linux userbase has been saying that exact thing for the past ten years. So again: has Linux also degraded in sync, or, hear me out here, is this mostly nostalgia that makes you forget the kludgy performance issues of the software you used when you were younger? Things have mostly gotten snappier over time across the board.

          As a current dual booter I’ll say that Windows and Linux don’t feel fundamentally different these days, for good and ill. Windows has a remarkably crappy and entirely self-inflicted issue with its online-search-in-Start-menu feature, which sucks but is at least toggleable. Otherwise I have KDE and Win11 set up the same way and they both work pretty much the same. And both are measurably better than their respective iterations 10, let alone 15 or 20, years ago.

          • OwOarchist@pawb.social · 11 days ago

            Windows and Linux don’t feel fundamentally different these days

            Try Windows 11 vs. Linux on a shitty old laptop with a budget 2-core processor and 2GB of RAM. Then tell me Windows and Linux don’t feel any different.

            • swelter_spark@reddthat.com · 11 days ago

              My bf bought me a brand new laptop with Win 10 preinstalled, and even after disabling or uninstalling as much as I could, it was literally like watching a slideshow. Then I installed Linux, and it…worked like you’d expect a brand new computer to work, fast and smooth. Never used Win 11 because I stopped using Windows after that.

    • iByteABit@lemmy.ml · 11 days ago

      I’ve got a 2007 laptop that was shitty even for its time, and it does the job perfectly as a home server with Debian and a few good open source services I want to host

  • GreenShimada@lemmy.world · 11 days ago

    For anyone unsure: Jevons’ Paradox is that when there’s more of a resource to consume, humans will consume more of the resource rather than use the gains to use the resource better.

    Case in point: AI models could be written to be more efficient in token use (see DeepSeek), but instead AI companies just buy up all the GPUs and shove more compute in.

    For the expansive bloat, the same goes for phones. Our phones are orders of magnitude better than they were 10 years ago, and now they’re loaded with bloat because the manufacturer thinks “Well, there’s more compute and memory. Let’s shove more bloat in there!”

    • GamingChairModel@lemmy.world · 11 days ago

      Jevons’ Paradox is that when there’s more of a resource to consume, humans will consume more of the resource rather than use the gains to use the resource better.

      More specifically, it’s when an improvement in efficiency causes the underlying resource to be used more, because the efficiency reduces cost, and using that resource then becomes even more economically attractive.

      So when factories got more efficient at using coal in the 19th century, England saw a huge increase in coal demand, despite using less coal for any given task.
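      The coal example can be put in toy-model form. The numbers and the demand elasticity below are invented purely for illustration; the point is only the shape of the effect:

```python
# Toy Jevons model: efficiency doubles (half the coal per unit of
# output), the cost per unit halves, and because demand is elastic
# enough, total coal burned goes UP anyway.
def total_coal(efficiency, base_demand=100.0, elasticity=1.5):
    cost_per_unit = 1.0 / efficiency  # cheaper output per unit
    units = base_demand * cost_per_unit ** -elasticity  # demand response
    return units / efficiency  # coal per unit of output, times units

before = total_coal(efficiency=1.0)  # 100.0 units of coal
after = total_coal(efficiency=2.0)   # ~141.4 units of coal
```

      With an elasticity below 1 the same model shows efficiency reducing total use, which is why the paradox only bites for goods whose demand responds strongly to price.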

      • Quetzalcutlass@lemmy.world · 11 days ago

        Also Eli Whitney inventing the cotton gin to make extracting cotton less of a tedious and backbreaking process, which led to a massive expansion of slave plantations in the American South due to the increased output and profitability of the crop.

    • VibeSurgeon@piefed.social · 11 days ago

      Case in point: AI models could be written to be more efficient in token use

      They are being written to be more efficient in inference, but the gains are being offset by trying to wring more capabilities out of the models by ballooning token use.

      Which is indeed a form of Jevons’ paradox.

      • errer@lemmy.world · 11 days ago

        Costs have been dropping by a factor of 3 per year, but token use increased 40x over the same period. So while the efficiency is contributing a bit to the use, the use is exploding even faster.

    • frunch@lemmy.world · 11 days ago

      I always felt American car companies were a really good example of that back in the 60s-70s when enormously long vehicles with giant engines were the order of the day. Why not bigger? Why not stronger? It also acted as a symbol of American strength, which was being measured by raw power just like today lol.

      This also reminds me of the way video game programmers in the late 70s/early 80s had such tight limitations to work within that you had to get creative if you wanted to make something stand out. Some very interesting stories from that era.

      I also love to think about the tricks the programmer of Prince of Persia employed to get the “shadow prince” to work…

      https://www.youtube.com/watch?v=sw0VfmXKq54

  • Quicky@piefed.social · 11 days ago

    I feel like this is Windows specific. Linux is rapid on PCs and my MacBook is absurdly quick.

    • Auster@thebrainbin.org · 11 days ago

      Mint Xfce on my 2015 laptop, compared to its previous system, was the difference between usable and waiting 10 minutes for it to even boot; things like gaming, VMs, and comically large spreadsheets (surprisingly the biggest memory hog) were an eternal challenge on it. On my current laptop, I have the luxury of picking systems by aesthetics and non-optimization features instead. And for comparison, I’ve run the same updates on both laptops, as the older one still works.

    • zwerg@feddit.org · 11 days ago

      My 12- or 13-year-old Dell laptop does just fine running Ubuntu. It’ll probably be fine for my needs for another 3 or 4 years at least.

    • RaccoonBall@lemmy.ca · 11 days ago

      App launch time can be annoyingly slow on a Mac if you’re not offline or blocking the server it phones home to.

      It can be the difference between one bounce and seven bounces of the icon on my end.

      • Quicky@piefed.social · 11 days ago

        What apps, out of interest? I’m a new Mac owner, so limited experience, but everything seems insanely quick so far. Even something like Xcode is a one-bounce launch on this M4 Air.

        • RaccoonBall@lemmy.ca · 10 days ago

          All of them. The device has to phone home to Apple to ask permission to run them.

          To test: close the app (really quit it; make sure the dot under the icon isn’t glowing), then open it and measure the time.

          Then close the app, disconnect from the internet, and launch it again.

          The speed difference depends on how overloaded Apple’s servers are.
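          To put numbers on that procedure, something like this crude wall-clock timer would do. The command here is a placeholder, to be replaced with whatever launches the app under test:

```python
# Time a command once online and once with networking disabled, then
# compare the two readings. The placeholder just starts a Python
# interpreter; substitute the real app-launch command.
import subprocess
import sys
import time

def time_command(cmd):
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    return time.perf_counter() - start

elapsed = time_command([sys.executable, "-c", "pass"])  # placeholder
```

          Run it a few times in each condition and compare medians, since a single cold launch is noisy.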

          • Quicky@piefed.social · 9 days ago

            I’ve not come across this but I’ll check it out. Is that App Store apps only?

            I think probably 90% of the apps I’ve installed came through the Homebrew package manager, which likely means they don’t do any phoning home, but I’ll check the pre-installed stuff and see if I can replicate it.

    • sp3ctr4l@lemmy.dbzer0.com · 11 days ago

      PC games are software.

      Unfortunately many PC games are also like this: astoundingly poorly optimized, and they just assume everyone has a $750 GPU.

      Proton can only do so much.

      … and Metal basically can’t do that much.

      Look at Metal Gear Solid 5 or Titanfall 2, and tell me realtime video game graphics have dramatically increased in visual fidelity in the last decade.

      They haven’t really.

      They shifted to a poorly optimized, more expensive paradigm for literally everyone involved: publisher, developer, player.

      Everything relating to realtime raytracing and temporal antialiasing is essentially a scam, in the vast majority of actual implementations of it.

      • Quicky@piefed.social · 11 days ago

        I guess the counter argument for games is load times have dramatically improved, though that’s less about software development than hardware improvements.

        If we put consoles in the same bracket as computers, the literally instant quick-resume feature on an Xbox (for example) feels like sci-fi.

        • sp3ctr4l@lemmy.dbzer0.com · 11 days ago

          Yeah, you kinda defeated your own argument there, but you do seem to recognize that.

          You can instant resume on a Steam Deck, basically.

          You can alt tab on a PC, at least with a stable game that is well made and not memory leaking.

          Yeah, better RAM / SSDs does mean lower loading times, higher streaming speeds/bus bandwidths, but literally, at what cost?

          You could just actually take the time to optimize things, to find clever ways of doing them that aren’t insanely computationally expensive, instead of just throwing more/faster RAM at it.

          RAM and SSD costs per gig are going up now.

          Moore’s Law is not only dead, it has inverted.

          Constantly cheaper memory turned out not to be the best assumption to build on going forward.

    • doleo@lemmy.one · 11 days ago

      Anyone opening the app menu (from the dock or Home Screen) on an iPad will tell you that it’s not exclusive to Windows PCs.

  • caschb@lemmy.world · 11 days ago

    The problem is not so much badly written programs and websites in terms of algorithms, but rather latency. The latency of loading things from storage, sometimes over the internet, is the real bottleneck and the reason things feel so slow.

    • furry toaster@lemmy.blahaj.zone · 11 days ago

      Even with modern SSDs, things sometimes feel slower than they were with HDDs running era-appropriate software. Windows 7 was snappy even on an HDD; Windows 11 is slow and sluggish everywhere.

  • MudMan@fedia.io · 11 days ago

    That’s some nonsense, though.

    For one thing, it’s one of those tropes that people have been repeating for 30 years, so it kinda stops making sense after a while. For another, the reason it doesn’t make sense is that it doesn’t account for modern computers doing more now than they did then.

    In 2016 I had a 970 that’s still in an old computer I use as a retro rocket, and I can promise you that wonderful as that thing was, I couldn’t have been playing Resident Evil this week on that thing. So yeah, I notice.

    And I had a Galaxy S7 then, which is still in use as a bit of a display, and I assure you my current phone is VERY noticeably faster, even discounting the fact that it’s displaying 120 fps rather than 60.

    Old people have been going “things were better when I was a kid” for millennia. I’m not assuming we’re gonna stop now, but… maybe we should.

    • skisnow@lemmy.ca · 11 days ago

      it doesn’t account for modern computers doing more now than they did then.

      We know they do more, but most of that “more” is bullshit bloat and sloppy engineering, neither of which we asked for.

      When I boot up Windows 11 and start no programs, somehow there’s already over 8 GB of RAM consumed, the CPU shows 8% utilisation, there are 244 processes running, and it still took every bit as long to get there as my Windows 7 machine from 2011 does. That’s what the rest of us are talking about.

      • MudMan@fedia.io · 11 days ago

        When was the last time you booted a 2011 machine? Because man, is that not true.

        And that’s a 2016-2017 era PC.

        Windows 7 didn’t even have fast boot support at all. I actively remember recommending people to let their PCs sit for a couple of minutes after booting so that Windows could finish whatever the hell it was trying to do in the background faster instead of clogging up whatever else you were trying to do.

        Keeping my old hardware around compulsively really impacts my perception of this whole “things were better when I was a teenager” stuff.

        • skisnow@lemmy.ca · 11 days ago

          I very specifically said “my windows machine from 2011” because it’s still in use.

          • MudMan@fedia.io · 11 days ago

            Then you’re either lying about it or haven’t booted a newer PC. Fast Boot was a back of the box feature for Windows 8 for a reason. It was becoming a huge meme at the time how slow Win7 was to boot.

            If your 2020s PC with Windows 11 is taking 45 seconds to boot on the Windows logo like Win 7 does (as seen in the benchmarks above) then you need some tech support because something is clearly not working as expected. I don’t think even my weaker Win11 machines take longer than 10 secs from boot starting to the password screen.

            That may be true anyway, because the tiny hybrid laptop I’m using to write this is reporting 2-5% CPU utilization even with a literal hundred tabs open in this browser. So… yeah, either you have a knack for hyperbole or something broken.

            • skisnow@lemmy.ca · 11 days ago

              Neither of them are fresh out the box clean installs, they are what they are.

              But anyway, I’ve been trying to engage in good faith but you keep being obnoxious on every single post you make, so it’s Block time

              • MudMan@fedia.io · 11 days ago

                Hah. You do you. I get how it’d be obnoxious to be called out, but man, it’s not my fault that you chose the worst possible example for this. Like, literally the worst iteration of Windows for the specific metric you called out, in a clearly demonstrable way that a ton of people measured because it was such a meme.

                You can block me, but “they are what they are” indeed.

                Incidentally, this is a classic opportunity to remind people that blocking on AP applications sucks ass and the only effect it has is for the blocker to stop being able to see what the blockee is saying about them while everybody else still gets access to both. Speaking of software degradation, somebody should look into that.

  • ssfckdt@lemmy.blahaj.zone · 11 days ago

    The program expands so as to fill the resources available for its execution

    – C.N. Parkinson (if he were alive today)