• 1 Post
  • 54 Comments
Joined 1 year ago
Cake day: June 9th, 2023

  • Yeah, I'm not an Apple fan. (My brother would have a heart attack if I didn't say that. He loves them.)

    But the fact that they control both hardware and software means they can run on lower specs. They don't exploit that as well as they could. Android, on the other hand, has to let anyone build hardware for it, which gives manufacturers more room to ship less efficient drivers. That's why some higher-spec, low-value phones seem so slow compared to equally specced but cheaper Samsung stuff, etc.


  • Well, nowadays, yes. But when the term smartphone was invented, really not.

    The first iPhone was way lower spec than many high-end phones of the time. Mainly Nokia, but others as well.

    Early Androids and others definitely had no particular specs that set them apart from other high-end phones running Symbian or Windows CE (as crap as that OS was, but then so was the smartphone-marketed version recreated later on).

    Seriously, marketing was the only thing that set them apart from phones like the N95 and the Communicator, etc.

    And, as I mentioned, the locked storefront. That really seems to be the main difference, but I still find it non-advantageous myself.





  • Heavy Blender users tend to avoid AMD for the reasons you point out.

    This leads to fewer updates, because AMD users aren't that interested in the community.

    It is an issue without any practical solution, because as I need a long-overdue upgrade myself, Nvidia again seems like the only real choice.

    Everyone is sorta forced into that, unless we can convince AMD users to just try out Blender and submit results.

    So, hi to any AMD users who don't care about Blender.

    Give it a try and submit performance data please.





  • I will add, as a narrowboater:

    I have found towpaths also have this issue with how the surface is defined.

    I am legally blind. (Some vision, but bad.)

    I have a few times tried to add more detail to areas of towpath, which will help others like me know what to expect before mooring.

    It seems anything that improves this will help with your issues as well.




  • Cool. At the time, it was one of the best. Although I also liked SunOS.

    I also worked with VMS a lot after uni. Hated using it, but had to respect the ideals behind it.

    But watching the growth of Linux has been fantastic. In 2024, it does seem to have out-evolved all the others. (Evolved, defined as having developed the ability to survive by becoming so freaking useful.)

    I am starting to think it is time for a microkernel version, though.



  • In the late 1990s my uni had Unix workstations running HP-UX.

    So all projects etc. were expected to be done on those. Linux, at the time, was the easy way to do it from home.

    By the time I left uni in '98, I was so used to it that Windows was a pain in the butt.

    For most of the time since, I have been almost 100% Linux, with just a dual boot to sort some hardware/firmware crap.

    Ham radio, to this day: many products can only do updates with Windows.



  • Yeah, any reverse engineering of closed-source code takes time. It's a huge job on its own, even before adding the need to avoid actions that may lead to legal issues.

    Well yep, It’s very likely this may never round to a perfect replacement product.

    But it still has value. For starters, it will encourage new open-source projects to use it rather than the proprietary version, long before it's capable of being a direct replacement.

    So the effort is worth some excitement. At least a pat on the back and free beer for some of the guys trying.


  • Just off the top of my head, discovered today.

    Not a GUI, as one already exists, but a more configurable one, as the current one is crap for the visually impaired.

    The rpi-imager GUI does not take theme hints for font size etc. Worse, it has no configuration to change such things.

    Making it pretty much unusable for anyone with poor vision.

    Also, it varies for each visually impaired individual, but dark mode is essential for some of us.

    So if you're looking for small projects, you'd at least make me happy ;) (a rough sketch of what I mean is below).
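
    To make that concrete, here is a rough sketch of a desktop app honouring the system font settings. It's a generic Qt Widgets example, not rpi-imager's actual code; the organisation/application names and the "fontScale" setting are purely illustrative assumptions:

    ```cpp
    // Minimal sketch: follow the desktop theme's font instead of hard-coding one.
    // Generic Qt Widgets example; not taken from rpi-imager's source.
    #include <QApplication>
    #include <QFont>
    #include <QLabel>
    #include <QSettings>

    int main(int argc, char *argv[]) {
        QApplication app(argc, argv);

        // QApplication starts from the platform theme's default font,
        // so the main fix is simply not overriding it with a fixed point size.
        QFont font = QApplication::font();

        // Hypothetical user override (a "fontScale" value in a settings file)
        // so visually impaired users can bump the size further.
        QSettings settings("ExampleOrg", "ExampleImager");  // illustrative names
        const double scale = settings.value("fontScale", 1.0).toDouble();
        if (font.pointSizeF() > 0)
            font.setPointSizeF(font.pointSizeF() * scale);
        QApplication::setFont(font);

        QLabel label("This text now follows the system font size");
        label.show();
        return app.exec();
    }
    ```

    Respecting the theme's palette rather than hard-coded colours would similarly cover the dark-mode side.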



  • Yep, pretty much, but on a larger scale.

    First, please do not believe the bull that there was no problem. Many folks like me were paid to fix it before it became an issue, so other than at a few companies, few people saw the consequences; not because they did not exist, but because we were warned. People make jokes about the over-panic, but if that had not happened it would have taken years to fix, not days, because without the panic most corporations would have ignored it. Honestly, the panic scared shareholders, so boards of directors had to get experts to confirm their systems were compliant. And so much dependent crap was found running, it was insane.

    But the exaggerations about planes falling out of the sky etc. were also bull. Most systems would have failed, but a BSOD would be rare; code would crash, some would hit errors and shut down cleanly, and some failures would go undiscovered until a short while later, as accounting or other errors showed up.

    As others have said, the issue was that since the 1960s computers had been set up to treat years as 2 digits, so they had no way to handle 2000 other than assuming it was 1900 (there is a toy example of the failure at the end of this comment). While from the early 90s most systems were built with ways to adapt to it, not all were, as many teams were only developing top-layer stuff, and many libraries etc. had not been checked for this issue. Huge amounts of the world's IT infrastructure ran on legacy systems, especially in the financial sector, where I worked at the time.

    The internet was a fairly new thing, so often stuff had been running for decades with no one needing to change it, or having any real knowledge of how it was coded. So folks like me were forced to hunt through code, or often replace systems, that were badly documented or, more often, not documented at all.

    A lot of modern software development practices grew out of discovering what a fucking mess can grow if people accept an “if it ain’t broke, don’t touch it” mentality.
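
    As a toy illustration (my own simplified example, not code from any real system), this is the kind of arithmetic that broke once the year rolled over while only two digits were stored:

    ```cpp
    // Simplified illustration of the two-digit-year bug; not from any real system.
    #include <iostream>

    int main() {
        // Legacy records often stored only the last two digits of the year.
        int birth_yy = 65;  // customer born in 1965
        int now_yy   = 99;  // 1999: the subtraction still works
        std::cout << "Age in 1999: " << (now_yy - birth_yy) << "\n";  // prints 34

        now_yy = 0;         // 2000 stored as "00", implicitly treated as 1900
        std::cout << "Age in 2000: " << (now_yy - birth_yy) << "\n";  // prints -65
        return 0;
    }
    ```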