• 5 Posts
  • 407 Comments
Joined 4 years ago
Cake day: January 21st, 2021


  • kevincox@lemmy.ml to Selfhosted@lemmy.world · Mini pc arriving tomorrow · 4 days ago

    IMHO Arch is actually a great choice. They do have a minimum update frequency you need to maintain (I don’t recall exactly; I think it is somewhere between 1 and 3 months), but if you keep up, and read the news before updates (you are usually fine even if you don’t; the update will usually just refuse to run until you intervene), things are pretty seamless. I had many Arch machines running for >5 years with no issues and no reason to expect that to change. That period spans many major version upgrades on other distros, which are often not as seamless.

    That being said I am on NixOS now which takes this to the next level, I am running nixos-unstable but thanks to the way NixOS is structured I don’t need to worry about any legacy cruft accumulating from the many years of updates.

    And after all of that, I don’t think it really matters. Any major distro you pick, whether stable, release-based, or LTS, will be fine. They all have some sort of upgrade path these days (unlike in the past, when some distros just recommended a reinstall for major upgrades).


  • Only if they gain possession while the device is running with the drive decrypted, and they keep it running the whole time. That is a much higher bar than being able to turn the machine on at any time and then recover the key. For example, say this is a laptop that you are flying with. Without auto-decryption you can simply turn it off and be very secure. With auto-decryption they can turn it on and then extract the key from memory (not easy, but definitely possible, and with auto-decryption they have as long as they need, including sending the device to whatever forensics lab is best equipped to extract the key).


    1. Wiping the drive is a lot easier: just overwrite the root key a few times (see the sketch after this list).
    2. If you store the key on a different drive you can safely dispose of the drive just by separating the two. (I do this on my home server, keeping the decryption key on a USB drive. If I need to ship the server or discard old hardware I can just hold onto the thumb drive and not worry about the data being read.)
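
    A minimal sketch of point 1 in Python, assuming the unlock key lives in a small key file on disk (the path is hypothetical): destroying that one file makes the encrypted data unreadable without having to overwrite the whole drive.

    ```python
    import os

    # Hypothetical path to the key file that unlocks the encrypted drive
    # (e.g. a key file kept on a separate USB stick, as described above).
    KEY_FILE = "/mnt/usb/root.key"

    def wipe_key_file(path: str, passes: int = 3) -> None:
        """Overwrite the key file with random bytes a few times, then delete it.

        Once the key material is gone the bulk data is unreadable, which is
        far faster than overwriting the entire disk.
        """
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))
                f.flush()
                os.fsync(f.fileno())  # push each pass to the physical medium
        os.remove(path)

    wipe_key_file(KEY_FILE)
    ```

    Note that on flash storage wear-leveling can leave stale copies of the file behind, so in practice something like cryptsetup luksErase (which destroys the LUKS key slots) is the safer route; the sketch just shows the principle.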

    Security is always about tradeoffs. On my home server unattended reboots are necessary, so it needs to auto-decrypt. But using encryption means I don’t need to worry about discarding broken hardware, or about travelling with the server where it may be inspected. For my laptop, desktop and phone, where I don’t need unattended reboots, I require the encryption key on bootup.



  • That’s true. And I’m not saying B2 is bad; it is just something that you should be aware of.

    Their automatic replication isn’t quite as seamless as GCS or S3, though. For example, deletes aren’t replicated, so you will need a cleanup strategy. Plus, once you 2x or 3x the price for the extra copies, B2 isn’t as competitive. My point is that it is very easy to compare apples to oranges when looking at cloud storage providers, and it is important to be aware of that.
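
    As an illustration of what such a cleanup strategy could look like, here is a hedged sketch using B2’s S3-compatible API via boto3. The bucket names and endpoint are placeholders, credentials come from the usual environment variables, and real code would batch the deletes with delete_objects:

    ```python
    import boto3

    # Hypothetical buckets: "source" is the primary, "replica" receives the
    # automatic replication, which copies new objects but not deletes.
    ENDPOINT = "https://s3.us-west-004.backblazeb2.com"  # example B2 endpoint

    s3 = boto3.client("s3", endpoint_url=ENDPOINT)

    def list_keys(bucket: str) -> set[str]:
        """Collect every object key in the bucket, page by page."""
        keys: set[str] = set()
        for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
            keys.update(obj["Key"] for obj in page.get("Contents", []))
        return keys

    def prune_replica(source: str, replica: str) -> None:
        """Delete objects from the replica that no longer exist in the source,
        since deletes are not propagated automatically."""
        for key in list_keys(replica) - list_keys(source):
            s3.delete_object(Bucket=replica, Key=key)

    prune_replica("my-backups", "my-backups-replica")
    ```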

    For me B2 is a great fit and I am happy with it, but I don’t want to mislead people.


  • I think it depends on your needs. IIUC their storage is “single location”, so a very significant natural disaster could take it offline or maybe even lose it. Something like S3 or Google Cloud Storage (depending on which durability you select) is multi-location (as in significantly distinct geographical regions). So it is still very likely that you will never lose any data, but in extreme cases you potentially could.

    If I were storing my only copy of something it would matter a lot more (although even then you are best off storing with multiple providers, for social reasons as well as technical ones), but for a backup it is fine.
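
    To put rough numbers on that reasoning (the probabilities here are made up purely for illustration): keeping independent copies with separate providers multiplies their individual failure probabilities, assuming the failures really are independent, which is part of why multiple providers beat a single one.

    ```python
    # Illustrative, made-up annual loss probabilities; real providers quote
    # durability figures like "eleven nines" for replicated object storage.
    p_single_site = 1e-4    # assumed yearly chance a single-site copy is lost
    p_multi_region = 1e-9   # assumed yearly chance for multi-region storage

    # Two independent copies are lost only if both fail in the same year.
    p_two_providers = p_single_site ** 2

    print(f"single-site copy:       {p_single_site:.0e}")
    print(f"multi-region copy:      {p_multi_region:.0e}")
    print(f"two independent copies: {p_two_providers:.0e}")
    ```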



  • there will be scaling with all of its negative consequences on perceived quality

    In theory this is true. If you had a nice high-bitrate 1080p video, it may look better on a 1080p display than any quality of 1440p video would, due to losses while scaling. But in almost all cases selecting a higher resolution will provide better perceived quality due to the higher bitrate, even if it isn’t an integer multiple of the display size.

    It would also be more bandwidth-efficient to target the output size directly. But streaming services want to keep the number of different versions small, and often this will already be >4 resolutions and 2-3 codecs. If they also wanted low/medium/high for each resolution, that would be a significant cost (the encoding itself, storage, and a reduction in cache hits). So they sort of squish resolution and quality together into one scale: 1080p isn’t just 1080p, it also serves as a general “medium” quality. If you want “high” you need to go to 1440p or 2160p even if your output is only 1080p.
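
    A back-of-the-envelope sketch of that squished ladder (the bitrates below are made up, just shaped like a typical ladder): stepping up a rung roughly doubles the bits delivered, which is why the 1440p rung effectively acts as the “high” tier for a 1080p display.

    ```python
    # Hypothetical streaming ladder: (label, width, height, bitrate in kbit/s).
    # The numbers are illustrative, not any real service's encodes.
    ladder = [
        ("480p",   854,  480,  1_000),
        ("720p",  1280,  720,  2_500),
        ("1080p", 1920, 1080,  5_000),
        ("1440p", 2560, 1440, 10_000),
        ("2160p", 3840, 2160, 20_000),
    ]

    FPS = 24  # assumed frame rate

    for label, w, h, kbps in ladder:
        bpp = (kbps * 1000) / (w * h * FPS)  # bits per pixel per frame
        print(f"{label:>6}: {kbps / 1000:>5.1f} Mbit/s, {bpp:.3f} bits/pixel")
    ```

    Even after downscaling to a 1080p panel, the 1440p rung carries twice the bits of the 1080p rung, which in practice outweighs the scaling loss.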


  • For me the biggest benefit is the ease of applying patches. For example, in Nix I can easily take a patch that is either unreleased or that I wrote myself and apply it to my systems immediately. I don’t need to wait for it to be released upstream and then packaged in my distro. This allows me to fix problems and get new features quickly without needing to mess with my system in any other way: no packages in other directories that need to be cleaned up, no extra steps to remember after updates, no cases where some packages are using different versions, and no breakage due to library ABI breaks.

    Another benefit, which you are pointing at, is changing build flags. Often I want to enable an optional feature that my distro doesn’t enable by default.

    Lastly, building packages with different micro-architecture optimizations can be beneficial. I don’t do this often, but occasionally, if I want to run some compute-heavy work, it can be nice to get a small performance boost.


    the reason no one posts the bitrates is because it’s not exactly interesting information for the general population.

    But they post resolutions, which are arguably less interesting. The “general public” has been taught to use resolution as a proxy for quality. For TVs and other screens this is mostly true, but for video it isn’t the best metric (lossless video aside).

    Bitrate is probably a better metric, but even then it isn’t great. Different codecs and encoding settings can result in much better quality at the same bitrate. But I think in most cases it correlates better with quality than resolution does.

    The ideal metric would probably be some sort of actual perceptual quality measurement, but none of those are perfect either. Maybe we should just go back to Low/Med/High for quality descriptions.




  • require a separate device that looks like a calculator to use online banking

    To be fair, this actually provides a very high level of security. At least in my experience with AIB (in Ireland), you needed to enter the amount of the transaction and some other core details (maybe part of the recipient’s account number? I can’t quite recall). Then you entered your PIN. This signed the transaction, which provides very strong verification that you (via the PIN) authorized the specific transaction via a trusted device that is very unlikely to be compromised (unless you give someone physical access to it).
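
    As a rough illustration of why that is strong (a simplified HMAC-based sketch, not the actual EMV CAP protocol those devices implement; the key and IBAN below are made up): the code the device displays commits to the exact transaction details, so a tampered amount or recipient produces a code the bank will reject.

    ```python
    import hmac
    import hashlib

    # Hypothetical shared secret provisioned into the device; real readers
    # use a key held on the inserted bank card, unlocked locally by the PIN.
    DEVICE_KEY = bytes.fromhex("00112233445566778899aabbccddeeff")

    def sign_transaction(amount_cents: int, recipient_account: str) -> str:
        """Return a short numeric code committing to the transaction details.

        The bank recomputes the same code server-side; it only matches if the
        amount and recipient typed into the device are exactly the ones the
        bank is being asked to execute.
        """
        message = f"{amount_cents}:{recipient_account}".encode()
        digest = hmac.new(DEVICE_KEY, message, hashlib.sha256).digest()
        # Truncate to 8 decimal digits so it fits on a calculator display.
        return f"{int.from_bytes(digest[:4], 'big') % 10**8:08d}"

    print(sign_transaction(12_500, "IE29AIBK93115212345678"))
    ```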

    It is obviously quite inconvenient, but it provides a huge level of security. Unlike this SafetyNet crap, which is currently quite easy to bypass.


  • which is supposed to enforce to run apps in secured phones

    The point of the Google Play Integrity API is to ensure that the user is not in control of their phone, but that one of a small number of megacorps is in control.

    Can the user pull their data out of apps? Not acceptable. Can the user access the app file itself? Not acceptable. Can the user modify apps? Not acceptable.

    Basically it ensures that the user has no control over their own computing.




  • To put it another way, you want to be using all of your RAM and swap. It becomes a problem only if you are frequently reading from swap. (Writing isn’t usually as much of an issue, since writes may be done proactively so the memory can be reclaimed quickly if needed.)

    Basically, a perfect OS would use RAM + swap such that the fewest disk reads need to be issued. This can mean swapping out some idle anonymous memory so that the space can be used as disk cache for hotter data.

    In this screenshot the OS decided that it was better to swap out 3 GiB of something and use that space for the disk cache (“Cached”). It is likely right about this decision (though it isn’t always).

    3 GiB does seem a bit high. But if you have lots of processes running that use memory while sitting mostly idle, it could definitely happen. For example, I often have lots of language servers running in my IDE, but many of them are for projects that I am not actively looking at, so they are just waiting for something to happen. They often take lots of memory, and it can make sense to swap them out until they are used again.
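
    One hedged way to tell the difference between harmless idle pages sitting in swap and actual thrashing on Linux is to watch the kernel’s cumulative swap-in counter; a steadily climbing pswpin means the system really is reading from swap frequently.

    ```python
    import time

    def swap_in_pages() -> int:
        """Read the cumulative count of pages swapped in from /proc/vmstat."""
        with open("/proc/vmstat") as f:
            for line in f:
                if line.startswith("pswpin "):
                    return int(line.split()[1])
        return 0

    # Sample the counter over a few intervals: a large, sustained delta means
    # the system is actively reading from swap (bad); a flat counter means the
    # swapped-out pages are just idle (fine, as described above).
    prev = swap_in_pages()
    for _ in range(5):
        time.sleep(10)
        cur = swap_in_pages()
        print(f"pages swapped in over the last 10s: {cur - prev}")
        prev = cur
    ```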


    1. Launching Steam games outside of Steam can be very difficult. Some games outright won’t allow it.
    2. Steam provides native libraries such as the overlay, networking and matchmaking tools, achievements… You would need Windows versions of these, which aren’t distributed by default in the Linux version of Steam.
    3. In the past Steam just didn’t run under Linux, so you had no other option.