• atomicbocks@sh.itjust.works · 5 hours ago

    This was something I actually used to do with my one gigabyte hard drive in my Windows 95 machine. Though turning on drive compression through Windows would cause the machine to be soooo fucking sloooooow.

    • NONE@lemmy.world · 10 hours ago

      I’m still amazed at how much 7z can compress. The other day I extracted a 1.5 GB file, and it ended up being 4.7 GB.

      • Nawor3565@lemmy.blahaj.zone · 8 hours ago

        That sounds like it was a DVD image that was mostly empty space, so any compression tool would have been able to save space. But yes, 7z is still impressive.
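        The mostly-empty-image effect is easy to reproduce with any compressor. A quick sketch using gzip on a zero-filled file standing in for the empty space in a disc image (file names are made up):

        ```shell
        # Create a 100 MB file of zeros, standing in for the empty
        # space in a mostly-blank disc image
        dd if=/dev/zero of=disk.img bs=1M count=100 status=none

        # Compress it; long runs of zeros collapse to almost nothing
        gzip -k disk.img

        # The .gz ends up a tiny fraction of the original size
        ls -l disk.img disk.img.gz
        ```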

    • gandalf_der_12te@lemmy.blahaj.zone · 1 hour ago

      yeah but it doesn’t make much sense

      what’s actually taking up disk space is images and video, and those are already compressed. compressing already-compressed data again barely reduces the file size (it can even grow it slightly). so there’s no point compressing everything: where it matters, it doesn’t do anything, and where it doesn’t matter, it makes things slow and adds complexity, which is an additional failure point.
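      That intuition is easy to check with gzip: repetitive text shrinks enormously, but random bytes (a stand-in for already-compressed JPEG/MP4 data) don’t shrink at all. A sketch, with made-up file names:

      ```shell
      # Highly compressible input: the same line repeated to ~1 MB
      yes "some repetitive log line" | head -c 1000000 > text.dat

      # Incompressible input: random bytes, which behave like
      # already-compressed image/video data
      head -c 1000000 /dev/urandom > random.dat

      gzip -k text.dat random.dat

      # text.dat.gz is a few kilobytes; random.dat.gz is slightly
      # LARGER than the 1 MB input, because gzip adds header and
      # block overhead to data it cannot shrink
      ls -l text.dat.gz random.dat.gz
      ```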

    • Trainguyrom@reddthat.com · 1 hour ago

      In short, yes. Windows has a checkbox to enable NTFS compression on a given volume (it’s in the drive’s Properties dialog), so you can in effect enable filesystem compression on the C: drive. Through the command line you can also compress individual directories with different compression algorithms. I had incredible luck compressing game files with LZX compression, with some games compressing down by 3 or 4 times (these were notably games with hundreds of gigabytes of user-generated assets). More typical games only saw around a 20–40% reduction in storage use, which isn’t bad at all.
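      The per-directory tool being described is `compact.exe`. A sketch of the commands (run in an elevated `cmd.exe` prompt; the game path is made up):

      ```shell
      # Query the current compression state of a game folder
      compact /s:"C:\Games\SomeGame"

      # Compress the folder and its subfolders with the LZX algorithm
      # (other options: XPRESS4K, XPRESS8K, XPRESS16K)
      compact /c /s:"C:\Games\SomeGame" /exe:lzx
      ```

      LZX is the strongest (and slowest) of the algorithms `compact` offers, which is why it pays off on large, rarely-rewritten game assets.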

      Outside of Windows there are popular filesystems like ZFS and btrfs which support filesystem compression, and enabling it is generally encouraged, because the speed of (de-)compression after reading from disk is almost always faster than just reading the uncompressed data from the disk directly
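      Turning this on is a one-liner on both filesystems. A sketch (device, mountpoint, and dataset names are examples, not defaults):

      ```shell
      # btrfs: transparent zstd compression via a mount option
      mount -o compress=zstd /dev/sdb1 /mnt/data

      # ...or persistently in /etc/fstab:
      # /dev/sdb1  /mnt/data  btrfs  compress=zstd  0 2

      # ZFS: compression is a per-dataset property, applied to
      # newly written blocks from then on
      zfs set compression=lz4 tank/data
      ```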

    • Wirlocke@lemmy.blahaj.zone · 1 hour ago

      I know the opposite can be done because I did it just recently.

      I have a nearly 10-year-old setup from when it still made sense to pair a 200 GB SSD with a 2 TB HDD for games. The hard drive absolutely struggles with massive games like Baldur’s Gate 3 and Cyberpunk (and Baldur’s Gate 3 has the annoying habit of not waiting for assets to finish loading before playing a cutscene).

      I used this thing called bcache to turn a 100 GB partition of my SSD into an automatic cache for the most frequently accessed data on the HDD. Even though Baldur’s Gate 3 is 120 GB (which I don’t think it needs to be; I think it’s poorly optimized), that was still enough to mostly get rid of the loading issues.

      To make this relevant to your question, you could get a massive cheap but slow hard drive or even an external drive and use something like bcache to get the performance of your internal SSD.
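      For anyone curious, the bcache setup is only a few commands. A sketch, assuming the HDD is /dev/sdb1 and the SSD cache partition is /dev/nvme0n1p3 (both names are examples; this destroys existing data on those partitions):

      ```shell
      # Format the HDD partition as the backing device
      make-bcache -B /dev/sdb1

      # Format the SSD partition as the cache device
      make-bcache -C /dev/nvme0n1p3

      # Attach the cache set to the backing device; get the cset
      # UUID from: bcache-super-show /dev/nvme0n1p3
      echo <cset-uuid> > /sys/block/bcache0/bcache/attach

      # Put the filesystem on the combined device and mount that
      # instead of the raw HDD
      mkfs.ext4 /dev/bcache0
      ```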

    • Scarlet0952@lemmy.blahaj.zone · 8 hours ago

      Some filesystems (like btrfs and ZFS) support compression on the filesystem level, where each block is compressed with some algorithm automatically, completely transparently to applications.

      Most modern CPUs are fast enough at the light compression levels used that it’s usually also faster: you read less data from disk, and the read + decompress time is often lower than reading the full uncompressed data would have been. That depends on the data, of course, but overall it’s often faster (though usually not by a very significant amount) for most average uses.
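      The “light levels” point can be felt even with gzip’s fastest setting: at -1 a text-like file still shrinks substantially, and the roundtrip is lossless, which is the property that lets a filesystem do this transparently. A sketch with made-up file names:

      ```shell
      # Make a ~5 MB text-like file
      yes "transparent filesystem compression test line" | head -c 5000000 > sample.txt

      # Fastest (lightest) compression level, analogous to the light
      # settings filesystems pick for speed
      gzip -1 -k sample.txt

      # Decompress to a copy and verify the roundtrip is byte-identical
      gzip -d -c sample.txt.gz > roundtrip.txt
      cmp sample.txt roundtrip.txt && echo "lossless"

      ls -l sample.txt sample.txt.gz
      ```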