Making a rar archive of your windows folder doesn’t leave your OS unbootable…
This was something I actually used to do with my one gigabyte hard drive in my Windows 95 machine. Though turning on drive compression through Windows would cause the machine to be soooo fucking sloooooow.
Rookie mistake, they should have compressed the rar file with 7zip to save even more space.
I’m still amazed at how much 7z can compress. The other day I extracted a 1.5 GB file, and it ended up being 4.7 GB.
That sounds like it was a DVD image that was mostly empty space, so any compression tool would have been able to save space. But yes, 7z is still impressive.
wrong, should’ve used 7z or zip under LZMA2 for best compression in this case
question, is there a way to use compression on the whole system, so files are decompressed on the fly?
Trade some speed for overall storage space?
yeah but it doesn’t make much sense
what’s actually taking up disk space is images and video, and those are already compressed. compressing already-compressed data barely reduces the filesize, because there’s almost no redundancy left for the algorithm to exploit. so there’s no point compressing everything: where it matters, it doesn’t do anything, and where it doesn’t matter, it makes things slow and adds complexity, which is an additional failure point.
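You can see this for yourself with any compressor (a sketch assuming GNU coreutils and gzip): random bytes stand in for already-compressed data like JPEG/MP4, and repetitive text stands in for compressible data.

```shell
# random bytes ~ already-compressed data (JPEG, MP4): no redundancy left
head -c 1000000 /dev/urandom > random.bin
# repetitive text: tons of redundancy
yes "the quick brown fox jumps over the lazy dog" | head -c 1000000 > text.txt
gzip -k random.bin text.txt
ls -l random.bin.gz text.txt.gz
# random.bin.gz ends up slightly LARGER than 1 MB; text.txt.gz shrinks to a few KB
```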
In short, yes. Windows has a checkbox to enable NTFS compression on a given volume (right-click the drive in Explorer and open Properties, I think, I can’t remember now), so you can in effect enable filesystem compression on the C: drive. Through the command line you can also compress individual directories with different compression algorithms, and I had incredible luck compressing game files with LZX: some games compressed down to a third or a quarter of their size (these were notably games with hundreds of gigabytes of user-generated assets; more normal games only saw around a 20-40% reduction in storage space usage, which isn’t bad at all).
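That per-directory compression is done with compact.exe. A rough sketch of the commands (run in an elevated cmd.exe prompt on Windows; `D:\Games\SomeGame` is a made-up example path):

```shell
:: query the current compression state of a folder
compact /s:"D:\Games\SomeGame"

:: compress everything under the folder with LZX (strongest of the algorithms)
compact /c /s:"D:\Games\SomeGame" /exe:lzx

:: undo it later (the /exe flag is needed to uncompress LZX-compressed files)
compact /u /s:"D:\Games\SomeGame" /exe
```

LZX gives the best ratio but the slowest decompression; the XPRESS4K/8K/16K algorithms are lighter options for files that get read often.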
Outside of Windows, there are popular filesystems like ZFS and btrfs which support filesystem compression and often enable it by default, because decompressing after reading from disk is almost always faster than just reading the uncompressed data from the disk directly.
I know the opposite can be done because I did it just recently.
I have a nearly 10-year-old setup from when it still made sense to have a 200 GB SSD with a 2 TB HDD for games. The hard drive is absolutely struggling with these massive games like Baldur’s Gate 3 and Cyberpunk (and Baldur’s Gate 3 has the annoying habit of not waiting for assets to finish loading before playing a cutscene).
I used this thing called bcache to turn a 100 GB partition of my SSD into an automatic cache for the most frequently used data on the HDD. Even though Baldur’s Gate 3 is 120 GB (which I don’t think it needs to be, I think it’s poorly optimized), it was still enough to mostly get rid of any loading issues.
To make this relevant to your question, you could get a massive cheap but slow hard drive or even an external drive and use something like bcache to get the performance of your internal SSD.
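For reference, the bcache setup is roughly this (a sketch only: `/dev/sda2` and `/dev/sdb1` are placeholder devices, everything needs root, and make-bcache destroys whatever is on the partitions it formats):

```shell
# format the SSD partition as a cache device, the HDD partition as a backing device
make-bcache -C /dev/sda2
make-bcache -B /dev/sdb1

# find the cache set's UUID and attach the backing device to it via sysfs
bcache-super-show /dev/sda2 | grep cset.uuid
echo <cset-uuid-from-above> > /sys/block/bcache0/bcache/attach

# the combined device shows up as /dev/bcache0; put a filesystem on that
mkfs.ext4 /dev/bcache0
```

Reads that hit the SSD cache come back at SSD speed; everything else falls through to the HDD, and bcache promotes frequently read blocks into the cache over time.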
Some filesystems (like btrfs and ZFS) support compression at the filesystem level, where each block is compressed with some algorithm automatically, completely transparently to applications.
Most modern CPUs are fast enough at the light compression levels used that it’s usually also faster: you read less data, and the read + decompress time is often lower than reading the larger uncompressed data would have taken. That of course depends on the data, but overall it’s often faster (though usually not by a very significant amount) for most average uses.
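Enabling it is a one-liner on both (the mount point and dataset names below are made up, and all of this needs root):

```shell
# btrfs: transparent zstd compression for everything written after mounting
mount -o compress=zstd:3 /dev/sdb1 /mnt/data
# existing files only get recompressed when rewritten, or explicitly via:
btrfs filesystem defragment -r -czstd /mnt/data

# ZFS: compression is a per-dataset property, applied to new writes
zfs set compression=lz4 tank/games
zfs get compression tank/games
```

On both filesystems, incompressible blocks (like the already-compressed game videos mentioned above) are detected and stored as-is, so the overhead on that data stays small.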
It’s existed since DOS. It doesn’t work all that well. But it does work if you’re desperate for space.
https://en.wikipedia.org/wiki/DriveSpace
https://www.tomshardware.com/reviews/ssd-ntfs-compression,3073-11.html





