But it appears that we’re in a situation where it’s not used for specific cases but for lots of different things. Just a few Flatpak programs start to chew through a significant amount of disk space, and some programs are only being distributed as Flatpaks.
Flatpak distribution is generally done by the developer as a common packaging method. If a distribution wants a native install, it’s up to that distribution’s package maintainers to support the application. The package maintainers then have to make sure they’re packaging the right versions of every dependency, which is the problem known as dependency hell.
In your HandBrake example, it’s true the main application is pretty small, but that’s because it leans on libraries and is essentially a wrapper around FFmpeg. Even if you install through a package manager, the fair comparison is the total size including dependencies, not just the application package itself.
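For what it’s worth, here’s a rough Python sketch of that comparison on a Debian/Ubuntu-style system. It only adds up the direct dependencies reported by apt-cache (not the full transitive closure), and the package name is just an assumption, so treat the result as a ballpark lower bound:

```python
#!/usr/bin/env python3
"""Rough sketch: sum the Installed-Size of a package plus its direct
dependencies, for comparison with the on-disk size of the equivalent
Flatpak and its runtime. Assumes a Debian/Ubuntu system with apt-cache;
the package name is a placeholder, adjust for your distro."""
import re
import subprocess

def installed_size_kib(pkg):
    # apt-cache show reports Installed-Size in KiB
    out = subprocess.run(["apt-cache", "show", pkg],
                         capture_output=True, text=True).stdout
    m = re.search(r"^Installed-Size:\s*(\d+)", out, re.MULTILINE)
    return int(m.group(1)) if m else 0

def direct_deps(pkg):
    # Parse the "Depends:" lines from apt-cache depends (direct deps only)
    out = subprocess.run(["apt-cache", "depends", pkg],
                         capture_output=True, text=True).stdout
    return [line.split(":", 1)[1].strip()
            for line in out.splitlines()
            if line.strip().startswith("Depends:")]

if __name__ == "__main__":
    pkg = "handbrake"  # assumed package name; check what your repo calls it
    total_kib = installed_size_kib(pkg) + sum(installed_size_kib(d)
                                              for d in direct_deps(pkg))
    print(f"{pkg} + direct deps: ~{total_kib / 1024:.1f} MiB")
```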
The disk space usage becomes a problem when the same libraries end up installed both natively and inside the sandboxed runtimes. However, if you keep a relatively small system install and install applications through Flatpak, the extra disk usage will be pretty negligible. If disk space is really a concern, something like Btrfs with compression plus deduplication would probably solve most of it.
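If you want a feel for how much a dedup-capable filesystem could actually reclaim, here’s a rough sketch that hashes files under the system library directory and the Flatpak store and totals the duplicated bytes. The paths are assumptions, it can take a while to run, and it skips hard-linked copies so they aren’t counted as waste:

```python
#!/usr/bin/env python3
"""Rough estimate of how many bytes file-level dedup could share, by
content-hashing everything under the system library dir and the Flatpak
store. Paths are assumptions; run with enough privileges to read them."""
import hashlib
import os
import stat

ROOTS = ["/usr/lib", "/var/lib/flatpak"]  # assumed locations to scan

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

seen_hashes = {}    # content hash -> size of first copy seen
seen_inodes = set() # (dev, inode) pairs, so hard links aren't double-counted
duplicate_bytes = 0

for root in ROOTS:
    for dirpath, _, names in os.walk(root):
        for name in names:
            p = os.path.join(dirpath, name)
            try:
                st = os.stat(p, follow_symlinks=False)
            except OSError:
                continue
            if not stat.S_ISREG(st.st_mode):
                continue  # skip symlinks, devices, etc.
            if (st.st_dev, st.st_ino) in seen_inodes:
                continue  # same inode already counted (hard link)
            seen_inodes.add((st.st_dev, st.st_ino))
            try:
                digest = sha256_of(p)
            except OSError:
                continue
            if digest in seen_hashes:
                duplicate_bytes += st.st_size  # a dedup-capable fs could share this copy
            else:
                seen_hashes[digest] = st.st_size

print(f"Duplicate content across {ROOTS}: ~{duplicate_bytes / 2**30:.2f} GiB")
```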
It just seems like a lot of papering over a fairly substantial problem. While the example I gave was HandBrake, which admittedly might look like a special case, every other piece of software whose Flatpak version I checked also had ludicrously wasteful storage requirements.
I’m aware of dependency hell, but it seems to me that most software doesn’t run into it, not if the libraries are sensibly maintained? After all, the fact that upgrading one library can improve all the software that uses it is usually a positive thing. And the ballooning storage requirements of Flatpak make it a tool that should be used occasionally, rather than as the primary way to release software. Using a filesystem that can detect duplicates would help, but that is itself a special-case kind of fix, and not a great thing to have to reach for just to work around what seems to me to be a significant issue.