Try goharbor.io, that’s what I use. I think (but I’m not sure) that Forgejo/Gitea and Gitlab can also cache images.
I have limited Python experience, but I always thought that’s what virtualenvs and requirements.txt files are for? When I used those, I found them easy enough to use.
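For what it’s worth, the basic workflow is short enough to sketch. The `.venv` directory name is just a convention, not a requirement:

```shell
python3 -m venv .venv              # create an isolated environment in ./.venv
. .venv/bin/activate               # activate it in the current shell
pip freeze > requirements.txt      # pin the package versions currently installed
# to recreate the same environment elsewhere:
#   python3 -m venv .venv && . .venv/bin/activate
#   pip install -r requirements.txt
```

Commit requirements.txt next to your code and every machine gets the same dependency versions.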
Cloud-init. The config YAML is rather straightforward, but I can’t convince my VM to execute it, and it’s driving me nuts.
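In case it helps anyone else stuck here: cloud-init ships some diagnostics of its own. Run these on the VM itself; the exact subcommands depend on your cloud-init version, and `/path/to/user-data` is a placeholder for your config file:

```shell
cloud-init status --long                                  # did cloud-init run, and did it finish?
sudo cloud-init schema --config-file /path/to/user-data   # validate the YAML (newer versions)
less /var/log/cloud-init.log                              # detailed log of what was (not) executed
less /var/log/cloud-init-output.log                       # stdout/stderr of the modules that ran
```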
Good to hear! With the National Archives UK, you can’t go wrong. They have some very, VERY competent people on staff over there, who are also quite active in the DigiPres community. They are also the inventors of DROID and the maintainers of the widely used PRONOM database of file formats: https://www.nationalarchives.gov.uk/PRONOM/Default.aspx Absolute heroes of Digital Preservation.
Yeah, you can always go crazy with (off site) copies. There’s a DigiPres software system literally called LOCKSS (Lots Of Copies Keep Stuff Safe).
The German Federal Office for Information Security recommends a distance of at least 200km between (professional) sites that keep georedundant copies of the same data/service, so depending on your upload capacity and your familiarity with encryption (ALWAYS backup your keys!), some cloud storage provider might even be a viable option to create a second site.
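For the encryption part, symmetric `gpg` is one low-friction option. A sketch — the file names and the inline passphrase are placeholders; in real use, drop the `--batch`/`--pinentry-mode`/`--passphrase` options and let gpg prompt you:

```shell
printf 'example payload\n' > backup.tar        # stand-in for your real archive

# encrypt before upload (non-interactive flags only so this runs unattended here)
gpg --batch --pinentry-mode loopback --passphrase 'CHANGE-ME' \
    --symmetric --cipher-algo AES256 --output backup.tar.gpg backup.tar

# decrypt after download
gpg --batch --pinentry-mode loopback --passphrase 'CHANGE-ME' \
    --decrypt --output restored.tar backup.tar.gpg
```

Only the encrypted `.gpg` file ever leaves your site; lose the passphrase and the backup is gone, hence: ALWAYS backup your keys.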
Spare drives do absolutely work as well, but remember that, depending on the distance, data there will get more or less outdated and you might not remember to refresh the hardware in a timely manner.
A safe deposit box is something that I hadn’t considered for my personal preservation needs yet, but sounds like a good idea as well.
Whatever you use, also remember to read back data from all copies regularly and recalculate checksums for fixity checks to make sure your data doesn’t get corrupted over time. Physical objects (like books) decay slowly over time, digital objects break more spontaneously and often catastrophically.
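A minimal fixity routine can be as simple as a checksum manifest. Sketch below; the `archive/` directory and its contents are stand-ins for your real copies:

```shell
mkdir --parents archive && date > archive/example.txt   # stand-in for your real copies

# once, right after writing a copy: record a checksum for every file
find archive/ -type f -print0 | xargs --null sha256sum > manifest.sha256

# on every later check: re-read all files and compare against the manifest
sha256sum --check --quiet manifest.sha256 && echo "fixity OK"
```

Any bit rot shows up as a mismatch on the next check, which is your cue to restore that file from one of the other copies.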
This is my day job, so I’d like to weigh in.
First of all, there’s a whole community of GLAM institutions involved in what is called Digital Preservation (try googling that specifically). Here in Germany, a lot of them have founded the Nestor Group (www.langzeitarchivierung.de) to further the cause and share knowledge. Recently, Nestor had a discussion group on Personal Digital Archiving, addressing just your use case. They have set up a website at https://meindigitalesarchiv.de/ with the results. Nestor publishes mostly in German, but online translators are a thing, so I think you will be fine.
Some things that I want to address from your original post:
Come back at me if you have any further questions.
400 staff German state institution, Windows desktops are standard, but you can get a supported and standardized Linux Mint installation provided by IT on your personal computer upon request. A few dozen people do. We also provide some 150 publicly accessible PCs for research in our branch locations, all of which are Mint as well. And IT staff are allowed to install any system on their hardware they want, no questions asked; many run Linuxes. Linuces. Linnixees.
Thx kind stranger.
+1 for gpodder, and if you have a Nextcloud, you can sync your subscriptions using an NC plugin that I can’t remember the name of.
Yes, such a program is called an installer. /s
Sorry, I don’t have an answer for you that’s more helpful than the rest of the comments here, they all did well. I second booting a live system.
If you have a network share available on your LAN, you might want to try the FolderSync app. It can make your phone sync its photos every time you’re on your WiFi and plugged into the charger.
Alternatively, if you have NextCloud, the NextCloud App can do that for you.
Metube might be right for you.
Neither does mine, but I keep it to test a new tool from time to time.
Rest of the list:
DNS tools:
Good stuff for pentesters and security researchers:
### .bashrc

```bash
### CUSTOM FUNCTIONS
# https://www.linuxjournal.com/content/boost-productivity-bash-tips-and-tricks

# full-text search: grep recursively for a pattern, page through colored results
ftext () {
    grep -iIHrn --color=always "$1" . | less -R -r
}

# find duplicate files: group files by size first, then confirm by md5 hash
duplicatefind () {
    find . -not -empty -type f -printf "%s\n" | sort -rn | uniq -d | \
        xargs -I{} -n1 find . -type f -size {}c -print0 | \
        xargs -0 md5sum | sort | uniq -w32 --all-repeated=separate
}

# render a QR code for the given text right in the terminal
generateqr () {
    # printf "%s" "$*" | curl -F-=\<- qrenco.de
    printf "%s" "$*" | qrencode -t UTF8 -o -
}
```
Bash, because I never had the time to learn anything else. A plain `#!/bin/bash` setup is just fine for me, though I’ve customized it using Starship and created some aliases to have colored/pretty output where possible.

Run `shellcheck` before running your scripts in production, err on the side of caution, and `set -o pipefail`. There are best-practices guides for Bash; use those and you’ll probably be fine.

Use `set -x` inside your Bash script or `bash -x scriptname` on the CLI for debugging. Remember that you can always fall back to the interactive CLI to test/prepare commands before you put them into your script. Think before you type. Test. Optimize only what needs optimization. Use long options for readability. And remember: always code as if the guy who ends up maintaining your code will be a violent psychopath who knows your address.

> I switched to fish because it has tab completion

Yeah, so does Bash, just install it.
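The advice above boils down to a skeleton like this — the backup path and messages are made up for illustration:

```shell
#!/bin/bash
set -o errexit    # abort on the first failing command
set -o nounset    # treat unset variables as errors
set -o pipefail   # a pipeline fails if any stage in it fails

# long options for readability; "${1:-...}" keeps nounset happy when no arg is given
target_dir="${1:-/tmp/backup-demo}"
mkdir --parents -- "$target_dir"
printf 'working in %s\n' "$target_dir"
```

Run `shellcheck` on it and step through with `bash -x` before trusting it with anything important.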
Oh, I also “curate” a list of Linux tools that I like, that are more modern alternatives to “traditional” Linux tools or that provide information I would otherwise not easily get. I’ll post it below.
Debian packages available:
No Debian package available:
___
I love the clock, but it doesn’t seem to be part of the launcher; at least I couldn’t find it after installing the launcher. Where can I find it?
EDIT: I just realized you’re running GrapheneOS. Does the clock come with Graphene? Nice background pic btw!
Uuuuh, weird, I love it!
With a username like this, I’d give all my hosts and servers moon names. Like the moons of Jupiter (Io, Europa, Ganymede, and Callisto).
Gitlab at work, because, well, it’s there and it works just fine.
Forgejo at home, because it’s far less resource hungry.
In the end, Git is (a) a command-line tool for (b) distributed work, so it really doesn’t matter much which central web service you put in place; you can always get your local copy via `git clone REPO`.