I could research this on my own, but was interested in hearing from the community.

Software tends to fall into categories based on who has control, how it is accessed, and who owns the data.

For instance, a FOSS project might host encrypted user data for free, and the user easily controls who can access it, but if the server/service goes down, users lose access to everything. Or a user keeps their own offline files that they control 100%, but sharing is more cumbersome.

Where does git fall on this spectrum? It seems like a mix, where authoritative copies may be offline at times before being merged back into the hosted version. It's hosted, but it can be self-hosted, and multiple copies of the code can be offline as well. Does it rely on a central hosting source, and on a company willing to support the software?
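
From what little I've seen, every clone of a git repository is a complete copy with the full history, so you can keep working offline and then push to one host, several hosts, or none at all. Something roughly like this, I think (the URLs and remote names here are made up, so correct me if I've got this wrong):

```
# Clone a repository; the clone contains the full history, not just a working copy.
git clone https://example.com/some-project.git
cd some-project

# Work entirely offline: commits go into the local copy.
git commit -am "Fix a typo in the README"

# When back online, push to the original host...
git push origin main

# ...or add a second remote (e.g. a self-hosted server) and push there too.
git remote add selfhosted https://git.my-own-server.example/some-project.git
git push selfhosted main
```

If that's right, then the "central" host is more of a convention than a technical requirement.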

I’ve never contributed to a project with version control before, though I’ve worked in a few places that used JIRA or git. I’m interested in how it works, and I’m just curious to read a Lemmy discussion while it’s raining where I am.

(As I prepare to press SUBMIT it occurs to me this is a FOSS question more than a Linux one. If this is a stupid post for this /r/, please report/remove or ask me to and I will.)

  • Eideen@lemmy.world · 1 year ago

    To be maintained, any software needs to be supported. If it is not supported and developed, other options will prevail.