List Of Ethical Code Hostings by 3rd-Parties
8) Notabug (Gogs)
9) Framasoft (GitLab)
10) Sourcehut (Sourcehut)
Wow, even @ReverseEagle retooted my post. Thank you very much for bringing @codeberg and everything else to the mainstream.
Hello @dibi58, long time no see and glad to find you again. Hope you are healthy and doing fine.
Wow, how lively it is today. I just came back to Mastodon and saw many retoots of my Ethical Code Hostings list. Even @datenschutzratgeber helped retoot it. Thank you very much to you all!
But could I ask for 2 features, if I'm permitted to? I think Codeberg needs a clear user guide & an explanation of each account feature.
Greetings from Indonesia!
(I'm not a programmer)
@codeberg looking at this post I just realised that there is a typo in your slogan. I suppose it should be: Give *your* free project a free home.
@marix hehe, thank you!
Very interesting list! Thanks for sharing. We looked around for git hosting + coding community - venture capital. We didn't find many options. But we are also happy we spotted and went for salsa.debian.org
@ademalsasa There's only one problem with all of these - survivability. Self hosted platforms are amazing (I run and love Gitea myself) but what happens when someone stops paying the hosting bills? Critical open source software could be lost forever. Not suggesting everyone should use Github, just that it's a problem worth thinking about.
Provided they all allow you to import/export data with an open protocol, that may not be such an issue.
Remember Nokia, Microsoft and Windows Mobile? (They weren't even particularly bad mobile phones, had good cameras and I used them for many years myself).
Microsoft walked away from that after just a few years in spite of pouring millions of dollars into the venture...
@vfrmedia Open protocols are important, but I'm not talking about data mobility, I'm talking about data *retention*. Being able to look at old source code, old versions of things, even projects no longer maintained or used can be a super important part of doing quality engineering. Why do you think people are still zealously hoarding V7 UNIX 30 years after it's become technically irrelevant? Because it's an important learning tool.
@feoh I agree with this, but to some extent the two are interlinked: if there's a way of storing this old data in multiple locations, it's more likely to survive. I have noticed in hobbies like amateur radio, restoring electronics, and retrocomputing there does seem to be at least some attempt to preserve useful historical data (including taking note that the humans who curate it aren't immortal, and many are getting on in years).
@vfrmedia For sure! What I'd love to see is a Github *LIKE* thing that could serve as a central hub for development funded by some company with very big pockets that can easily handle the associated costs, and a distributed protocol where the data can be federated so we don't lose the whole enchilada if the hub goes down :)
The nature of Git itself mitigates a lot of this.
Any developer with a Git clone has the full history of the project. (This includes the person who submitted a three-line patch to fix something nobody else was bothered by.)
If there are no developers at all then it is an unmaintained project and the lack of source is the least of the problems. It needs to be pulled from use ASAP.
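As a quick illustration of the point above, a minimal sketch (all paths are throwaway examples) showing that a plain `git clone` carries the entire commit history, not just the latest snapshot:

```shell
# Build a tiny repo with two commits (illustrative paths only)
rm -rf /tmp/demo-origin /tmp/demo-clone
git init -q /tmp/demo-origin
git -C /tmp/demo-origin -c user.email=demo@example.com -c user.name=demo \
    commit --allow-empty -q -m "first commit"
git -C /tmp/demo-origin -c user.email=demo@example.com -c user.name=demo \
    commit --allow-empty -q -m "second commit"

# A clone is a full copy of the repository, history included
git clone -q /tmp/demo-origin /tmp/demo-clone

# The clone reports the same commit count as the origin
git -C /tmp/demo-clone rev-list --count HEAD
```

This is why even the person who submitted a three-line patch ends up holding a complete backup of the project's history.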
@yam655 Wow oh wow do I disagree with this. Being able to examine historical source code is IMPORTANT. It's one thing to say "This is old. Don't use it. Go use THIS instead" but saying "Anything not in regular use should be erased from the face of the earth" feels like a VERY dangerous tack to take.
I see places for active development and business use as being distinct and different from data museums that don't see code changes and don't expect people to be using them in production.
Code as art. Code as history. Code as research ideas for new code. All are good, valid use-cases.
But they don't need functioning repositories that can be confused with things that are safe to have as core technology stacks for public use.
@yam655 I was a release engineer for just shy of two decades. You're NEVER gonna get me to sign up to that being a good idea. Let's agree to disagree.
@yam655 Also, I see things like BitTorrent as a refutation of your claim. Have you ever encountered a torrent that was still listed as active but didn't have enough people actually seeding it, so that it was in effect no longer available? I have.
Linus Torvalds didn't use source control at all for the Linux kernel for decades. He operated with the knowledge that enough people already had the source downloaded so that there was built-in redundancy.
If nobody has been making changes to the code, it has been abandoned. Abandoned code still in use is extremely dangerous and should be replaced ASAP.
Folks should be replacing abandoned code well before the source disappears. If it were important, it would have a maintainer.
You're right. It feels like longer, but even discounting BitKeeper (which I had entirely forgotten about) Git was in place in 2005, which made my statement about "decades" clearly an exaggeration.
It looks like I mostly ignored Git until 2011, which I could say is why I thought it was decades, but honestly, I think it's just a part of getting older.
@ademalsasa check sr.ht aka sourcehut. It's kind of special because it's not funded by stakeholders etc. but by paying users. It follows the kernel workflow (plaintext mailing lists, patches sent by email instead of PRs in a UI, and more).
To be a contributor to any project, just send an email + patch; no need to register.
To have your own repo, mailing list, and more, of course you need to register.
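The email-driven workflow described above can be sketched with stock Git commands. This is a hedged example: the repo path is throwaway, and the mailing-list address is hypothetical (real sourcehut list addresses follow the `~owner/project-devel@lists.sr.ht` pattern, but this one is made up):

```shell
# Throwaway repo with one commit, to have something to send (illustrative only)
rm -rf /tmp/patch-demo
git init -q /tmp/patch-demo
cd /tmp/patch-demo
echo "hello" > README
git add README
git -c user.email=you@example.com -c user.name=you commit -q -m "Add README"

# Turn the latest commit into a mailable patch file
git format-patch -1 HEAD    # writes 0001-Add-README.patch

# To contribute, you would then mail the patch to the project's list
# (requires git send-email and SMTP configuration; address is hypothetical):
#   git send-email --to="~owner/project-devel@lists.sr.ht" 0001-Add-README.patch
```

No account on the hosting platform is needed for this step; the patch arrives as a plain email that the maintainer can apply with `git am`.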
For people who care about, support, or build Free, Libre, and Open Source Software (FLOSS).