(I work at Mozilla, but not on the VCS tooling, or this transition)
To give a bit of additional context here, since the link doesn't have any:
The Firefox code has indeed recently moved from having its canonical home on mercurial at hg.mozilla.org to GitHub. This only affects the code; bugzilla is still being used for issue tracking, phabricator for code review and landing, and our taskcluster system for CI.
In the short term the mercurial servers still exist, and are synced from GitHub. That allows automated systems to transfer to the git backend over time rather than all at once. Mercurial is also still being used for the "try" repository (where you push to run CI on WIP patches), although it's increasingly behind an abstraction layer; that will also migrate later.
For people familiar with the old repos, "mozilla-central" is mapped onto the more standard branch name "main", and "autoland" is a branch called "autoland".
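For anyone who wants to poke at the new layout, something like this should work (the repo URL is inferred from the mozilla-firefox org mentioned elsewhere in this thread, so treat it as an assumption):

    # Clone and look at the renamed branches.
    git clone https://github.com/mozilla-firefox/firefox.git
    cd firefox
    git branch -r                        # expect origin/main, origin/autoland, ...
    git log --oneline -3 origin/main     # "main" is the old mozilla-central
    git log --oneline -3 origin/autoland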
It's also true that it's been possible to contribute to Firefox exclusively using git for a long time, although you had to install the "git cinnabar" extension. The choice between learning hg and using git+extension was a bit of an impediment for many new contributors, who most often knew git and not mercurial. Now that choice is no longer necessary. Glandium, who wrote git cinnabar, wrote extensively at the time this migration was first announced about the history of VCS at Mozilla, and gave a little more context on the reasons for the migration [1].
So in the short term the differences from the point of view of contributors are minimal: using stock git is now the default and expected workflow, but apart from that not much else has changed. There may or may not eventually be support for GitHub-based workflows (i.e. PRs) but that is explicitly not part of this change.
On the backend, once the migration is complete, Mozilla will spend less time hosting its own VCS infrastructure, which turns out to be a significant challenge at the scale, performance and availability needed for such a large project.
Thanks for the context. IMHO Mozilla shouldn't have decided to move to a closed-source platform owned by Microsoft.
fguerraz · 1h ago
Thanks to the decentralised nature of git, this should matter only moderately.
JeremyNT · 39m ago
Exactly, now they have the best of both worlds: let Microsoft host the code using a standard VCS, but avoid lock-in by continuing to use their own issue tracker and project management software.
iamcreasy · 5h ago
Thanks for the added context.
If I may - what were the significant scale challenges for the self-hosted solution?
jgraham · 4h ago
Again, I can only comment from the perspective of a user; I haven't worked on the VCS infrastructure.
The obvious generic challenges are availability and security: Firefox has contributors around the globe and if the VCS server goes down then it's hard to get work done (yes, you can work locally, but you can't land patches or ship fixes to users). Firefox is also a pretty high value target, and an attacker with access to the VCS server would be a problem.
To be clear I'm not claiming that there were specific problems related to these things; just that they represent challenges that Mozilla has to deal with when self hosting.
The other obvious problem at scale is performance. With a large repo, both read and write performance are concerns. Cloning the repo is the first step that new contributors need to take, and if that's slow then it can be a dealbreaker for many people, especially on less reliable internet. Our hg backend was using replication to help with this [1], but you can see from the link how much complexity that adds.
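(As an aside, for contributors on slow connections, stock git has some mitigations worth knowing about regardless of where the repo is hosted; a rough sketch, with the same assumed repo URL as above:)

    # Blobless partial clone: fetch commits and trees now, file contents on demand.
    git clone --filter=blob:none https://github.com/mozilla-firefox/firefox.git
    # Or a shallow clone, if you only need recent history:
    git clone --depth=100 https://github.com/mozilla-firefox/firefox.git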
Firefox has enough contributors that write contention also becomes a problem; for example, pushing to the "try" repo (to run local patches through CI) often ended up taking tens of minutes waiting for a lock. This was (recently) mostly hidden from end users by pushing patches through a custom "lando" system that asynchronously queues the actual VCS push rather than blocking the user locally, but that's more of a mitigation than a real solution (lando is still required with the GitHub backend because it becomes the place where custom VCS rules, which previously lived directly in the hg server but don't map onto GitHub features, are enforced).
why github and not codeberg?
bandwidth? $$$ from microsoft? (traffic, free training for copilot, ..)
GuB-42 · 3h ago
I would say that using GitHub only for a public git repository is pretty good value.
It is free and robust, and there is not much bad Microsoft can do to you. Because it is standard git, there is no lock-in. If they make a decision you don't like, migrating is just a git clone. As for the "training copilot" part, the code is public; it doesn't change anything that Microsoft hosts the project on their own servers, they can just get the source like anyone else, and they probably already do.
Why not Codeberg? I don't know, maybe bandwidth, but if that's standard git, making a mirror on Codeberg should be trivial.
That's why git is awesome. The central repository is just a convention. Technically, there is no difference between the original and the clone. You don't even need to be online to collaborate, as long as you have a way to exchange files.
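As a concrete illustration of the "exchange files" point, this is plain git with no server involved (branch names here are just examples):

    # On one machine: pack your work into a single file.
    git bundle create feature.bundle main..my-feature
    # Move feature.bundle via USB stick, email, whatever. On the other machine
    # (which must already have the commits on main):
    git fetch /path/to/feature.bundle my-feature:my-feature
    git checkout my-feature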
immibis · 2h ago
I am banned from GitHub because I didn't want to give them my phone number. They ignored a legally binding GDPR request to delete all my data. I haven't got around to suing them yet.
Recently I also got "rate limited" after opening about three web pages.
Microsoft can do something to you, and that is to arbitrarily deny you access after you've built a dependence on it, and then make you jump through hoops to get access back.
alabastervlog · 1h ago
> Recently I also got "rate limited" after opening about three web pages.
People who haven’t used it while logged out recently may be surprised to find that, for some time now, the site has been effectively unusable without an account. Doing one search and clicking a couple of results gets you temporarily blocked. It’s effectively an account-required website now.
baobun · 51m ago
At least you had the choice. Many potential contributors live in countries to which GitHub does not support SMS verification but still requires it. So there's a second tier of effectively blocked countries besides the officially sanctioned ones.
LadyCailin · 1h ago
This is kind of a weird hill to die on, but you’re well within your rights, so you do you.
However, it is clearly not correct to say that you were banned from GitHub. It’s like saying “I was banned from Google because I refuse to use computing devices.”
Not really a ban, just self-flagellation, which, again, whatever works for you.
immibis · 1h ago
Give me your social security number or you may not reply to my comments. If you don't give me your social security number, choosing instead to die on this weird hill, it's not correct to say you're banned - you're merely self-flagellating.
Slartie · 3h ago
I'm pretty sure that Copilot already saw the Firefox source code, and that they didn't have to wait for Firefox moving to GitHub for that.
Macha · 3h ago
I'm not sure codeberg has managed two 9s of uptime while I've been using it. Manageable when it's just a public mirror for occasional publishing of my small hobby projects, but I wouldn't recommend it for Firefox-sized projects.
Miaourt · 3h ago
Maybe if Mozilla gave one hundredth of their CEO's salary in donations to Codeberg they would be more than happy and able to scale to nine nines :p
prepend · 2h ago
Maybe. Maybe not. If I was the person responsible for the code, I wouldn’t want to gamble on them becoming good enough for me to use.
executesorder66 · 1h ago
Yeah, it's not like they care about improving the state of the open source ecosystem anyway.
freeopinion · 18m ago
I had a similar thought. I am disappointed that Mozilla didn't take some of the money they were spending on a self-hosted homegrown solution and throw it to something like Codeberg. I guess that a little funding from the likes of Mozilla could go a long way in helping Forgejo pioneer some super interesting federation.
Of course Mozilla is free to make their own choices. But this choice will be read as the latest alarm bell for many already questioning the spirit of Mozilla management.
jorvi · 2h ago
Why did you omit (self-hosted) gitlab..?
dspillett · 2h ago
[not OP, but making educated guesses from what has already been said]
Given the post above, issues regarding self-hosting were at least part of the reason for the switch so a new self-hosted arrangement is unlikely to have been considered at all.
I don't know what the state of play is right now, but non-self-hosted GitLab has had some notable performance issues (and, less often IIRC, availability issues) in the past. This would be a concern for a popular project with many contributors, especially one with a codebase as large as Firefox.
bayindirh · 4h ago
I guess it's the CI/CD infrastructure. Pipeline complexity and run time grow rapidly as the code supports more operating systems and configurations.
I used a GitLab + GitLab Runner (docker) pipeline for my Ph.D. project which did some verification after every push (since the code was scientific), and even that took 10 minutes to complete despite being pretty basic. Some Debian packages need more than three hours in their own CI/CD pipeline.
Something like Mozilla Firefox, which is tested against regressions, performance, etc. (see https://www.arewefastyet.com), needs serious infrastructure and compute time to build in n different configurations (stable / testing / nightly + all the operating systems it supports) and then test at that scale. It essentially needs a server farm to complete in reasonable time.
An infrastructure of that size needs at least two competent people to keep it connected to all relevant cogs and running at full performance, too.
So yes, it's a significant effort.
jgraham · 4h ago
This is all true, but as the sibling says, not really related to the change discussed here.
Firefox does indeed have a large CI system and ends up running thousands of jobs on each push to main (formerly mozilla-central), covering builds, linting, multiple testsuites, performance testing, etc. all across multiple platforms and configurations. In addition there are "try" pushes for work in progress patches, and various other kinds of non-CI tasks (e.g. fuzzing). That is all run on our taskcluster system and I don't believe there are any plans to change that.
arp242 · 4h ago
> I guess it's the CI/CD infrastructure
Your guess is wrong as Firefox doesn't use GitHub for any of that, and AFAIK there are no plans to either.
The blog post linked in the top comment goes into this in some detail, but in brief: git log, clone, diff, showing files, blame, etc. are CPU expensive. You can see this locally on a large repo if you try something like "git log path/to/dir".
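You can also partly mitigate it client-side; a quick sketch with standard git commands:

    # Path-limited history walks are the expensive case:
    time git log --oneline -- path/to/dir
    # Writing a commit-graph file speeds up many of these walks:
    git commit-graph write --reachable
    git config core.commitGraph true
    time git log --oneline -- path/to/dir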
Add to this all the standard requirements of running any server that needs to be 1) fast, and 2) highly available.
And why bother when there's a free service available for you?
bayindirh · 4h ago
It was a guess and I never claimed it was 100% correct, and I'm happy to stand corrected. No hard feelings there.
tempaccount420 · 3h ago
"I guess..." != "I'm guessing..."
bayindirh · 2h ago
That's new to me. Can you expand on that a little?
notpushkin · 4h ago
I think the CI/CD infra stays intact here though? (and even then, I imagine GitHub Actions bill would be enormous for a project like Firefox)
saghm · 2h ago
If the CI/CD is the most intensive part, it seems reasonable to move all of the other parts to a free provider to focus on the part that would be harder and more expensive to move. Even if they don't ever move any of the CI/CD over, I feel like I can understand the rationale for reducing the scope to just that rather than the source hosting. I've worked on plenty of projects with way less traffic than Firefox over the years that used GitHub for source hosting but alternate CI/CD; GitHub didn't even have built in CI for a while, so that was the only way to use it.
Given the frequency I see comments on this site about Mozilla trying to do far too much rather than just focusing their efforts on core stuff like Firefox, I'm honestly a bit surprised that there aren't more people agreeing with this decision. Even with the other issues I have with Mozilla lately (like the whole debacle over the privacy policy changes and the extremely bizarre follow-up about what the definition of "selling user data" is), I don't see it as hypocritical to use GitHub while maintaining a stance that open solutions are better than closed ones, because making an open browser in the current era is a large and complicated enough goal to justify setting a high bar for taking on additional fights. Insisting on maintaining their own version control servers feels like an effort they don't need to be taking on right now, and I'd much rather Mozilla pick their battles carefully like this more often than less. Fighting for more open source hosting is a large enough battle that maybe it would make more sense for a separate organization focused on that to be leading the front; providing an alternative to Chrome is a big enough struggle that it's not crazy for them to decide that GitHub's dominance has to be someone else's problem.
notpushkin · 6m ago
Yeah, I agree that everything that helps reduce maintenance overhead is good for Mozilla (although I believe there’s more low-hanging fruit that could be addressed before that).
I would love to see Mozilla moving to Codeberg.org (though I’d ask if they’re okay with it first) or something like that. Using GitHub is okay-ish? Personally, I frown upon it, but again I agree – it’s not the most important issue right now.
bayindirh · 4h ago
I think it can be done half-and-half: do some well-defined builds at GitHub and pull them in for testing. Another comment mentions that some users waited 10+ minutes for a lock when pushing their tests through CI, so maybe some sanity tests could be offloaded to GitHub Actions.
I'm not claiming that my comment was 100% accurate, but they plan to move some of the CI to GitHub, at least.
TheDong · 2h ago
> but they plan to move some of the CI to GitHub, at least
Really? I've seen no indication of that anywhere, and I'd be amazed if they did.
They're not using github PRs, and github actions really fights against other development workflows... not to mention they already have invested a lot in TaskCluster, and specialized it to their needs.
Where are you getting that from?
bayindirh · 1h ago
It was an, apparently very wrong, educated guess. Nothing more.
lupusreal · 2h ago
> This only affects the code; bugzilla is still being used for issue tracking
Grim.
The best reason to be using GitHub at all is to maximize the portion of your users who are comfortable submitting bug reports, as they already have an account and are familiar with how the platform works (due to network effects). Projects which host code on GitHub but choose not to take bug reports there are effectively gatekeeping bug submission, by asking their users to jump through the hoops of finding the site, signing up for it, and learning to use a new interface. I've done this before, with Bugzilla and Firefox, to submit a report for an accessibility bug on macOS, and it was a pain in the ass that I put off for a long time before becoming annoyed enough to go through the process. (End result: the bug was confirmed but never fixed.)
jgraham · 1h ago
Gecko and Firefox have been using Bugzilla for more than 25 years at this point. There's a lot of internal workflows, tooling and processes that are really dependent on the specific functionality in Bugzilla. I think it would be an extremely high risk project to try and replace Bugzilla with GitHub issues.
That said, there are also other teams and projects who do use GitHub for issue tracking. However the closer to Firefox/Gecko you are the harder this gets. For example it's hard to cross-reference GitHub issues with Bugzilla issues, or vice versa. I've seen people try to build two-way sync between GitHub and Bugzilla, but there are quite considerable technical challenges in trying to make that kind of cross-system replication work well.
However your point that GitHub makes issue submission easier for people who aren't deeply embedded in the project is a good one. I'm directly involved with webcompat.com, which aims to collect reports of broken sites from end users. It's using a GitHub issue tracker as the backend; allowing developers to directly report through GitHub, and a web-form frontend so that people without even a GitHub account can still submit reports (as you can imagine quite some effort is required here to ensure that it's not overwhelmed by spam). So finding ways to enable users to report issues is something we care about.
However, even in the webcompat.com case where collecting issues from people outside the project is the most important concern, we've taken to moving confirmed reports into bugzilla, so that they can be cross-referenced with the corresponding platform bugs, more easily used as inputs to prioritization, etc. That single source of truth for all bugs turns out to be very useful for process reasons as well as technical ones.
So — (again) without being any kind of decision maker here — I think it's very unlikely that Firefox will move entirely to GitHub issues in the foreseeable future; it's just too challenging given the history and requirements. Having some kind of one-way sync from GitHub to Bugzilla seems like a more tractable approach from an engineering point of view, but even there it's likely that there are non-trivial costs and tradeoffs involved.
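(Not speaking to how Mozilla would build it, but for the curious, a crude one-way GitHub-to-Bugzilla bridge is conceptually just two REST calls; the product/component values below are placeholders, and the Bugzilla field names are as I remember them from its REST docs, so double-check before relying on this:)

    # List recent open issues from a public GitHub repo (no auth needed):
    curl -s "https://api.github.com/repos/OWNER/REPO/issues?state=open&per_page=5" \
      | jq -r '.[] | [.number, .title] | @tsv'
    # File one of them into Bugzilla via its REST API (needs an API key):
    curl -s -X POST "https://bugzilla.mozilla.org/rest/bug?api_key=$BUGZILLA_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{"product":"ExampleProduct","component":"ExampleComponent","version":"unspecified","summary":"[gh#123] example issue title"}'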
dspillett · 2h ago
Moving the existing data over might not be a quick and easy task, so takes planning. Perhaps they intend to move over but didn't want to do everything in one go. Making many changes at the same time can be much more risky than a staged approach.
> are effectively gate keeping bug submission
Of course this could be a benefit… Have you seen the quality of bug reports coming from some people, even other devs? :-)
matkoniecz · 2h ago
I suspect that Firefox is not bottlenecked on number of bug reports they got.
AlienRobot · 52m ago
If you really want bug reports, just make it a single form without the need to create an account. GitHub, GitLab, etc. are a wall for 99% of web browser users.
floriangosse · 6h ago
I think it's actually an understandable strategic move from Mozilla. They might lose some income from Google and probably have to cut staff. But to keep the development of Firefox running they want to involve more people from the community, and GitHub is the tool that brings the most visibility on the market right now and is known by many developers. So the hurdle to getting involved is much lower.
I think you can dislike the general move to a service like GitHub instead of GitLab (or something else). But I think we all benefit from the fact that Firefox's development continues and that we have a competing engine on the market.
fhd2 · 4h ago
In my experience, most contributors who are deterred from contributing because they can't use GitHub aren't particularly valuable contributors. I'm sure there's exceptions, but I haven't seen any for non-trivial open source projects I've been involved in. I might even argue that it could be good to have a slightly higher bar to deter low quality one time contributors.
arp242 · 4h ago
I spent quite some time writing a patch for FreeBSD and Linux a few months ago, including getting to grips with their contribution process.
Both patches have been ignored thus far. That's okay, I understand limited resources etc. etc. Will they ever be merged? I don't know. Maybe not.
I'm okay with all of this, it's not a complaint. It's how open source works sometimes. But it also means all that time I spent figuring out the contribution process has been a waste. Time I could have spent on more/other patches.
So yeah, there's that.
It's certainly true that making the bar higher will reduce low-quality contributions, because it will reduce ALL contributions.
(aside: FreeBSD does accept patches over GitHub, but it also somewhat discourages that and the last time I did that it also took a long time for it to get reviewed, although not as long as now)
elric · 3h ago
In all likelihood, if the patch had been a pull request, the pull request would have been ignored as well. Much like the thousands of pull requests that are often ignored by various larger open source projects. Ain't nobody got time to triage drive-by pull requests from unknown contributors, especially on large projects.
There's no easy solution. Much like the recent curl security kerfuffle, the signal:noise ratio is important and hard to maintain.
amanda99 · 40m ago
I think the OP's point here was that if it's a PR and it's ignored: you spent a bunch of time writing a PR (which may or may not have been valuable to you, e.g. if you maintain a fork now). On the other hand, if it was an esoteric contribution process, you spent a lot of time figuring out how to get the patch in there, but that obviously has 0 value outside contributing within that particular open source project.
struanr · 4h ago
Although I have certainly created pull requests before that have been ignored so not sure GitHub solves this problem.
arp242 · 4h ago
GitHub PRs don't solve anything about that, but I wouldn't have to spend (waste) time figuring out the contribution process. At least I learned a few things writing the patches. I learned nothing of value dealing with git email or Phabricator. It's just work of the boring and tedious kind.
elric · 3h ago
Many projects have rules about what kinds of pull requests they accept. You would still have had to familiarise yourself with those rules, as well as the usual things like coding style, testing policies, etc.
andybak · 1h ago
Surely the claim being made is that the overall effort was increased in this case. That makes sense to me. I guess you can debate "but by how much?" but it seems fairly clear that there is more friction than there would have been via Github PRs
TheDong · 2h ago
Dealing with github is the boring and tedious thing: you have to run a huge amount of proprietary javascript, keep up with their weird UX changes, start X11 to open a browser to render their html, overclock your CPU for a large PR review conversation to scroll without locking up your computer for minutes, and constantly click "load more" since their webpage keeps hiding comments (while still lagging massively)...
Email is simple. It's just text, there's no weird javascript or html or lag. I don't have to open X11. I can just open mutt and read or write. I can type "git send-email". It's all open source, so I can read the code to understand it, and write scripting around it. It runs on any computer with ease. Even on a slow connection, it's quite speedy.
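For anyone who hasn't seen the email flow, it really is just a couple of commands (the list address here is made up):

    # Turn the last two commits into plain-text patch files:
    git format-patch -2 --cover-letter -o outgoing/
    # Mail them to the project's list:
    git send-email --to=dev@lists.example.org outgoing/*.patch
    # The maintainer applies them straight from a mailbox file:
    git am patches.mbox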
I totally agree with you about Phabricator though.
arp242 · 1h ago
"Boo hoo I need to start X11"? Seriously?
I have some unconventional workflows. And I try not to bother anyone else with it, especially in a volunteer driven open source context. It would be selfish to do otherwise.
To be honest based on what you've written here, keeping you out of my projects sounds like a good thing. What a bunch of piss and vinegar over how other people are choosing to work in a way that works for them.
elteto · 9m ago
Starting X takes forever on his PDP11. Only real way to run Unix.
berkes · 4h ago
You just showed the poster-child of gatekeeping that is harming Open Source.
Every contributor is valuable, it's in the name, the definition of "contribute".
Any bar to entry is bad, it certainly never is the solution to a different problem (not being able to manage all contributions). If anything, in the longer run, it will only make it worse.
Now, to be clear: while I do think GitHub is currently the "solution" that lowers barriers, lets more people contribute, and as such improves your Open Source project, the fact that this is so is a separate problem - there isn't any good alternative to GitHub (with a broad definition of "good"). Why is that, and what can we do to fix it, if anything?
fhd2 · 3h ago
In spirit, I agree.
In practice, if you get dozens of PRs from people who clearly did it to bolster their CV, because their professor asked them, or something like that, it just takes a toll. It's more effort than writing the same code yourself. Of course I love to mentor people, if I have the capacity. But a good chunk of the GitHub contributions I've worked on were pretty careless, not even tested, that kind of thing. I haven't done the maintainer job in a while; I'm pretty terrified by the idea of what effect the advent of vibe coding has had on PR quality.
I feel pretty smug the way I'm talking about "PR quality", but if the volume of PRs that take a lot of effort to review and merge is high enough, it can be pretty daunting. From a maintainer perspective, the best thing to have are thoughtful people that genuinely use and like the software and want to make it better with a few contributions. That is unfortunately, in my experience, not the most common case, especially on GitHub.
arp242 · 2h ago
In my experience low-quality PRs aren't that common, but I do agree dealing with them is annoying. You can't just tell people to go away because they did spend their spare time on it. On the other hand it's also garbage. Sometimes it's garbage by people who really ought to know better. IMHO low-quality issues are the bigger problem by the way, a problem that existed well before GitHub.
But I just don't see how GitHub or a PR-style workflow relates. Like I said in my own reply: I think it's just because you'll receive less contributions overall. That's a completely fair and reasonable trade-off to make, as long as you realise that is the trade-off you're making.
int_19h · 4h ago
This is just blatantly wrong on so many levels.
Proposed contributions can in fact have negative value, if the contributor implements some feature or bug fix in a way that makes it more difficult to maintain in the long term or introduces bugs in other code.
And even if such a contribution is ultimately rejected, someone knowledgeable has to spend time and effort reviewing the code first - time and effort that could have been spent on another, more useful PR.
dgb23 · 2h ago
It's not wrong, it's just based on the assumption that the projects wants contributors.
Quite obviously, any incidental friction makes this ever so slightly harder or less likely. Good contributions don't necessarily or only come from people who are already determined from the get go. Many might just want to dabble at first, or they are just casually browsing and see something that catches their attention.
Every project needs some form of gatekeeping at some level. But it's unclear to me whether the solution is to avoid platforms with high visibility and tools that are very common and familiar. You probably need a more sophisticated and granular filter than that.
lpln3452 · 3h ago
This isn't a platform issue — it's a problem with the PR system, and arguably with open source itself. If you're unwilling to spend time on anything beyond writing code, maybe keep the project closed-source.
majewsky · 3h ago
Or, more obviously, make it open-source, and make a big fat note in the README of "I will not accept PRs, this repo is just for your consumption, fork it if you want to change it".
int_19h · 1h ago
It's not a binary. Many projects do want PRs, but it doesn't mean they have to accept any random PR, or fawn over every contributor who creates an obviously low-effort one. It's perfectly fine to "gatekeep" on quality matters, and that does mean acknowledging the fact that not all contributors are equally valuable.
nicman23 · 3h ago
lol go closed then
matkoniecz · 2h ago
> Every contributor is valuable, it's in the name, the definition of "contribute".
No. I have definitely seen people who created a multitude of misleading bug reports and a flood of stupid feature requests. I have personally done a bit of both.
There are people who do both repeatedly, file issue reports without filling in the requested fields, or open an issue again when their previous report was closed.
I once got a bug report where someone was ranting that the app was breaking data. It turned out (after wasting my time investigating it) that the user had broken the data on their own with different software, through its misuse.
There were PRs adding backdoors. This is not a valuable contribution.
There were PRs done to foment useless harmful political mess.
Some people pretend to be multiple people and argue with themselves in pull requests or issues (using multiple accounts or in more bizarre cases using one). Or try to be listed multiple times as contributor.
Some people try to sneak in some intentionally harmful content one way or another.
Diversity, here too, is of crucial importance. It's why some Open Source software has sublime documentation and impeccable translations, while other software is technically perfect but undecipherable. It's why some Open Source software has cute logos or appeals to professionals, while other software remains a hobby project that no one ever takes seriously despite its technical brilliance.
myfonj · 3h ago
Also don't forget that not all contributions are done through PRs or are actual code changes. There are folks who do tests, make MREs, organise issue reports, participate in forums… they are all contributing their time and effort.
Philpax · 4h ago
I can say that I've chosen not to bother when submitting a fix requires me to stray away from GitHub, and doubly so when it doesn't use a PR/MR workflow. There are only so many hours in the day, and I don't have the patience to deal with unconventional workflows when there are other things I could be doing with my time.
For projects that I'd be interested in being a long-term contributor to, this is obviously different, but you don't become a long-term contributor without first dealing with the short-term, and if you make that experience a pain, I'm unlikely to stick around.
A big part of this is the friction in signing up; I hope federated forges become more of a thing, and I can carry my identity around and start using alternate forges without having to store yet another password in my password manager.
Handler9246 · 1h ago
Sad we're at a stage where people don't contribute to free software projects because the service it's hosted on isn't the proprietary, corporate giant.
"Friction in signing up" being a big part for you is also weird, considering basically all free software GitHub alternatives (Gitea, GitLab, Forgejo) support SSO via GitHub.
lpln3452 · 4h ago
Contribution isn’t driven by a desire for rewards, but by goodwill. Friction only gets in the way. If the friction is worth it, fine - but what exactly is being lost by moving the repository to GitHub?
baobun · 41m ago
> but what exactly is being lost by moving the repository to GitHub?
Contributors who can't use GitHub because either 1) they are new and can't activate an account, 2) their old grandfathered account is no longer usable, or 3) their old account is doxxed and they can no longer safely contribute under the old identity.
Once you trigger the phone-number verification requirement, your account is globally shadowbanned and support is blocked pending SMS code verification. Aside from the privacy issue, this completely blocks people in countries to which GitHub won't even try to SMS/call.
Remember that registering a second account would be violating GitHub ToS.
stevekemp · 3h ago
The number of emails I get "Your website is vulnerable to clickjacking attacks, PS. how much bounty have I earned?" suggests that there are many for whom a desire for literal rewards is their sole driver.
Not to mention the AI-generated security "issues" that are reported against curl, for example, suggests there can indeed be negative value for reports, and contributions.
lpln3452 · 2h ago
You're right. And that's not an issue with any particular platform, but with open source projects that accept issues and PR in general.
I don't think this is the place for a debate about the overall utility of open source.
Aachen · 3h ago
> what exactly is being lost by moving the repository to GitHub?
Alternatives to github
We lament Google's browser engine monopoly, but putting the vast majority of open source projects on github is just the expected course to take. I guess we'll repeat history once microsoft decides to set in the enshittification, maybe one day mobile OSes replace Windows and they're strapped for cash, who knows, but it's a centralised closed system owned by a corporation that absolutely adores FOSS
I don't mind any particular project (such as this one) being in Github and I can understand that Mozilla chooses the easy path, they've got bigger problems after all, but it's not like there are no concerns with everyone and everything moving to github
lpln3452 · 2h ago
Did you ever use the alternatives before GitHub took off?
GitLab? It was awful. Slow, and paying for that kind of experience felt like a bad joke.
It's much better now but it was borderline unusable back in the day.
Or SourceForge, before Git was mainstream? Also terrible.
GitHub succeeded because it quickly established itself as a decent way to host Git - not because it was exceptional, but because the competition had abysmal UX.
Unlike other lock-in-prone services, moving a Git project is trivial. If GitHub loses its advantages due to enshittification, you just move. Case in point: Mozilla hopping on and off GitHub, as this article shows.
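The "just move" really is about two commands (the target URL below is only an example):

    # Grab everything: all branches, tags, refs.
    git clone --mirror https://github.com/example/project.git
    cd project.git
    # Point it at the new home and push it all.
    git push --mirror https://codeberg.org/example/project.git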
Philpax · 2h ago
I believe GitLab post-dates GitHub, but I otherwise agree with the sentiment.
lpln3452 · 2h ago
You're right. But as far as I remember, neither GitHub nor GitLab were really mainstream at the time.
I think the real competition began around the same time.
matkoniecz · 1h ago
> Unlike other lock-in-prone services, moving a Git project is trivial.
not really
just moving issue tracker and discussions is highly annoying
trying to get your users to move is likely hard and you will lose many
still, may be easy in comparison
rendaw · 4h ago
How can you judge the quality of people who don't contribute? They don't contribute, so what's there to judge?
fhd2 · 3h ago
Not possible, but I can compare projects on GitHub with projects not on GitHub (which generally involve more ceremony).
A lot more contributions on GH, but the majority of them ignored guidelines and/or had low code quality and attention to detail. Just my anecdotal experience of course.
pornel · 3h ago
The barriers may keep out low effort submissions*, but they also keep out contributors whose time is too valuable to waste on installing and configuring a bespoke setup based on some possibly outdated wiki.
* contributors need to start somewhere, so even broken PRs can lead to having a valuable contributor if you're able to guide them.
arichard123 · 3h ago
Hang on. If they are deterred, then by definition they are not valuable contributors. They have not contributed. If they have contributed, they were not deterred.
nicman23 · 4h ago
"gatekeeping good"
no.
7bit · 3h ago
They are everywhere. It's like a plague.
Aachen · 3h ago
Am I understanding you correctly that using github instead of a more obscure system where you might need to register a fresh account and find where the buttons are etc. raises the bar for contributions and so it's good to use github?
Somehow I think you're holding the difficulty scale backwards!
7bit · 3h ago
So, you're saying that because they don't know how to use A, they likely also don't know enough to contribute to B?
Being a good coder has absolutely no correlation to being good at using Mercurial.
madeofpalk · 2h ago
I absolutely gave up on trying to contribute a patch to Firefox because the combination of both gh and phabricator was too much for me.
I struggled to understand how the two interacted with each other, and I didn't know how to 'update my branch/pr' and I eventually just gave up.
noobermin · 43m ago
I get that moving to GitHub is a change, but I'd imagine the real story is the move from mercurial to git, although I'd guess the social considerations might have influenced the technical decisions.
kgwxd · 1h ago
Anyone that couldn't overcome those "hurdles" shouldn't even be filing bug reports, let alone modifying code.
Kuinox · 5h ago
It's good that they fixed one of the major pieces of tech debt for contributing to firefox.
When I tried a few years ago, mercurial took multiple hours to clone, and I already had to use the unofficial git support in order to have things working before the end of the day.
Their docs were also a mess back then and made me recompile everything even when it wasn't needed.
mritzmann · 5h ago
What is the source of “Firefox Moves to GitHub”? It could be a mirror, just like Linux also has an mirror on GitHub.
It’s interesting how pull requests remain the only tab (apart from code) that cannot be disabled by the repo owners.
I get it from GitHub’s perspective: it’s a nudge to get people to accept the core premise of ”social coding” and to encourage user pressure on mirrored projects to accept GitHub as a contribution entrypoint. I’m impressed by their success and would attribute some of it to forced socialization practices such as not allowing PRs to be disabled. I’ve grown to dislike it and become disillusioned with GitHub over a long time, but I’m in awe of how well it has worked for them.
upcoming-sesame · 58m ago
Why did they choose the mozilla-firefox org as opposed to the already existing mozilla org ?
Now, both the desktop and the mobile version will be on Github, and the "issues" will stay on Bugzilla.
This will take advantage of both GitHub's good search and source browsing and Git's familiar system.
As a former Firefox and Thunderbird contributor, I have to say that I used local search instead of trying to find something on the mozilla-central website.
Of course, when you're actively developing software, you search inside your IDE, but allowing to find things easily on the website makes it more welcoming for potential new contributors.
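For reference, the kind of local search I mean is just plain git and works offline (the identifiers and paths here are only examples):

    # Find a symbol across the tree, restricted to C++ sources:
    git grep -n "nsGlobalWindowInner" -- "*.cpp" "*.h"
    # Or search only under one directory:
    git grep -n "prefers-color-scheme" -- layout/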
adrian17 · 5h ago
> I have to say that I used local search instead of trying to find something on the mozilla-central website.
On the contrary, I find searchfox to be the best code navigation tool I used. It has nice cross-language navigation features (like jumping from .webidl interface definition to c++ implementation), it has always-on blame (with more features too) and despite that it's really fast and feels extremely lightweight compared to GitHub interface. I really wish I had this with more projects, and I'll be sad if it ever dies.
antalis · 3h ago
Searchfox didn't exist back then, "there [was] only xul", I mean MXR of course.
Then MXR got replaced by DXR, itself replaced in 2020 by Searchfox (introduced in 2016).
> This will take advantage of both GitHub's good search and source browsing and Git's familiar system.
The source browsing has deteriorated severely relatively recently IME, to the point where it can't be called "good" anymore.
It now loads asynchronously (requiring js) and lazily, randomly breaks on shaky connections, and in-page search is broken.
The recent issues/PRs revamp is also a pretty major step back. Try searching in PRs with all uBlock Origin lists enabled.
noobermin · 48m ago
I guess the dream is dead. Even in open source, we have consolidation with no real hard monetary markets involved.
EDIT: skimming these comments, I like how none of the top comments are talking about the bigger story here, which is the move away from mercurial to git; instead everyone is focusing on github itself. This has essentially sealed hg away into obscurity forever. Do people not realise git is a program that runs on your computer and github is just a service that uses git? Maybe this is an old-man gripe at this point, but I'm surprised at the lack of technical discussion around this.
upcoming-sesame · 57m ago
Why did they use mozilla-firefox org name instead of the already existing https://github.com/mozilla one ?
mlenz · 7h ago
Great to see, but I wonder what led to the decision to create a new org instead of using github.com/mozilla
moontear · 7h ago
Without knowing their reason, there are a few things tied to the org where multiple orgs make sense. If you do SSO for example you tie the org to a SSO provider, you can’t tie „just a few users“ to the SSO provider (afaik). The Firefox repo may have totally different authentication / users than the main Mozilla repo.
pornel · 3h ago
The GitHub SSO is annoying. I can't even view public issues if I'm logged in to GitHub, but haven't recently re-authenticated with SSO.
GitHub also has a lot of features and authentication scopes tied to the whole org, which is pretty risky for an org as large as Mozilla.
GitHub are terrible at this, because you can't have levels other than Org and Repository. And many things (SSO, visibility rules, common configs) are on the org level.
Unfortunately often the cleaner option is to create a separate org, which is a pain to use (e.g. you log in to each separately, even if they share the same SSO, PATs have to be authorised on each one separately, etc).
In Gitlab, you would have had one instance or org for Mozilla, and a namespace for Firefox, another one for other stuff, etc.
captn3m0 · 6h ago
There is an “Enterprise” level above the org, but that obviously needs an Enterprise account. It lets you manage some policies across multiple orgs, including membership.
sofixa · 5h ago
But it still requires multiple orgs, and the UX is still poor.
It's like AWS accounts vs GCP projects. Yeah, there are ways around the organisational limitations, but the UX is still leaky.
nolok · 2h ago
I hope the bugzilla stays there, even if only read-only. There is a lot of historical data in there, especially for the web, which was built as an "ad-hoc" platform: many times when you wonder why X does something, the answer can only be found in bugzilla (which will explain that some random website that used to be major but doesn't even exist anymore did something for some browser that used to be major but doesn't even exist anymore).
fergie · 21m ago
Bugzilla was really good, and in retrospect decades ahead of its time. There is probably no self-hosted bug tracker that comes close (or is there?)
thrdbndndn · 6h ago
Correct me if I'm wrong, but IIRC the previous "master" branch was `mozilla-central`.
Now it has "main" and "autoland", what are they? Which one is the equivalent of mozilla-central before?
chme · 6h ago
Not a firefox dev, but pretty sure it's 'main'.
The "new" git default branch name is 'main' and 'autoland' existed before next to 'mozilla-central' and is the one where commits usually appear first.
jamienicol · 5h ago
I am a Firefox developer, and you're spot on. Previously there were separate hg repos for central, beta, release. I think ESRs too. And autoland. Now they're all branches in the same repo, and central is renamed main.
Commits land in autoland and get backed out if they cause test failures. That's merged to main ~twice per day when CI is happy
thrdbndndn · 5h ago
Thanks for the clarification!
I've mostly encountered these branches/repos when checking commits linked to Bugzilla tickets, and I don't recall seeing "autoland" show up too much in those cases.
mintplant · 2h ago
Why is the mozilla-firefox org full of forks of the main repo named after trees?
To me it seems absurd that an organization like Mozilla uses third-party hosting like GitHub instead of something self-hosted, or at least something running under their own name. I understand that one-person projects use GitHub, but forcing contributors to make an account with a third-party service seems contributor-hostile.
CorrectHorseBat · 7h ago
So they moved from hg to git? Or is this just an official mirror
shit_game · 7h ago
firefox development has been based on git (rather than mercurial) since early november of 2023
Interesting that their issues are blamed on "dual SCM", not on Mercurial itself. I guess just the weight of contributors expecting Git as the default is sinking the big Mercurial projects these days.
Kuinox · 6h ago
I tried to contribute a few years ago.
The mercurial clone was taking multiple hours.
They already had an unofficial git mirror, which took 15 minutes to clone.
dgoldstein0 · 6h ago
Isn't mercurial abandonware? Or maybe I'm just remembering that gitlab dropped support. If it's not dead yet, it seems to be getting there.
arp242 · 5h ago
They had a release just a few days ago. It's definitely not abandonware.
swiftcoder · 4h ago
It’s still used by Meta, at any rate (albeit a very scaled version thereof). Meta picked it for their monorepo when Linus wasn’t willing to play ball on extending Git for their use case.
arp242 · 4h ago
Is it still used there? I know they did in the past, but reading up a bit on the background on all of this I found https://github.com/facebook/sapling, and it seems that's what they're using now?
IshKebab · 7h ago
They supported Git and Hg until now. This means they are dropping Hg support.
bingemaker · 3h ago
They already have an org github.com/mozilla. Why didn't they move ff source there?
tgsovlerkhgsel · 6h ago
On one hand, centralization at a commercial provider isn't great.
On the other hand, the plethora of different self-hosted platforms with limited feature sets is a huge pain. Just finding the repo is often a frustrating exercise, and then trying to view, or worse, search the code without checking it out is often even more frustrating or straight out impossible.
elric · 3h ago
> Just finding the repo is often a frustrating exercise
Surely most open source projects have a link to their source code? Whether it's github, gitlab, sourcehut, or anything else?
smallnix · 6h ago
I wish I could search on GitHub without logging in
hedayet · 5h ago
I wish that too, and I’ve always wanted to offer features like this in everything I build.
But it’s a lot of work to prevent abuse, especially for resource-intensive features, when supporting signed-out use cases.
Not for PRs or issues though which are arguably the biggest reasons to use GitHub. Still this is definitely an improvement.
baq · 6h ago
Which is fascinating since both suck. Gerrit (replace with whatever you please) is a much better change submission experience, and basically anything else is a better bug tracker.
The killer feature is colocation of everything in a single forge; combined with a generous free tier, it’s the Windows XP of the ecosystem: everybody has it, everybody knows it, almost nobody knows anything else.
elric · 3h ago
GitHub's issue tracker is easily the worst issue tracker I've ever used. It's at the same time incredibly limited in features, but somehow hard to navigate.
As for PRs: I'm sure Mozilla welcome contributions, but accepting GitHub PRs is going to be a recipe for thousands of low-value drive-by commits, which will require a lot of triage.
IshKebab · 3h ago
Count yourself lucky you haven't had to use Jira! Or bugzilla for that matter.
I agree it is rather basic but I don't see how it's hard to navigate.
> accepting GitHub PRs is going to be a recipe for thousands of low-value drive-by commits, which will require a lot of triage.
I don't think that really happens based on what I've seen of other huge projects on GitHub.
elric · 2h ago
> Count yourself lucky you haven't had to use Jira! Or bugzilla for that matter.
Jira and bugzilla are vastly superior to GH Issues.
Jira doesn't even deserve 10% of the hate it gets. Most of what makes Jira awful is the people using it. Bugzilla is getting a bit long in the tooth, but at least it's still free and open source.
IshKebab · 1h ago
> Jira and bugzilla are vastly superior to GH Issues.
I think you're in the tiny minority with that opinion.
> Most of what makes Jira awful is the people using it.
Not even close. Yes, people aren't good at administering it, but there are soooo many reasons that it's shit apart from that. Not least the hilarious slowness. Jira Cloud is so slow that not even Atlassian use it.
Also I don't think you can just say "you're holding it wrong". Part of the reason people screw up Jira configs so much is that it makes it so easy to screw them up. You can't separate the two.
> but at least it's still free and open source.
Just being open source doesn't make something good.
elric · 53m ago
> I think you're in the tiny minority with that opinion.
I'm not. The whole "I hate Jira" thing is a meme among a very vocal minority of tech enthusiasts. They don't have tens of millions of users because Jira is awful. The reason why so many people cry about it (apart from the meme factor) is that people conflate Jira with their team's failed approach to scrum.
Sure, it has rough edges, and sure, Atlassian as a company sucks. I have a bug report open on their Jira for some 20 years and I don't think it will ever get fixed. And yes, Jira Cloud is very slow, it's ridiculous. And in spite of that, GH Issues is still objectively worse. It's so far behind in terms of features that it isn't even a fair comparison.
matkv · 6h ago
So is this now just a mirror? I'm not sure what the point of moving to GitHub was then.
IshKebab · 3h ago
It's the primary repo rather than a mirror, but yeah, I agree they don't get most of the benefits. Moving issues and PRs is probably an enormous effort, so I get why they aren't doing it all at once.
bandrami · 7h ago
Pretty cool that Linus Torvalds invented a completely distributed version control system and 20 years later we all use it to store our code in a single place.
SCdF · 7h ago
I get what you're saying, but tbf hosting on github doesn't (yet!) box you out of just moving back to that system. It's still just git. It's still distributed, in the sense that if github goes down you could still generate patches and email them around, and then push back to github when it's back.
Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
nextaccountic · 5h ago
Unfortunately the project is not just code. It also has issues, PRs and other stuff. Github has two kinds of lock-in: a) your stuff is there, and if you move elsewhere you will probably wipe your issues etc. (a huge loss of institutional knowledge), and b) there is a network effect, because everyone has a github account and people are used to just hopping on a repository and filing an issue (rather than being greeted by a log-in page), cross-referencing issues between repositories (hard to make work if repos aren't on the same site, unless both sites use some interop thing like activitypub, which github will never use), etc.
> Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
There is https://github.com/git-bug/git-bug - I would love it if people started to use it, even in a read-only way: use github issues normally, but also have a bot that saves all comments to git-bug, so that I can read issues without an internet connection. Then, at a later date, make it so that people who file issues on git-bug also get the issue posted on github, making a two-way bridge.
Then, optionally, at a later stage when almost everyone has migrated to git-bug, make the github issues a read-only mirror of the git-bug issues. Probably not worth it: you lose drive-by comments from newcomers (who already have a github account but have probably never heard of git-bug), raising the friction to report bugs.
SCdF · 5h ago
> Unfortunately the project is not just code.
The literal project we are discussing is just code. It's literally just code. It doesn't have issues, PRs are disabled as much as they can be (by a GitHub action that automatically closes all PRs with a note that code should be submitted elsewhere), and all "other stuff" is disabled.
What you are referring to is more of a mirror-like approach usage of GitHub.
Some big repos or organizations might be able to pull this off, but good luck having a small project and then directing users to go through all of those hoops to submit issues somewhere else, open PRs somewhere else, etc.
mkingston · 5h ago
I was reading the git-bug documentation and found "bridges" to third-party platforms:
This is one area where Gerrit Code Review is (was? I don't know if it changed) superior. It stores everything it knows about in git repositories (preferences in a separate meta git repository, plus comments and patches). With the right refspec, you can pull it all down and have a full backup.
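Something along these lines, going from memory of Gerrit's usual ref layout (so treat the exact refspecs as assumptions):

    # Fetch every change Gerrit knows about, including NoteDb review metadata:
    git fetch origin '+refs/changes/*:refs/remotes/origin/changes/*'
    # Or check out one specific patchset (patchset 2 of change 1234):
    git fetch origin refs/changes/34/1234/2 && git checkout FETCH_HEAD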
sshine · 6h ago
> if github goes down you could still generate patches and email them around, and then push back to github when it's back.
You could, but generally people can’t. They learn a set of narrow workflows and never explore beyond them. GitHub use translates into GitLab use, but not into general git use without a central repository.
> Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
Radicle offers one. CLI-based, too.
flohofwoe · 6h ago
> They learn a set of narrow workflows and never explore beyond.
And tbh, that's how it should be for a version control system. Before git, with its byzantine workflows and a thousand ways to do the same thing, version control (e.g. svn) was something that just hummed along invisibly in the background, something that you never had to 'learn' or even think about, much like the filesystem.
I don't need to know how a filesystem works internally to be able to use it.
And having a centralized store and history helps a lot to keep a version control system conceptually simple.
baq · 6h ago
svn was not 'humming' unless you confined yourself to a very narrow set of functionality, e.g. merging was best left to experts.
flohofwoe · 6h ago
In a centralized version control system with a single history, branching and merging is also much less important.
In git, working on your own branch is essential to not step on other people's feet and to get a clean history on a single main/dev branch (and tbf, git makes this easy for devs and text files). With a centralized version control system, both problems don't even exist in the first place.
When we did game development with a team of about 100 peeps (about 80 of those non-devs, and about 99% of the data under version control being in binary files) we had a very simple rule:
(1) do an update in the morning when you come to work, and (2) in the evening before you leave do a commit.
Everybody was working on the main branch all the time. The only times this broke down were when the SVN server in the corner ran out of space, and we either had to delete chunks of history (also very simple with svn) or get more memory and a bigger hard drive for the server.
writebetterc · 6h ago
You don't need to learn how git works internally to be able to use it. You do need to know a lot about filesystems in order to use them: folders, files, symbolic links, copy, cut, paste, how folders can exist on different devices, etc. There's just a tonne of assumed knowledge, and it's very obvious when you meet someone who doesn't have it (regular people often don't have all of it).
Subversion also isn't some thing humming along invisibly in the background, it has its own quirks that you need to learn or you'll get stung.
vishnugupta · 5h ago
svn was a nightmare when it came to handling conflicts. So at least for me, humming in the background wasn’t the term used for it at work.
flohofwoe · 4h ago
This was only true before svn 1.5 (before it had 'merge tracking'). Also, branching and merging were by far not as essential in svn as they are in a decentralized version control system like git. In a centralized version control system it works perfectly well to do all development in the main branch, and only branch off dead-end 'release branches' which are never merged back.
Tbh, I really wonder where the bad reputation of svn is coming from. Git does some things better, especially for 'programmer-centric teams'. But it also does many things worse, especially in projects where the majority of data is large binary files (like in game development) - and it's not like git is any good either when it comes to merging binary data.
guappa · 5h ago
Have you ever actually used svn?
flohofwoe · 4h ago
Yes for about 18 years(?) in the context of game development (I don't exactly remember when we had switched from cvs to svn, but it must have been around 2003..2005) in teams up to about 100 people, working copy sizes up to about 150 GB (with most of the data being binary game asset files), and everybody working on trunk (we only used branches for releases which were branched off trunk but never merged back, only cherry-picking bugfixes from the main into release branches as needed).
We used TortoiseSVN as UI which worked well both for devs and non-devs.
With this sort of setup, git would break down completely if it weren't for awkward hacks like git-lfs (which comes with its own share of problems).
People could learn, if there was suddenly a need. Just like they learned the narrow workflows they use now.
laserbeam · 5h ago
> You could, but generally people can’t. They learn a set of narrow workflows and never explore beyond.
The point is you CAN. Joe can in theory do it, and Steve can make an alternative piece of software to do it for Joe. In most other centralized places (like social media), you CANNOT. Joe cannot take his data off of Facebook and interact with it outside of the platform or move it to another platform.
arp242 · 5h ago
"I only accept patches and bug reports over email" is just as much of a narrow set of workflows as "I only accept patches and bug reports through PRs".
account-5 · 6h ago
This is why I like fossil, it comes with most of the stuff I use built in, and you can deploy it as a website too. Use it for all of my personal projects and used it extensively for coursework at university.
int_19h · 3h ago
The annoying thing about Fossil is that it doesn't let you squash commits, not even in your private branches - they have some kind of philosophical point about that.
If you happen to agree with it, then yeah, it's great. If you like to commit quick and dirty and then tidy it up by squashing into logically complete and self-consistent commits, too bad.
account-5 · 2h ago
I can certainly see the appeal of having neat commits but I tend not to worry about them. On a couple of occasions, with my university writing, having an immutable history helped me figure out, for example, how something had ended up in a final draft without citation. I'd deleted the citation, which was a quick URL paste in a comment block in an earlier draft, and I'd never saved it to zotero. If I'd been able to tidy up my commits I'd likely have lost it completely.
int_19h · 1h ago
The appeal depends on how messy your commits are to begin with. When you know that commit history can be rewritten later, it suddenly becomes okay to commit incomplete code that doesn't properly run or even build, effectively using git as an undo system with branching. But the resulting history is completely unsuitable for any future attempt to use `git blame` and such.
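A rough sketch of that undo-system style of use (the commit messages are made up):

    git commit -m "wip: half-working parser"
    git commit -m "wip: now it builds"
    git rebase -i HEAD~2    # mark the second commit as "squash"/"fixup" to fold it into the first before sharing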
This should be one of the very first links in the readme.
dijit · 6h ago
> Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
Embrace, Extend..
(largely this is unfair, as plain git leaves much to be desired- but you can’t deny that the things surrounding git on github are very sticky).
blueflow · 5h ago
Lets pray that Microsoft won't use Github to find new ways to extract money.
wordofx · 6h ago
Build a bridge and…
frizlab · 6h ago
Like fossil?
kaichanvong · 5h ago
While it's possible to see how fossil comes up in the GitHub conversation, it's not really in the same category: there are some clever things happening within fossil-scm, but it doesn't solve the same design-led problem GitHub solves, given what people are saying about downtime. Git and GitHub are related, but how people use GitHub is different from how they use git.
However, if you were trying to draw a comparison of "fossil with git-github", then again: no.
It's a good call that the conversation (comments, almost interchangeable at times, haha!) is about everyone using git for Firefox - quite a wild topic!
littlestymaar · 6h ago
> Everything surrounding code: issues, CICD, etc, is obviously another story
That's what Github is though, it's not about the code itself it's about all your project management being on Github, and once you move it, moving out isn't realistic.
enos_feedler · 6h ago
And how are we supposed to solve this problem? By creating distributed versions of every possible component of every piece of software? Seems unrealistic. I think we should be grateful that the core underlying protocol for the most important data has the distributed properties we want. It's a lot more than we can say vs. lots of other platforms out there.
hnlmorg · 6h ago
As a GitHub user myself, I don't disagree with your point. However I'd like to say that this isn't quite as difficult a problem to solve as it might first appear:
The issue tracking can be a branch and then you just need a compatible UI. In fact some git front ends do exactly this.
CI/CD does already exist in git via githooks. And you're already better off using make/just/yarn/whatever for your scripts and relying as little on YAML as possible. It's just a pity that githooks require users to set them up each time, so many people simply don't bother.
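One common way to reduce that per-user setup is to version the hooks inside the repo and point git at them; a rough sketch (the "make test" target is just an example):

    mkdir -p .githooks
    printf '#!/bin/sh\nmake test\n' > .githooks/pre-push   # hook body: run the tests before every push
    chmod +x .githooks/pre-push
    git config core.hooksPath .githooks                    # each contributor runs this once per clone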
groestl · 6h ago
> And how are we suppose to solve this problem? By creating distributed versions of every possible component of every piece of software? Seems unrealistic.
That's how we started out.
baq · 6h ago
Maybe that's the reason everything tends to get centralized.
groestl · 2h ago
It's an emergent phenomenon, it requires less energy expenditure overall. It's also the way of the Dodo.
int_19h · 3h ago
By storing issues etc in the repo itself. A git repo is just a generic object graph, after all, and objects don't necessarily describe files.
There are several such solutions already. The problem is that neither of them is popular enough to become a de facto standard. And, of course, centralized git providers like GitHub have a vested interest in keeping in this way, so they are unlikely to support any such solution even if it does become popular enough.
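As a rough illustration of the idea (the ref name and issue text are made up, and real tools like git-bug structure this much more carefully):

    blob=$(printf 'title: Crash on startup\nstatus: open\n' | git hash-object -w --stdin)
    git update-ref refs/issues/1 "$blob"     # a ref can point at any object, not just commits on a branch
    git cat-file -p refs/issues/1            # read it back
    # fetching/pushing 'refs/issues/*:refs/issues/*' would replicate them between clones,
    # though whether a given host accepts such refs varies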
tigroferoce · 6h ago
GitHub is about the community. There are other alternatives, more in line with what Mozilla claims to be their view (I'm thinking of GitLab, for instance), but nothing gives you visibility like GitHub.
Sad to see that Mozilla is becoming less and less what they promised to be now that Google funding is depleting.
SCdF · 6h ago
Right, but distributed git As Torvalds Intended™ doesn't solve those problems, so it's not related.
For the actual event we are commenting on, they have disabled all features other than code hosting and PRs.
Interestingly mozilla has effectively done this here, by using a GitHub action that automatically closes any PR with a message explaining that PRs are not to be used.
It's very silly they have to do this, but at least they can I suppose.
arp242 · 5h ago
GitHub has a fairly extensive API without too many limits AFAIK. You can definitely migrate all your data to $something_else if you want to.
xboxnolifes · 6h ago
Sure, but then we are no longer talking about git.
LtWorf · 5h ago
I managed to move to codeberg all my projects. There's everything except the secret deals with pypi to directly publish from github. Which is massively insecure anyway.
phire · 5h ago
People have forgotten just how bad centralised version control was in 2005.
If you weren't connected to the internet, you couldn't do a thing. You couldn't checkout. You couldn't commit. You couldn't create branches. The only thing on your computer was whatever you checked out last time you were connected to the server.
People talk about SVN, but it wasn't that common in 2005. None of the project hosting platforms (like SourceForge) supported SVN, they were all still offering CVS. If you wanted to use SVN, you had to set it up on your own server. (From memory, google code was the first to offer SVN project hosting in mid-2006). Not that SVN was much better than CVS. It was more polished, but shared all the same workflow flaws.
Before Git (and friends), nothing like pull-requests existed. If you wanted to collaborate with someone else, you either gave them an account on your CVS/SVN server (and then they could create a branch and commit their code), or they sent you patch files over email.
The informal email pull requests of git were an improvement... though you still needed to put your git repo somewhere public. Github and its web-based pull requests were absolutely genius. Click a button, fork the project, branch, hack, commit, push, and then create a formal "pull request". It was nothing like centralised project management systems before it. A complete breath of fresh air.
chgs · 4h ago
Pull requests aren’t part of git. They are a feature of one implementation.
phire · 3h ago
This 2007 talk [1] of Linus Torvalds promoting git to Google was how many people were introduced to the concept of git in those days before GitHub; I remember watching it myself. Emails requesting other maintainers to pull your branch were very much the suggested workflow around git.
And it was actually part of git. Even back in 2005, git included a script, git request-pull, that generated these pull request emails. I'm pretty sure people called these emails "pull requests" before GitHub came along.
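For anyone who hasn't seen it, the script still exists and the workflow was just (the URL and branch name are placeholders):

    git push https://git.example.org/me/project.git my-feature                      # publish the branch somewhere readable
    git request-pull origin/main https://git.example.org/me/project.git my-feature  # prints the summary + diffstat to paste into an email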
I am sure Sourceforge supported subversion by 2007 or 2008, I had a project there then. When was it added?
phire · 1h ago
It's hard to find dates for that type of thing (especially with sourceforge; their website seems to actively mess with the wayback machine). But I dug deeper; apparently Sourceforge got support for SVN in 2006, which is a few months before google code.
2006 appears to be the year that SVN finally became somewhat mainstream, which is interesting because git was released in 2005. Github launched in 2008 and by 2009, everyone seemed to be abandoning SVN.
It feels like SVN was only really "mainstream" for about 3 years, maybe 5 years at most; there was some early-adopter lead-up and then a long tail of repos refusing to switch to git.
IshKebab · 7h ago
Plenty of people use Codeberg and Gitlab. And it's still distributed - I don't need to lock files and ask coworkers if I can work on them.
Maybe if Git had native support for PRs and issues this wouldn't have happened. (And yes I'm aware of git send-email etc.)
qwertox · 6h ago
In Codeberg, how does one even search for files containing a given string? Probably the #1 thing I do on GitHub is searching for files in a project containing a given string.
sph · 6h ago
Given how terrible GitHub search in files is, what I usually do is clone the repo and run ripgrep.
nicce · 5h ago
If the repository is indexed, there isn't really any search that competes with it. You can find blog posts about it. They actually used ripgrep at some point (not anymore, I guess, because it was too slow?).
Not sure when you tried last, but it's gotten a lot better over the years. If you need something from the latest master, you'll be able to find it.
mrweasel · 5h ago
But Github is actually pretty good at searching for something across all files in a repo.
IshKebab · 3h ago
Not remotely as good as grep.app.
throwaway290 · 3h ago
I'm not being sarcastic but how do you do it on github?;) it basically never works
Not only are the results incomplete, but it seems that once they went into training LLMs on all the code they host, they made sure no one else can do the same easily, and so now everything is madly rate limited.
Every time I just clone and grep.
jimbob45 · 6h ago
That exact exercise filled a quarter of my workday today.
mmis1000 · 6h ago
I think it would be great if git had some kind of soft lock by default (like attaching a note to a file without making it part of an actual commit). It could probably make people's lives easier when you and your teammates need to communicate which files you are changing, and thus reduce the chance of conflicts.
mashlol · 5h ago
FWIW git lfs does have support for locking files.
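Roughly (assuming git-lfs is installed and the server implements the locking API; the file path is a placeholder):

    git lfs track --lockable "*.psd"   # mark binary assets as lockable
    git lfs lock art/hero.psd          # tell the server you're editing it
    git lfs locks                      # see who holds which locks
    git lfs unlock art/hero.psd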
spookie · 6h ago
Yeah, especially binaries.
mhh__ · 6h ago
Git should have issue support or something like it as a convention but pull requests are an abomination that we are stuck with. No thank you.
captn3m0 · 6h ago
Not Git, but several forges are working towards an ActivityPub-based federation format for these: https://f3.forgefriends.org/
eru · 6h ago
Git was invented with pull requests in mind. It's just that they were originally meant to be sent via email, not on the web.
kace91 · 6h ago
Can you expand on that? I’m probably younger but I can’t imagine a more comfortable way to review code.
eru · 6h ago
Pull requests are great, but the typical github UI isn't necessarily the best way to review code.
It's often useful. But sometimes you want to use other tools, like firing up your editor to explore.
baq · 4h ago
It’s only good if you haven’t tried anything else. Check out gerrit, but there are many more tools and workflows.
Note we’re talking about the GitHub UI mostly. Pulling and merging a remote branch is a basic git operation, almost a primitive.
snickerbockers · 6h ago
ironically hardly anybody outside of the linux kernel community uses it the way it was intended lol.
Didn't all this start with Linus getting into a spat with the bitkeeper dev involving some sort of punitive measure as a response to somebody making a reverse-engineered FOSS client? I don't remember the details and I'm sure I have at least half of them wrong, but that's easily one of the most disastrous decisions in the history of the software business, right up there with valve turning down minecraft and EA refusing to make sports games for the SEGA dreamcast (that last one isn't as well known but it led to SEGA launching the 2k sports brand, which outlasted the dreamcast and eventually got sold to a different company but otherwise still exists today and is still kicking EA's ass on basketball games).
eru · 5h ago
That's how git started.
But there were already quite a handful of other distributed version control systems around by the time git showed up.
So if Linus hadn't written git, perhaps we would be using darcs these days. And then we'd be debating whether people are using darcs the way it was intended. Or bazaar or monotone or mercurial etc.
I don't think what the original authors of any one tool intended matters very much, when there were multiple implementations of the idea around.
vintermann · 6h ago
> Didn't all this start with Linus getting into a spat with the bitkeeper dev
It's a joke that the bitkeeper dev has two revision control systems named after him, Mercurial and Git.
bitwize · 6h ago
I've heard the one that says much like Linux, Git is named after Linus himself.
midnightclubbed · 6h ago
EA not making sports games for Dreamcast wasn’t a bad decision for EA. It cost Sega a huge amount of money to produce and license their own sports games exclusively for Dreamcast, not having EA sports was a huge blow.
And while NBA 2k destroyed NBA Live it took until 2009 for that to start happening (long after Sega ownership), mainly down to sliding standards in EA’s NBA Live titles and eventually some disastrous EA launches.
rowanG077 · 5h ago
I don't see how EA creating their biggest rival is anything but a bad decision for them. Had they licensed their games, they would have had a monopoly and probably millions more sales.
formerly_proven · 6h ago
It would've made sense to change many defaults in git for "normal users" ages ago (git 2?) instead of keeping the kernel-workflow defaults.
ratatoskrt · 7h ago
To be fair, Git itself is a bit of a pain, and GitHub's main achievement is/was to make it somewhat bearable.
casenmgreen · 7h ago
I regard the Git docs as being fully equal to scientific Wikipedia articles.
Everything is fully and completely explained, in terms which mean nothing.
eru · 6h ago
I find both Wikipedia and Git docs typically more useful than this. Much more.
"In astronomy, declination (abbreviated dec; symbol δ) is one of the two angles that locate a point on the celestial sphere in the equatorial coordinate system, the other being hour angle. The declination angle is measured north (positive) or south (negative) of the celestial equator, along the hour circle passing through the point in question."
Would anyone who doesn't know what declination is learn it from reading the introductory paragraph of this scientific Wikipedia article?
Anyone? no? :-)
I rest my case, m'lud.
executesorder66 · 29m ago
I've never heard of it before, and it makes perfect sense what it is from that intro.
On a celestial sphere (planet, star, etc), the declination angle is 0 at the equator, 90 degrees at the north pole of the sphere, and -90 degrees at the south pole.
You also need another angle known as the "hour angle" to locate a point on the sphere. It doesn't explain what that is, but as can be seen on Wikipedia, you can easily click on that word to go to the entire page that explains what it is.
What don't you understand?
squigz · 2h ago
> Anyone who doesn't know what declination is, know from reading the introductory paragraph of this scientific Wikipedia article?
Why should this be a metric one would want Wikipedia to meet? It's an encyclopedia, not an astronomy course.
Of course, the brilliance of Wikipedia is that if you think you can write a clearer intro, you can do so! You could even add it to the simple language version of the page - https://simple.wikipedia.org/wiki/Declination
spookie · 6h ago
To be fair, most of its difficulty is realized when you're stuck with a teammate rewriting history. Who, much like anyone doing the same, hasn't bothered reading a book explaining things.
jamienicol · 5h ago
That problem is solved by preventing forced pushes. Rewriting history locally is encouraged.
Tainnor · 4h ago
Prevent forced pushes on protected branches (develop, main, hotfix etc.). I don't care if somebody force pushes their private feature branch.
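On a plain self-hosted bare repo, roughly the same policy looks like this (forges expose it as "protected branches"; per-branch rules would need a pre-receive hook or the forge's settings):

    git config receive.denyNonFastForwards true   # reject history rewrites on push, repo-wide
    git config receive.denyDeletes true           # reject branch deletions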
cmrdporcupine · 1h ago
Force pushing onto PR branches is the only way to make the commit history in them sane.
But GH's PR process is broken anyway. I miss Gerrit.
baq · 6h ago
If you don't rewrite history in git, I don't want to bisect in your repos.
If you push rewritten history to master, you're a git.
Conclusion: learn your tools.
mkesper · 5h ago
The modern workflow is just to let GitHub squeeze your shit commits into one and then rebase that.
baq · 5h ago
Hardly anything modern about it, but it's a way of keeping a somewhat sane history. Certainly better than merging 'fix' 'fix' 'fix comments' into master.
The thing is, we could have done better (and have been) since before git even existed.
cmrdporcupine · 1h ago
There are legit reasons to have a series of commits within one PR, and rebase and merge them as is, and use amend/fixup and force pushes to maintain them cleanly.
It's not my favourite process, but...
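For reference, the amend/fixup flow being described is roughly this (the branch name and sha are placeholders):

    git commit --fixup=<sha-of-the-commit-being-corrected>   # record the correction
    git rebase -i --autosquash main                          # fold the fixup into its target commit
    git push --force-with-lease origin my-feature            # safer than a plain --force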
tester756 · 6h ago
No, git's CLI is terrible mess.
mmis1000 · 6h ago
In some sense, git is actually like an advanced zip versioning system. A commit is literally just a snapshot of the code base, except it tells you what the previous version of this version is.
Also, git stores the files in a smarter way, so the file size won't explode like it does with zip versioning.
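You can see that structure directly with git's plumbing; a rough sketch:

    git cat-file -p HEAD
    #   tree   <sha>   <- the full snapshot of the code base at this commit
    #   parent <sha>   <- the previous version (merge commits list more than one)
    #   author, committer and message follow
    git cat-file -p 'HEAD^{tree}'   # the snapshot itself: just a list of blobs and sub-trees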
eru · 6h ago
> A commit is literally just a snapshot of code base except it tell you what is the previous version of this version.
Or previous versions. Plural. Yes.
Well, that's one half of git. The other half is tooling to work with the snapshots and their history, eg to perform merges.
mmis1000 · 16m ago
On the other hand, the other parts of git don't really work only for git. Creating and applying diffs also works for a plain folder without git history. They are a big part of the ecosystem while not being bound to git in a strict way either.
johannes1234321 · 6h ago
The reason is that it is more than code. Managing identity is hard, and for many projects, besides having a source of truth for the repository, you also need some degree of project management (bug tracking).
And: even though the source of truth is centralized on GitHub for many projects, git still benefits from being distributed. It's the basis for "forks" on GitHub and for the way people develop - cloning locally, committing locally, and preparing the change set for review. In the CVS/SVN days one had to commit to the central branch way sooner and more directly.
eru · 6h ago
Yes, in git you get the benefit of fine-grained version control while you are still exploring.
Then later on for the PR, you can sanitise the whole thing for review.
In the bad old days, you only got the latter. (Unless you manually set up an unrelated repository for the former yourself.)
NexRebular · 5h ago
It really is a tragedy that git monoculture is winning over Mercurial, Fossil and other better designed alternatives. Don't even have a nice github-like service for Mercurial anymore as Bitbucket decided to give up.
int_19h · 3h ago
This happened mostly because the benefits of those other tools over git are so marginal that they don't provide a strong motivation to pick them over git unless everything else is equal. With GitHub in the picture, everything else is not equal, and so...
1wd · 3h ago
heptapod is GitLab with Mercurial support.
NexRebular · 2h ago
Which I used until they stopped releasing prebuilt packages without subscription.
ghosty141 · 5h ago
I don't get this. Git is still distributed, even if the "main" repo is on github, everybody still has a local copy. You are confusing project management (which github effectively does) and git. Git is still git, github is just a project management tool with git integration.
In the Linux kernel the project management is done via email (which is also just a centralized webserver in the end), so whats the problem?
miyuru · 5h ago
The problem is that a lot of dev tools have centralized on GitHub, so much so that we cannot use IPv6-only servers for development, because GitHub does not support IPv6.
From what I use, composer and brew rely on GitHub to work.
When you clone a repo you store it on your computer, too. Don’t confuse version control with CI servers/bug trackers/software forges.
cookiengineer · 2h ago
To be fair: Linus didn't predict how painful email is in 2025. Self hosting email is a useless attempt if 99% of your emails land in spam anyways, and only spammer emails land in your inbox because they pay for azure or google business accounts.
The general issue that git has is making them interact with each other. I would love for git to get distributed issues, and a nice client UI that is actually graphical and usable by non-terminal users.
There were some attempts to make this distributed and discoverable via similar seed architectures like a DHT. For example, radicle comes to mind.
But staying in sync with hundreds of remotes and hundreds of branches is generally not what git is good at. All UIs aren't made for this.
I'm pointing this out because I am still trying to build a UI for this [1] which turned out to be much more painful than expected initially.
The code is still distributed. Every git clone usually creates a new, self-preserving copy (if we ignore some special flags). The problem is those features which GitHub offers outside of code. And I guess the irony is that GitHub's success is probably the reason nobody is adding them to git itself. Like: add some subfolders into the repo for issues, wiki, discussions, etc. and have a UI for handling them all - easy. Instead, we have forges & tools supporting separate repos with flavours of widely used formats, making everything more complicated...
lmm · 6h ago
Turns out the important part wasn't the distributed-ness at all (unless you count being able to work offline). Many such cases.
globular-toast · 6h ago
Oh it is, but I think people forget what the distributed model gets you. It isn't just about having a completely decentralised workflow. When you clone a repo you have everything you need to keep working on that project. You have your own copy of all the branches which you are free to do whatever you want with. This is what makes it fast. Every clone has a brand new master branch and you never needed to ask anyone or get agreement to get your own branch to work on. Commits on your branch will never interfere with anyone else's work. You don't need to lock files and complete your work as quickly as possible. You can do as many commits as you like, a hundred in a day is not unheard of, because it's your branch. Previously people would commit once a day at most and sometimes not even until the end of the week, which is just unthinkable to a git user. A git clone is your own personal repo which allows you to use version control before you even share anything with anyone.
eru · 5h ago
> You have your own copy of all the branches which you are free to do whatever you want with.
That's the default. But git would work just as well, if by default it was only cloning master, or even only the last few commits from master instead of the full history.
You can get that behaviour today, with some options. But we can imagine an alternate universe were the defaults were different.
Most of what you say, eg about not needing lockfiles and being able to make independent offline commits, still applies.
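For reference, the kinds of options being alluded to (the URL is just a placeholder):

    git clone --single-branch --branch main https://git.example.org/project.git   # one branch's history only
    git clone --depth 1 https://git.example.org/project.git                       # just the latest commit
    git clone --filter=blob:none https://git.example.org/project.git              # full history, file contents fetched on demand (needs server support)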
globular-toast · 5h ago
The point wasn't really about having your own copy of the commit history, it's about having your own copy of the refs (which is all a branch is in git). Basically, your master branch is not the same branch as GitHub's master branch or anyone else's. This is one of the things people don't really seem to understand about git. It means you don't have to do the "feature branch" thing, for example, you can just do commits on your master branch then submit a PR.
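A quick way to see that the refs really are separate namespaces:

    git branch                       # your branches, e.g. master
    git branch -r                    # remote-tracking refs, e.g. origin/master
    git log master..origin/master    # commits on their master that your master doesn't have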
spookie · 6h ago
Yup. It's an extremely powerful workflow: you won't fear trying new ideas, you aren't fully committed to them (hehe).
novaRom · 6h ago
Linus Torvalds is one of those people whose impact on the world is significant even though he was not driven by financial incentive. It's crazy how much one person can change things just by solving their own problems really well.
No comments yet
vasco · 7h ago
Most people have it at least in two places if they work alone and in many places if they work with others. Having a consistent central UI doesn't take away from the distributed part, while adding a bunch of goodies.
csomar · 5h ago
distributed ≠ decentralized. The point of distributed is to keep a copy of your own. The point of decentralized is to not have a central point of authority.
contravariant · 5h ago
I'm fine with it as long as ssh still works.
m-schuetz · 5h ago
I have no use for a distributed source control system. I want my stuff consolidated at one place.
starspangled · 4h ago
Really? You rm -rf your working trees each evening before you finish, and git clone them from github in the morning? :)
I store my code in a completely distributed fashion, often in several places on different local devices (laptop, build server, backup, etc) not to mention on remote systems. I use github and gitlab for backup and distribution purposes, as well as alternative ways people can share code with me (other than sending patch emails), and other people use git to get and collaborate on my work.
distributed version control system doesn't mean distributed storage magically happens. You still need to store your code on storage you trust at some level. The distributed in DVCS means that collaboration and change management is distributed. All version control operations can be performed on your own copy of a tree with no other involvement. Person A can collaborate with person B, then person B can collaborate with person C without person A being in the loop, etc.
ahoka · 6h ago
Git wouldn't be mainstream without GitHub though.
dijit · 6h ago
It might feel like that now, but in 2011 github was just one of a bunch of code forges and at the time they were all similar in quality.
Gitorious was chosen for the meego/maemo team for example.
petepete · 5h ago
In those days GitHub probably had more eyes on it in a day than Gitorious did in a quarter.
And I am one of the people saddened by the convergence on a single platform.
But you can't deny, it's always been pretty great.
Double_a_92 · 3h ago
How is it a single place if every dev has a full copy of the repository? Also unless it's some software that each user customizes and builds for themselves, you still need some kind of way to tell which is the official version.
OtomotO · 6h ago
I find this comment really interesting, because NONE of my clients in the last 10 years of (self-) employment had even a single codebase on GitHub.
I am contributing to a few open source projects on GitHub here and there though.
voidspark · 5h ago
GitHub is not Git.
Git is by far the most widely used VCS. The majority of code hosting services use it.
Moving to git is understandable (Mozilla was using mercurial) but Github, really?
It's not like the hairy C++ code base of Firefox will suddenly become less scary and attract more open source developers simply because it's moving to Github.
tester756 · 6h ago
Why should we care about what Linus invented?
TZubiri · 5h ago
pretty cool that we have a distributed version control system but people still complain that the distributed version control system is not itself hosted on a public distributed version control system like an ouroboros of transparency so transparent that you can't even see the thing and you lose it because you don't know where it is and you lose yourself in a maze of infinitely branching dependency tree of self hosted bug trackers and federated account systems so that you can keep track of your bug reports and compile the bug tracker from scratch and all of a sudden you are building linux and you want to report a linux bug, but you need to send an email so you build an HTTP server but you don't have a C compiler yet so you download the latest C source code, but you don't have a C compiler to compile it, so you just make a github account and learn to compromise on your ideals. Welcome to adulthood.
Barrin92 · 6h ago
It's no more surprising than the fact that we invented distributed protocols to talk online and yet people use gmail or Facebook rather than sending data over the wire themselves.
People who are very insistent on distributed solutions never seem to understand that the economic, social and organizational reasons for division of labor, hierarchy and centralization didn't suddenly go away.
moralestapia · 7h ago
Care to explain?
joha4270 · 7h ago
If GitHub went down, how much would it impact the open source world?
Sure, there would be local copies everywhere, but for a distributed version control system, it's pretty centralized at GitHub.
kgeist · 6h ago
If GitHub went down, the code would be fine (just announce a new official URL), but the main thing that would be lost is issues and pull requests. Maybe Git should add official support for issues and pull requests in its metadata to be fully decentralized.
nurumaik · 6h ago
Fully decentralized metadata so we can finally have merge conflicts in PR comments while discussing merge conflicts
joha4270 · 5h ago
Yes, as I mentioned, there are plenty of local copies of the code floating around.
Everything else... as the original comment said, is pretty centralized for a decentralized system.
aucisson_masque · 7h ago
Linus Torvalds invented git, which is what's used by GitHub and others like gitlab.
sorbusherra · 6h ago
which is also owned by microsoft, which uses github data to train large language models. So, after decades of trying to kill linux and sabotage it, they finally figured out how to own it.
masfoobar · 6h ago
As a Free software supporter, it's just a matter of time before we lose out on our freedoms. Honestly, once Linus retires I do think Linux will continue to thrive with a good team, but Linux, the kernel, will either have to adapt to current times (whatever that may be in the future) or something else will replace it and, likely, some AI aspect on top.
It wont be free software and, likely, it will be Microsoft.
guappa · 3h ago
Linus doesn't care about free software. You're thinking of rms.
octocop · 5h ago
Nice, I was just checking yesterday to find the source code of firefox. Even if it is only a mirror it's a nice step to make it more available I think.
nikolayasdf123 · 3h ago
nice, GitHub is the de facto place to keep and release code
edelbitter · 3h ago
So no IPv6 in the foreseeable future?
DennisL123 · 6h ago
A BUILD.md could be useful.
kristel100 · 4h ago
Honestly surprised it took this long. For a project that depends so much on community contribution, being on GitHub just lowers the barrier for new devs. Curious to see if this revives contribution velocity.
berkes · 4h ago
In the very least, it will open up FTEs that can now work on what makes Mozilla projects unique, rather than on building and maintaining generic fundamentals.
It's a pet-peeve and personal frustration of mine. "Do one thing and do that well" is also often forgotten in this part of Open Source projects. You are building a free alternative to slack? spend every hour on building the free alternative to slack, not on selfhosting your Gitlab, operating your CI-CD worker-clusters or debugging your wiki-servers.
cubefox · 2h ago
I assume this is now one of the largest (lines of code) projects on GitHub.
rvz · 6h ago
Centralizing everything to GitHub really isn't a good idea given their frequent incidents every week.
roschdal · 4h ago
Firefox moves to GitHub. Now someone better make a fork to make a proper web browser, small, fast, lean and without bloat and surveillance.
sylware · 2h ago
Bad Move.
github.com broke noscript/basic (x)html interop for most if not all core functions (which were working before). The issue system was broken not that long ago.
And one of the projects which should worry about, even enforce, such interop, moving to microsoft github...
The internet world is a wild toxic beast.
mhh__ · 6h ago
Inevitable. GitHub is a good platform in need of some proper dev workflows (pull requests are atrocious, branches footguns, yml driven CI is a noose) but they've obviously won.
jopsen · 5h ago
I don't think Firefox is moving to Github Actions anytime soon. I was pretty involved with the TaskCluster setup years ago, and it still seems to be running a bunch of CI things.
mozilla-central has a LOT of tests -- each push burns a lot of compute hours.
mentalgear · 6h ago
Would have been great if they used an European alternative ( like Codeberg ).
selectnull · 6h ago
Mozilla is US organization, why would they care to?
neilv · 5h ago
As for European specifically, maybe the commenter was talking about data protection laws. If not, maybe (in many European countries at the moment) less national or business background of ruthlessness.
I was thinking something different: I wonder whether Mozilla considered GitLab or Codeberg, which are the other two I know that are popular with open source projects that don't trust GitHub since it sold out to Microsoft.
(FWIW, Microsoft has been relatively gentle or subtle with GitHub, for whatever reason. Though presumably MS will backstab eventually. And you can debate whether that's already started, such as with pushing "AI" that launders open source software copyrights, and offering to indemnify users for violations. But I'd guess that a project would be pragmatically fine at least near term going with GitHub, though they're not setting a great example.)
selectnull · 4h ago
Given the Mozilla direction lately, the last thing they want is good data protection laws.
fsflover · 3h ago
This is a huge exaggeration, borderline dishonest attack.
selectnull · 3h ago
Time will tell. I would love to be wrong.
fsflover · 13m ago
You didn't even provide any actual context making it impossible to argue with you. HN should have better conversations than shallow dismissals (according to the guidelines).
pparanoidd · 6h ago
lol codeberg is down right now, bad timing
berkes · 4h ago
I've used Codeberg for some projects and while their work and services are impressive and their progress steady and good, it's really not a proper alternative to Github for many use-cases.
"It depends", as always, but codeberg lacks features (that your use-case may not need, or may require), uptime/performance (that may be crucial or inconsequential to your use-case), familiarity (that may deter devs), integration (that may be time-consuming to build yourself or be unnessecary for your case) etc etc.
petepete · 5h ago
I wonder how long it'll take for my PR, which entirely removes the built-in Pocket integration, to be dismissed.
reddalo · 7h ago
Why GitHub? If they truly cared about open-source they would've chosen something else, such as a self-hosted Forgejo [1], or its most common public instance Codeberg [2].
I would argue that part of "truly caring" about open-source is being where the contributors and community are. That's probably a large part of the move to GitHub, and neither of these other options would achieve that. As much as one can say "git is distributed, the server doesn't matter", the centre of the community very much does matter, and for better or worse that's currently Github.
arccy · 4h ago
If you maintain a popular project, you'll quickly find that github prs are a massive source of spam and low quality prs with people that don't even bother to follow up.
Bad PRs all around, with just a constant stream of drive by "why no merge?!?!?!" comments.
Tepix · 4h ago
We need to work on decentralisation of git forges, making it less relevant where a project is hosted by offering cross-instance collaboration and discoverability.
protocolture · 7h ago
If they truly cared about open source they would have hosted their own git on a run down pentium 2 in a nerds basement, never washed, and spent most of their time complaining online.
freddie_mercury · 7h ago
To assert that an organisation doesn't "truly" care about open source simply because they've chosen a tool that isn't is ridiculous.
Even before this Mozilla almost certainly used hundreds of closed source tools, including things like Slack, Excel, Anaplan, Workday, etc.
Lightkey · 4h ago
Using proprietary software in-house for management is one thing, forcing outside contributors to use them, another. That is why they went out of their way to avoid Slack when the time came to leave IRC behind and they chose Matrix instead.
mzi · 7h ago
Codeberg unfortunately have an abysmal uptime track record.
gsich · 7h ago
Probably only for visibility. Or MS is in the process of sponsoring them.
pndy · 6h ago
Considering the image backlash they had over the last year - acquiring an ad tech company created by former Meta people (which in turn led to introducing the so-called "privacy preserving attribution" feature for ad tracking), changing ToS terms regarding data collection, firing a CPO who was diagnosed with cancer - I do believe all these little changes are PR stunts in an attempt to regain the trust of users who strongly criticised Mozilla last year and earlier.
They should restructure instead, and hire people who actually want to work on software rather than use the corporation and foundation around it as a platform for their... peculiar "endeavours". But I doubt that's gonna happen - the flow of Google cash, and of money from all those naive people who think supporting Mozilla directly contributes to Firefox, is too good, it seems. But then it's understandable that they do this - the Google money tap can get twisted shut.
aucisson_masque · 7h ago
> MS is in the process of sponsoring them.
I think you might be onto something. With the incoming end of the Google cash flow, Firefox may be in discussions with Bing, and using Microsoft's servers could be part of the agreement.
rurban · 6h ago
I maintain some project on all forges in parallel, even savannah. Savannah is even the default. But 99% of all reports and contributions are on the github mirror, 1% on savannah, 0% on gitlab and 0% on codeberg. Nobody cares about those islands.
Issues are stored in git-bug and automatically synced. GitHub is the only viable option, but you can keep the others as mirrors for when GitHub chooses to strike you.
AStonesThrow · 7h ago
> If they truly cared about open-source
Perhaps Microsoft offered to pick up the tab that Google has been paying, but is now imperiled, or at least lend some sort of financial support, and Firefox cares more about paying their bills than open source
Firefox has enough contributors that write contention also becomes a problem; for example pushing to the "try" repo (to run local patches through CI) often ended up taking tens of minutes waiting for a lock. This was (recently) mostly hidden from end users by pushing patches through a custom "lando" system that asynchronously queues the actual VCS push rather than blocking the user locally, but that's more of a mitigation than a real solution (lando is still required with the GitHub backend because it becomes the place where custom VCS rules, which previously lived directly in the hg server but which don't map onto GitHub features, are enforced).
[1] https://mozilla-version-control-tools.readthedocs.io/en/late...
It is free and robust, and there is not much bad Microsoft can do to you. Because it is standard git, there is no lockdown. If they make a decision you don't like, migrating is just a git clone. As for the "training copilot" part, it is public, it doesn't change anything that Microsoft hosts the project on their own servers, they can just get the source like anyone else, they probably already do.
Why not Codeberg? I don't know, maybe bandwidth, but if that's standard git, making a mirror on Codeberg should be trivial.
That's why git is awesome. The central repository is just a convention. Technically, there is no difference between the original and the clone. You don't even need to be online to collaborate, as long as you have a way to exchange files.
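Roughly what that looks like in practice (URLs, branch names and paths are placeholders):

    # keep a full mirror on a second forge
    git clone --mirror https://github.com/example/project.git
    cd project.git && git push --mirror https://codeberg.org/example/project.git

    # collaborate offline by exchanging a file instead of talking to a server
    git bundle create changes.bundle main~10..main   # assumes the recipient already has the older commits
    git fetch /path/to/changes.bundle main           # the recipient fetches from the bundle like a remote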
Recently I also got "rate limited" after opening about three web pages.
Microsoft can do something to you, and that is to arbitrarily deny you access after you've built a dependence on it, and then make you jump through hoops to get access back.
People who haven't used it while logged out recently may be surprised to find that they have, for some time, made the site effectively unusable without an account. Doing one search and clicking a couple of results gets you temporarily blocked. It's effectively an account-required website now.
However, it is clearly not correct to say that you were banned from GitHub. It’s like saying “I was banned from Google because I refuse to use computing devices.”
Not really a ban, just self flagellation, which, again, whatever works for you.
Of course Mozilla is free to make their own choices. But this choice will be read as the latest alarm bell for many already questioning the spirit of Mozilla management.
Given the post above, issues regarding self-hosting were at least part of the reason for the switch so a new self-hosted arrangement is unlikely to have been considered at all.
I don't know what the state of play is right now, but non-self-hosted GitLab has had some notable performance issues (and, less often IIRC, availability issues) in the past. This would be a concern for a popular project with many contributors, especially one with a codebase as large as Firefox.
I used a GitLab + GitLab Runner (docker) pipeline for my Ph.D. project, which did some verification after every push (since the code was scientific), and even that took 10 minutes to complete even though it was pretty basic. Some Debian packages need more than three hours in their own CI/CD pipeline.
Something like Mozilla Firefox, which is tested against regressions, performance, etc. (see https://www.arewefastyet.com) needs serious infrastructure and compute time to build in n different configurations (stable / testing / nightly + all the operating systems it supports) and then test at that scale. This needs essentially a server farm, to complete in reasonable time.
An infrastructure of that size needs at least two competent people to keep it connected to all relevant cogs and running at full performance, too.
So yes, it's a significant effort.
Firefox does indeed have a large CI system and ends up running thousands of jobs on each push to main (formerly mozilla-central), covering builds, linting, multiple testsuites, performance testing, etc. all across multiple platforms and configurations. In addition there are "try" pushes for work in progress patches, and various other kinds of non-CI tasks (e.g. fuzzing). That is all run on our taskcluster system and I don't believe there are any plans to change that.
Your guess is wrong as Firefox doesn't use GitHub for any of that, and AFAIK there are no plans to either.
The blog post linked in the top comment goes into this in some detail, but in brief: git log, clone, diff, showing files, blame, etc. are CPU expensive. You can see this locally on a large repo if you try something like "git log path/to/dir".
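Rough examples of the kinds of operations that get expensive once a repo has Firefox-scale history (the paths are placeholders):

    git log -- path/to/dir        # walks the whole commit graph, filtering by path
    git blame path/to/file.cpp    # attributes every line of a file to a commit
    git rev-list --count HEAD     # even counting commits means walking all of history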
Add to this all the standard requirements of running any server that needs to be 1) fast, and 2) highly available.
And why bother when there's a free service available for you?
Given the frequency I see comments on this site about Mozilla trying to do far too much rather than just focusing their efforts on core stuff like Firefox, I'm honestly a bit surprised that there aren't more people agreeing with this decision.
Even with the other issues I have with Mozilla lately (like the whole debacle over the privacy policy changes and the extremely bizarre follow-up about what the definition of "selling user data" is), I don't see it as hypocritical to use GitHub while maintaining a stance that open solutions are better than closed ones, because trying to make an open browser in the current era is a large and complicated enough goal that it's worth setting a high bar for taking on additional fights.
Insisting on spending effort to maintain their own version control servers feels like an effort they don't need to be taking on right now, and I'd much rather Mozilla pick their battles carefully like this more often than less. Fighting for more open source hosting at this point is a large enough battle that maybe it would make more sense for a separate organization focused on that to be leading the front; providing an alternative to Chrome is a big enough struggle that it's not crazy for them to decide that GitHub's dominance has to be someone else's problem.
I would love to see Mozilla moving to Codeberg.org (though I’d ask if they’re okay with it first) or something like that. Using GitHub is okay-ish? Personally, I frown upon it, but again I agree – it’s not the most important issue right now.
I'm not claiming that my comment was 100% accurate, but they plan to move some of the CI to GitHub, at least.
Really? I've seen no indication of that anywhere, and I'd be amazed if they did.
They're not using github PRs, and github actions really fights against other development workflows... not to mention they already have invested a lot in TaskCluster, and specialized it to their needs.
Where are you getting that from?
Grim.
The best reason to be using github at all is to maximize the portion of your users who are comfortable submitting bug reports, as they already have an account and are familiar with how the platform works (due to network effects.) Projects which host code on github but chose not to take bug reports there are effectively gate keeping bug submission, by asking their users to jump through the hoops of finding the site, signing up for it, and learning to use a new interface. I've done this before, with Bugzilla and Firefox, to submit a bug report for an accessibility bug on MacOS and it was a pain in the ass that I put off for a long time before becoming annoyed enough to go through the process. (End result: the bug was confirmed but never fixed..)
That said, there are also other teams and projects who do use GitHub for issue tracking. However the closer to Firefox/Gecko you are the harder this gets. For example it's hard to cross-reference GitHub issues with Bugzilla issues, or vice versa. I've seen people try to build two-way sync between GitHub and Bugzilla, but there are quite considerable technical challenges in trying to make that kind of cross-system replication work well.
However your point that GitHub makes issue submission easier for people who aren't deeply embedded in the project is a good one. I'm directly involved with webcompat.com, which aims to collect reports of broken sites from end users. It's using a GitHub issue tracker as the backend; allowing developers to directly report through GitHub, and a web-form frontend so that people without even a GitHub account can still submit reports (as you can imagine quite some effort is required here to ensure that it's not overwhelmed by spam). So finding ways to enable users to report issues is something we care about.
However, even in the webcompat.com case where collecting issues from people outside the project is the most important concern, we've taken to moving confirmed reports into bugzilla, so that they can be cross-referenced with the corresponding platform bugs, more easily used as inputs to prioritization, etc. That single source of truth for all bugs turns out to be very useful for process reasons as well as technical ones.
So — (again) without being any kind of decision maker here — I think it's very unlikely that Firefox will move entirely to GitHub issues in the foreseeable future; it's just too challenging given the history and requirements. Having some kind of one-way sync from GitHub to Bugzilla seems like a more tractable approach from an engineering point of view, but even there it's likely that there are non-trivial costs and tradeoffs involved.
> are effectively gate keeping bug submission
Of course this could be a benefit… Have you seen the quality of bug reports coming from some people, even other devs? :-)
I think you can dislike the general move to a service like GitHub instead of GitLab (or something else). But I think we all benefit from the fact that Firefox's development continues and that we have a competing engine on the market.
Both patches have been ignored thus far. That's okay, I understand limited resources etc. etc. Will they ever be merged? I don't know. Maybe not.
I'm okay with all of this, it's not a complaint. It's how open source works sometimes. But it also means all that time I spent figuring out the contribution process has been a waste. Time I could have spent on more/other patches.
So yeah, there's that.
It's certainly true that making the bar higher will reduce low-quality contributions, because it will reduce ALL contributions.
(aside: FreeBSD does accept patches over GitHub, but it also somewhat discourages that and the last time I did that it also took a long time for it to get reviewed, although not as long as now)
There's no easy solution. Much like the recent curl security kerfuffle, the signal:noise ratio is important and hard to maintain.
Email is simple. It's just text, there's no weird javascript or html or lag. I don't have to open X11. I can just open mutt and read or write. I can type "git send-email". It's all open source, so I can read the code to understand it, and write scripting around it. It runs on any computer with ease. Even on a slow connection, it's quite speedy.
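The whole round trip is a couple of commands (assuming send-email is configured; the address is a placeholder):

    git format-patch -3 -o outgoing/                                 # last three commits as plain-text patch files
    git send-email --to=project-devs@example.org outgoing/*.patch    # mail them to the list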
I totally agree with you about Phabricator though.
I have some unconventional workflows. And I try not to bother anyone else with it, especially in a volunteer driven open source context. It would be selfish to do otherwise.
To be honest based on what you've written here, keeping you out of my projects sounds like a good thing. What a bunch of piss and vinegar over how other people are choosing to work in a way that works for them.
Every contributor is valuable, it's in the name, the definition of "contribute".
Any bar to entry is bad, it certainly never is the solution to a different problem (not being able to manage all contributions). If anything, in the longer run, it will only make it worse.
Now, to be clear: while I do think GitHub is currently the "solution" that lowers barriers, lets more people contribute, and as such improves your open source project, the fact that this is so is a separate problem - there isn't any good alternative to GitHub (with a broad definition of "good"). Why is that, and what can we do to fix it, if anything?
In practice, if you get dozens of PRs from people who clearly did it to bolster up their CV, because their professor asked them or something like that, it just takes a toll. It's more effort than writing the same code yourself. Of course I love to mentor people, if I have the capacity. But a good chunk of the GitHub contributions I've worked on were pretty careless, not even tested, that kind of thing. I haven't done the maintainer job in a while, I'm pretty terrified by the idea of what effect the advent of vibe coding had on PR quality.
I feel pretty smug the way I'm talking about "PR quality", but if the volume of PRs that take a lot of effort to review and merge is high enough, it can be pretty daunting. From a maintainer perspective, the best thing to have are thoughtful people that genuinely use and like the software and want to make it better with a few contributions. That is unfortunately, in my experience, not the most common case, especially on GitHub.
But I just don't see how GitHub or a PR-style workflow relates. Like I said in my own reply: I think it's just because you'll receive less contributions overall. That's a completely fair and reasonable trade-off to make, as long as you realise that is the trade-off you're making.
Proposed contributions can in fact have negative value, if the contributor implements some feature or bug fix in a way that makes it more difficult to maintain in the long term or introduces bugs in other code.
And even if such a contribution is ultimately rejected, someone knowledgeable has to spend time and effort reviewing the code first - time and effort that could have been spent on another, more useful PR.
Quite obviously, any incidental friction makes this ever so slightly harder or less likely. Good contributions don't necessarily or only come from people who are already determined from the get go. Many might just want to dabble at first, or they are just casually browsing and see something that catches their attention.
Every project needs some form of gatekeeping at some level. But it's unclear to me whether the solution is to avoid platforms with high visibility and tools that are very common and familiar. You probably need a more sophisticated and granular filter than that.
No. I have definitely seen people who created a multitude of misleading bug reports and a flood of stupid feature requests. I personally did a bit of both.
There are people who do both repeatedly, who file issue reports without filling in the requested fields, or who open a new issue when their previous report was closed.
I once got a bug report where someone was ranting that the app was breaking their data. It turned out (after I wasted time investigating it) that the user had broken the data themselves with different software, through its misuse.
There were PRs adding backdoors. This is not a valuable contribution.
There were PRs done to foment useless harmful political mess.
Some people pretend to be multiple people and argue with themselves in pull requests or issues (using multiple accounts or, in more bizarre cases, a single one). Or they try to be listed multiple times as contributors.
Some people try to sneak in some intentionally harmful content one way or another.
Some contributors are NOT valuable. Some should be banned or educated (see https://www.chiark.greenend.org.uk/~sgtatham/bugs.html ).
Diversity, here too, is of crucial importance. It's why some Open Source software has sublime documentation and impeccable translations, while other software is technically perfect but undecipherable. It's why some Open Source software has cute logos or appeals to professionals, while other software remains the hobby project that no one ever takes seriously despite its technical brilliance.
For projects that I'd be interested in being a long-term contributor to, this is obviously different, but you don't become a long-term contributor without first dealing with the short-term, and if you make that experience a pain, I'm unlikely to stick around.
A big part of this is the friction in signing up; I hope federated forges become more of a thing, and I can carry my identity around and start using alternate forges without having to store yet another password in my password manager.
"Friction in signing up" being a big part for you is also weird, considering basically all free software GitHub alternatives (Gitea, GitLab, Forgejo) support SSO via GitHub.
Contributors who can't use GitHub because either 1) they are fresh and can't activate a new account, 2) their old grandfathered account is no longer usable, or 3) their old account is doxxed and they can no longer safely contribute under the old identity.
Once you trigger the phone-number verification requirement, your account is globally shadowbanned and support is blocked pending SMS code verification. Aside from the privacy issue, this completely blocks people in countries to which GitHub won't even try to send an SMS or place a call.
Remember that registering a second account would be violating GitHub ToS.
Not to mention that the AI-generated security "issues" reported against curl, for example, suggest there can indeed be negative value in reports, and contributions.
I don't think this is the place for a debate about the overall utility of open source.
Alternatives to GitHub
We lament Google's browser engine monopoly, but putting the vast majority of open source projects on GitHub is just the expected course to take. I guess we'll repeat history once Microsoft decides to set the enshittification in motion (maybe one day mobile OSes replace Windows and they're strapped for cash, who knows), but it's a centralised closed system owned by a corporation that absolutely adores FOSS.
I don't mind any particular project (such as this one) being on GitHub, and I can understand that Mozilla chooses the easy path; they've got bigger problems, after all. But it's not like there are no concerns with everyone and everything moving to GitHub.
GitLab? It was awful. Slow, and paying for that kind of experience felt like a bad joke. It's much better now but it was borderline unusable back in the day.
Or SourceForge, before Git was mainstream? Also terrible.
GitHub succeeded because it quickly established itself as a decent way to host Git - not because it was exceptional, but because the competition had abysmal UX.
Unlike other lock-in-prone services, moving a Git project is trivial. If GitHub loses its advantages due to enshittification, you just move. Case in point: Mozilla hopping on and off GitHub, as this article shows.
I think the real competition began around the same time.
not really
just moving issue tracker and discussions is highly annoying
trying to get your users to move is likely hard and you will lose many
still, may be easy in comparison
A lot more contributions on GH, but the majority of them ignored the guidelines and/or showed low code quality and little attention to detail. Just my anecdotal experience, of course.
* contributors need to start somewhere, so even broken PRs can lead to having a valuable contributor if you're able to guide them.
no.
Somehow I think you're holding the difficulty scale backwards!
Being a good coder has absolutely no correlation to being good at using Mercurial.
I struggled to understand how the two interacted with each other, and I didn't know how to 'update my branch/pr' and I eventually just gave up.
Their docs were also a mess back then and made me recompile everything even when it wasn't needed.
https://github.com/torvalds/linux
// EDIT: Source: https://news.ycombinator.com/item?id=43970574
https://github.com/mozilla-firefox/firefox/blob/main/.github...
I get it from GitHub’s perspective, it’s a nudge to get people to accept the core premise of ”social coding” and encouraging user pressure for mirrored projects to accept GitHub as a contribution entrypoint. I’m impressed by their successes and would attribute some of that to forced socialization practices such as not allowing PRs to be disabled. I’ve grown to dislike it and become disillusioned by GitHub over the course of a long time, but I’m in awe of how well it has worked for them.
https://github.com/mozilla
Now, both the desktop and the mobile version will be on Github, and the "issues" will stay on Bugzilla.
This will take advantage of both GitHub's good search and source browsing and Git's familiar system.
As a former Firefox and Thunderbird contributor, I have to say that I used local search instead of trying to find something on the mozilla-central website.
Of course, when you're actively developing software, you search inside your IDE, but making it easy to find things on the website makes it more welcoming for potential new contributors.
On the contrary, I find Searchfox to be the best code navigation tool I've used. It has nice cross-language navigation features (like jumping from a .webidl interface definition to the C++ implementation), it has always-on blame (with more features too), and despite that it's really fast and feels extremely lightweight compared to the GitHub interface. I really wish I had this for more projects, and I'll be sad if it ever dies.
Then MXR got replaced by DXR, itself replaced in 2020 by Searchfox (introduced in 2016).
https://discourse.mozilla.org/t/decommission-dxr/69475
https://billmccloskey.wordpress.com/2016/06/07/searchfox/
The source browsing has deteriorated severely relatively recently IME, to the point where it can't be called "good" anymore.
It now loads asynchronously (requiring JS) and lazily, randomly breaks on shaky connections, and in-page search is broken.
The recent issues/PRs revamp is also a pretty major step back. Try searching in PRs with all uBlock Origin lists enabled.
EDIT: skimming these comments, I like how none of the top comments are talking about the bigger story here, which is the move away from Mercurial to git; instead everyone is focusing on GitHub itself. This has essentially sealed hg away into obscurity forever. Do people not realise git is a program that runs on your computer and GitHub is just a service that uses git? Maybe this is an old-man gripe at this point, but I'm surprised at the lack of technical discussion around this.
GitHub also has a lot of features and authentication scopes tied to the whole org, which is pretty risky for an org as large as Mozilla.
https://wiki.mozilla.org/GitHub#other_github
Unfortunately often the cleaner option is to create a separate org, which is a pain to use (e.g. you log in to each separately, even if they share the same SSO, PATs have to be authorised on each one separately, etc).
In Gitlab, you would have had one instance or org for Mozilla, and a namespace for Firefox, another one for other stuff, etc.
It's like AWS accounts vs GCP projects. Yeah, there are ways around the organisational limitations, but the UX is still leaky.
Now it has "main" and "autoland", what are they? Which one is the equivalent of mozilla-central before?
The "new" git default branch name is 'main' and 'autoland' existed before next to 'mozilla-central' and is the one where commits usually appear first.
Commits land in autoland and get backed out if they cause test failures. That's merged to main ~twice per day when CI is happy
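If you just want to poke at the two branches yourself, a quick sketch (it's a big clone; add --depth 1 if you only want the tip):

    git clone https://github.com/mozilla-firefox/firefox.git
    cd firefox
    git switch main        # the default branch, formerly mozilla-central
    git switch autoland    # the integration branch where commits land first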
I've mostly encountered these branches/repos when checking commits linked to Bugzilla tickets, and I don't recall seeing "autoland" show up too much in those cases.
https://github.com/mozilla-firefox
https://www.phoronix.com/news/Firefox-Going-Git
On the other hand, the plethora of different self-hosted platforms with limited feature sets is a huge pain. Just finding the repo is often a frustrating exercise, and then trying to view, or worse, search the code without checking it out is often even more frustrating or straight out impossible.
Surely most open source projects have a link to their source code? Whether it's github, gitlab, sourcehut, or anything else?
But it's a lot of work to prevent abuse, especially for resource-intensive features when supporting signed-out use cases.
The killer feature is the collocation of features in a single forge. Combined with a generous free tier, it's the Windows XP of the ecosystem: everybody has it, everybody knows it, almost nobody knows anything else.
As for PRs: I'm sure Mozilla welcome contributions, but accepting GitHub PRs is going to be a recipe for thousands of low-value drive-by commits, which will require a lot of triage.
I agree it is rather basic but I don't see how it's hard to navigate.
> accepting GitHub PRs is going to be a recipe for thousands of low-value drive-by commits, which will require a lot of triage.
I don't think that really happens based on what I've seen of other huge projects on GitHub.
Jira and bugzilla are vastly superior to GH Issues.
Jira doesn't even deserve 10% of the hate it gets. Most of what makes Jira awful is the people using it. Bugzilla is getting a bit long in the tooth, but at least it's still free and open source.
I think you're in the tiny minority with that opinion.
> Most of what makes Jira awful is the people using it.
Not even close. Yes, people aren't good at administering it, but there are soooo many reasons that it's shit apart from that. Not least the hilarious slowness. Jira Cloud is so slow that not even Atlassian use it.
Also I don't think you can just say "you're holding it wrong". Part of the reason people screw up Jira configs so much is that it makes it so easy to screw them up. You can't separate the two.
> but at least it's still free and open source.
Just being open source doesn't make something good.
I'm not. The whole "I hate Jira" thing is a meme among a very vocal minority of tech enthusiasts. They wouldn't have tens of millions of users if Jira were awful. The reason so many people cry about it (apart from the meme factor) is that people conflate Jira with their team's failed approach to scrum.
Sure, it has rough edges, and sure, Atlassian as a company sucks. I have a bug report open on their Jira for some 20 years and I don't think it will ever get fixed. And yes, Jira Cloud is very slow, it's ridiculous. And in spite of that, GH Issues is still objectively worse. It's so far behind in terms of features that it isn't even a fair comparison.
Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
> Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
There is https://github.com/git-bug/git-bug - I would love it if people started to use it, even in a read-only way: use GitHub issues normally, but also have a bot that saves all comments to git-bug, so that I can read issues without an internet connection. Then, at a later date, make it so that people who file issues on git-bug also get the issue posted on GitHub, making a two-way bridge.
Then, optionally, at a later stage when almost everyone has migrated to git-bug, make the GitHub issues a read-only mirror of the git-bug issues. Probably not worth it: you lose drive-by comments from newcomers (who already have a GitHub account but have probably never heard of git-bug), raising the friction to report bugs.
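For what it's worth, a rough sketch of the read-only-mirror idea using git-bug's GitHub bridge. The command names are from the git-bug docs as I remember them and may have changed between versions, so treat this purely as illustration:

    # inside the repo: set up a bridge to the GitHub issue tracker (interactive)
    git bug bridge configure

    # pull issues and comments from GitHub into local git objects
    git bug bridge pull

    # read them offline
    git bug termui

Run the pull step from a cron job and you have the offline mirror described above; the two-way bridge is where it gets hairy.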
The literal project we are discussing is just code. It's literally just code. It doesn't have issues, PRs are disabled as much as they can be (by a GitHub action that automatically closes all PRs with a note that code should be submitted elsewhere), and all "other stuff" is disabled.
https://github.com/mozilla-firefox/firefox
Some big repos or organizations might be able to pull this off, but good luck having a small project and then directing users to go through all of those hoops to submit issues somewhere else, open PRs somewhere else, etc.
https://github.com/git-bug/git-bug/blob/master/doc/usage/thi...
I have not tried it.
You could, but generally people can't. They learn a set of narrow workflows and never explore beyond them. GitHub use translates into GitLab use, but not into general git use without a central repository.
> Everything surrounding code: issues, CICD, etc, is obviously another story. But it's not a story that is answered by distributed git either. (though I would love a good issue tracking system that is done entirely inside git)
Radicle offers one. CLI-based, too.
And tbh, that's how it should be for a version control system. Before git, with its byzantine workflows and a thousand ways to do the same thing, version control (e.g. svn) was a thing that just hummed along invisibly in the background, something that you never had to "learn" or even think about, much like the filesystem.
I don't need to know how a filesystem works internally to be able to use it.
And having a centralized store and history helps a lot to keep a version control system conceptually simple.
In git, working on your own branch is essential to not step on other people's feet and to get a clean history on a single main/dev branch (and tbf, git makes this easy for devs and text files). With a centralized version control system, both problems don't even exist in the first place.
When we did game development with a team of about 100 peeps (about 80 of those non-devs, and about 99% of the data under version control being in binary files) we had a very simple rule:
(1) do an update in the morning when you come to work, and (2) in the evening before you leave do a commit.
Everybody was working on the main branch all the time. The only times this broke down were when the SVN server in the corner was running out of space and we either had to delete chunks of history (also very simple with svn) or get more memory and a bigger hard drive for the server.
Subversion also isn't some thing humming along invisibly in the background, it has its own quirks that you need to learn or you'll get stung.
Tbh, I really wonder where the bad reputation of svn comes from. Git does some things better, especially for programmer-centric teams. But it also does many things worse, especially in projects where the majority of the data is large binary files (like in game development) - and it's not like git is any good either when it comes to merging binary data.
We used TortoiseSVN as UI which worked well both for devs and non-devs.
With this sort of setup, git would break down completely if it weren't for awkward hacks like git-lfs (which comes with its own share of problems).
The point is you CAN. Joe can in theory do it, and Steve can make an alternative piece of software to do it for Joe. In most other centralized places (like social media), you CANNOT. Joe cannot take his data off of Facebook and interact with it outside of the platform or move it to another platform.
If you happen to agree with it, then yeah, it's great. If you like to commit quick and dirty and then tidy it up by squashing into logically complete and self-consistent commits, too bad.
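In git, that tidy-up step is just interactive rebase; a minimal sketch (commit messages and branch names are placeholders):

    # commit quick and dirty while exploring
    git commit -m "wip: poking at the parser"
    git commit -m "wip: tests pass now"

    # then rewrite everything since origin/main into clean, self-contained commits
    git rebase -i origin/main
    # mark commits as squash/fixup in the editor, reword, then push for review

Whether the tool (or the project) lets you do that at all is exactly the disagreement here.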
You might like git-bug:
https://github.com/git-bug/git-bug
This should be one of the very first links in the readme.
Embrace, Extend..
(largely this is unfair, as plain git leaves much to be desired- but you can’t deny that the things surrounding git on github are very sticky).
However, if you were asking whether Fossil is comparable to git plus GitHub, then again: no.
It's a good call that the conversation here (the comments are almost interchangeable at times, haha!) is about everyone using git for Firefox - kind of a wild topic!
That's what Github is though, it's not about the code itself it's about all your project management being on Github, and once you move it, moving out isn't realistic.
The issue tracking can be a branch and then you just need a compatible UI. In fact some git front ends do exactly this.
CI/CD does already exist in git via githooks. And you're already better off using make/just/yarn/whatever for your scripts and relying as little on YAML as possible. It's just a pity that githooks require users to set them up each time, so many people simply don't bother.
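One way to soften that is to commit the hooks themselves and point git at them; a small sketch, assuming a .githooks directory of your own choosing:

    # keep hooks under version control
    mkdir .githooks
    cat > .githooks/pre-push <<'EOF'
    #!/bin/sh
    # run the tests before anything leaves the machine
    make test
    EOF
    chmod +x .githooks/pre-push

    # each contributor still opts in once per clone:
    git config core.hooksPath .githooks

It's still one manual step per clone - the exact friction complained about - but it's a single command rather than hand-copying files into .git/hooks.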
That's how we started out.
There are several such solutions already. The problem is that none of them is popular enough to become a de facto standard. And, of course, centralized git providers like GitHub have a vested interest in keeping it this way, so they are unlikely to support any such solution even if it does become popular enough.
Sad to see that Mozilla is becoming less and less what they promised to be, now that the Google funding is drying up.
For the actual event we are commenting on, they have disabled all features other than code hosting and PRs.
It's very silly they have to do this, but at least they can I suppose.
If you weren't connected to the internet, you couldn't do a thing. You couldn't check out. You couldn't commit. You couldn't create branches. The only thing on your computer was whatever you had checked out the last time you were connected to the server.
People talk about SVN, but it wasn't that common in 2005. None of the project hosting platforms (like SourceForge) supported SVN; they were all still offering CVS. If you wanted to use SVN, you had to set it up on your own server. (From memory, Google Code was the first to offer SVN project hosting, in mid-2006.) Not that SVN was much better than CVS. It was more polished, but shared all the same workflow flaws.
Before Git (and friends), nothing like pull-requests existed. If you wanted to collaborate with someone else, you either gave them an account on your CVS/SVN server (and then they could create a branch and commit their code), or they sent you patch files over email.
The informal email pull requests of git were an improvement... though you still needed to put your git repo somewhere public. Github and its web-based pull requests were absolutely genius. Click a button, fork the project, branch, hack, commit, push, and then create a formal "pull request". It was nothing like centralised project management systems before it. A complete breath of fresh air.
And it was actually part of git. Even back in 2005, git included a script, git request-pull, that generated these pull request emails. I'm pretty sure people called these emails "pull requests" before GitHub came along.
[1] https://www.youtube.com/watch?v=4XpnKHJAok8
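For reference, the invocation looks roughly like this (tag and URL are made up):

    # summarise what's on 'main' since v1.0, published at the given URL,
    # as an email-ready pull request
    git request-pull v1.0 https://git.example.org/myrepo.git main

The output is the familiar "please pull from ..." text you still see on the kernel mailing list today.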
2006 appears to be the year that SVN finally became somewhat mainstream, which is interesting because git was released in 2005. GitHub launched in 2008, and by 2009 everyone seemed to be abandoning SVN.
It feels like SVN was only really "mainstream" for about 3 years, maybe 5 years at most; there was some early-adopter lead-up and then a long tail of repos refusing to switch to git.
Maybe if Git had native support for PRs and issues this wouldn't have happened. (And yes I'm aware of git send-email etc.)
Edit: ripgrep was just a test
More: https://github.blog/engineering/the-technology-behind-github...
Not only are the results incomplete, but it seems that once they got into training LLMs on all the code they host, they made sure no one else could do the same easily, so now everything is madly rate-limited.
Every time I just clone and grep.
It's often useful. But sometimes you want to use other tools, like firing up your editor to explore.
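The clone-and-grep routine stays cheap even on a repo this size if you trim the download; a sketch (the search string is a placeholder):

    # history-less clone to keep the download small
    git clone --depth 1 https://github.com/mozilla-firefox/firefox.git
    cd firefox

    # search tracked files only, with line numbers
    git grep -n "SomeSymbolName"

git grep only looks at tracked files, which already avoids most of the noise a plain grep -r would wade through.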
Note we’re talking about the GitHub UI mostly. Pulling and merging a remote branch is a basic git operation, almost a primitive.
Didn't all this start with Linus getting into a spat with the BitKeeper dev, involving some sort of punitive measure in response to somebody making a reverse-engineered FOSS client? I don't remember the details and I'm sure I have at least half of them wrong, but that's easily one of the most disastrous decisions in the history of the software business, right up there with Valve turning down Minecraft and EA refusing to make sports games for the SEGA Dreamcast. (That last one isn't as well known, but it led to SEGA launching the 2K Sports brand, which outlasted the Dreamcast and eventually got sold to a different company but otherwise still exists today and is still kicking EA's ass on basketball games.)
But there were already quite a handful of other distributed version control systems around by the time git showed up.
So if Linus hadn't written git, perhaps we would be using darcs these days. And then we'd be debating whether people are using darcs the way it was intended. Or bazaar or monotone or mercurial etc.
I don't think what the original authors of any one tool intended matters very much, when there were multiple implementations of the idea around.
It's a joke that the BitKeeper dev has two revision control systems named after him: Mercurial and Git.
And while NBA 2K destroyed NBA Live, it took until 2009 for that to start happening (long after Sega's ownership), mainly down to sliding standards in EA's NBA Live titles and eventually some disastrous EA launches.
Everything is fully and completely explained, in terms which mean nothing.
(They ain't perfect, of course.)
"In astronomy, declination (abbreviated dec; symbol δ) is one of the two angles that locate a point on the celestial sphere in the equatorial coordinate system, the other being hour angle. The declination angle is measured north (positive) or south (negative) of the celestial equator, along the hour circle passing through the point in question."
Would anyone who doesn't know what declination is learn it from reading the introductory paragraph of this scientific Wikipedia article?
Anyone? No? :-)
I rest my case, m'lud.
On a celestial sphere (around a planet, star, etc.), the declination angle is 0 at the equator, 90 degrees at the north pole of the sphere, and -90 degrees at the south pole.
You also need another angle, known as the "hour angle", to locate a point on the sphere. The paragraph doesn't explain what that is, but as can be seen on Wikipedia, you can easily click on that term to go to the entire page that explains it.
What don't you understand?
Why should this be a metric one would want Wikipedia to meet? It's an encyclopedia, not an astronomy course.
Of course, the brilliance of Wikipedia is that if you think you can write a clearer intro, you can do so! You could even add it to the simple language version of the page - https://simple.wikipedia.org/wiki/Declination
But GH's PR process is broken anyway. I miss Gerrit.
If you push rewritten history to master, you're a git.
Conclusion: learn your tools.
The thing is, we could have done better (and have been) since before git even existed.
It's not my favourite process, but...
Also, git stores the files in a smarter way, so the size won't explode like it does with zip versioning.
Or previous versions. Plural. Yes.
Well, that's one half of git. The other half is tooling to work with the snapshots and their history, eg to perform merges.
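If you want to see that storage model for yourself, a couple of harmless read-only commands work in any git repo:

    # every snapshot is a commit pointing at a tree of content-addressed blobs
    git cat-file -p HEAD
    git cat-file -p 'HEAD^{tree}'

    # on disk, objects get delta-compressed into packfiles
    git count-objects -vH
    git gc    # repack; similar content largely collapses together

Unchanged files are shared between snapshots by reusing the same blob, and similar blobs are delta-compressed in the pack, which is why the repo doesn't balloon the way a folder of versioned zips does.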
And: even though the source of truth for many projects is centralized on GitHub, git still benefits from being distributed. It's the basis for "forks" on GitHub and for the way people develop: having the clone locally, committing locally, and preparing the change set for review. In the CVS/SVN days one had to commit to the central branch much sooner and more directly.
Then later on for the PR, you can sanitise the whole thing for review.
In the bad old days, you only got the latter. (Unless you manually set up an unrelated repository for the former yourself.)
In the Linux kernel, project management is done via email (which in the end is also just a centralized server), so what's the problem?
Of the tools I use, Composer and brew rely on GitHub to work.
https://github.com/orgs/community/discussions/10539
The general issue that git has is making them interact with each other, I would love for git to get distributed issues, and a nice client UI that is actually graphical and usable by non-terminal users.
There were some attempts to make this distributed and discoverable via similar seed architectures like a DHT. For example, radicle comes to mind.
But staying in sync with hundreds of remotes and hundreds of branches is generally not what git is good at. All UIs aren't made for this.
I'm pointing this out because I am still trying to build a UI for this [1] which turned out to be much more painful than expected initially.
[1] https://github.com/cookiengineer/git-evac
That's the default. But git would work just as well if, by default, it only cloned master, or even only the last few commits from master instead of the full history.
You can get that behaviour today, with some options. But we can imagine an alternate universe were the defaults were different.
Most of what you say, eg about not needing lockfiles and being able to make independent offline commits, still applies.
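Concretely, the options in question look something like this (stock git flags; the URL is a placeholder):

    # only the default branch, only the most recent commit
    git clone --single-branch --depth 1 https://git.example.org/big-repo.git

    # or: full commit history, but file contents fetched on demand
    git clone --filter=blob:none https://git.example.org/big-repo.git

Local commits, local branches and so on still work in these clones; you just pay a network round-trip when you touch history or blobs you haven't fetched yet.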
I store my code in a completely distributed fashion, often in several places on different local devices (laptop, build server, backup, etc) not to mention on remote systems. I use github and gitlab for backup and distribution purposes, as well as alternative ways people can share code with me (other than sending patch emails), and other people use git to get and collaborate on my work.
A distributed version control system doesn't mean distributed storage magically happens. You still need to store your code on storage you trust at some level. The "distributed" in DVCS means that collaboration and change management are distributed: all version control operations can be performed on your own copy of a tree with no other involvement. Person A can collaborate with person B, then person B can collaborate with person C without person A being in the loop, and so on.
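The mechanics of that multi-home setup are just plain remotes; a small sketch (remote names and URLs are placeholders):

    git remote add github  git@github.com:me/project.git
    git remote add gitlab  git@gitlab.com:me/project.git
    git remote add homelab ssh://backup.local/srv/git/project.git

    # push the same branch everywhere
    for r in github gitlab homelab; do git push "$r" main; done

No host is special; any of them can disappear and the full history still lives in every other copy.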
Gitorious was chosen for the meego/maemo team for example.
And I am one of the people saddened by the convergence on a single platform.
But you can't deny, it's always been pretty great.
I am contributing to a few open source projects on GitHub here and there though.
Git is by far the most widely used VCS. The majority of code hosting services use it.
It's not like the hairy C++ code base of Firefox will suddenly become less scary and attract more open source developers simply because it's moving to Github.
People who are very insistent on distributed solutions never seem to understand that the economic, social and organizational reasons for division of labor, hierarchy and centralization didn't suddenly go away.
Sure, there would be local copies everywhere, but for a distributed version control system, it's pretty centralized at GitHub.
Everything else... as the original comment said, is pretty centralized for a decentralized system.
It won't be free software and, likely, it will be Microsoft.
It's a pet peeve and personal frustration of mine. "Do one thing and do it well" is also often forgotten in this part of Open Source projects. You are building a free alternative to Slack? Spend every hour on building the free alternative to Slack, not on self-hosting your GitLab, operating your CI/CD worker clusters, or debugging your wiki servers.
github.com broke noscript/basic (X)HTML interop for most if not all core functions (which were working before). The issue system was broken not that long ago.
And one of the projects that should worry about, even enforce, such interop is moving to Microsoft's GitHub...
The internet world is a wild toxic beast.
mozilla-central has a LOT of tests -- each push burns a lot of compute hours.
I was thinking something different: I wonder whether Mozilla considered GitLab or Codeberg, which are the other two I know that are popular with open source projects that don't trust GitHub since it sold out to Microsoft.
(FWIW, Microsoft has been relatively gentle or subtle with GitHub, for whatever reason. Though presumably MS will backstab eventually. And you can debate whether that's already started, such as with pushing "AI" that launders open source software copyrights, and offering to indemnify users for violations. But I'd guess that a project would be pragmatically fine at least near term going with GitHub, though they're not setting a great example.)
"It depends", as always, but codeberg lacks features (that your use-case may not need, or may require), uptime/performance (that may be crucial or inconsequential to your use-case), familiarity (that may deter devs), integration (that may be time-consuming to build yourself or be unnessecary for your case) etc etc.
[1] https://forgejo.org/ [2] https://codeberg.org/
Bad PRs all around, with just a constant stream of drive by "why no merge?!?!?!" comments.
Even before this Mozilla almost certainly used hundreds of closed source tools, including things like Slack, Excel, Anaplan, Workday, etc.
They should restructure instead and hire people who actually want to work on software, not use the corporation and foundation around it as a platform for their... peculiar "endeavours". But I doubt that's going to happen - the flow of Google cash, and of money from all those naive people who think supporting Mozilla directly contributes to Firefox, is too good, it seems. But then it's understandable that they do this - the Google money tap can get twisted shut.
I think you might be on to something: with the incoming end of the Google cash flow, Firefox may be in discussions with Bing, and that could be part of the agreement - use Microsoft's servers.
Issues are stored in git-bug and automatically synced. GitHub is the only viable option, but you can keep the others as mirrors for when GitHub chooses to strike you.
Perhaps Microsoft offered to pick up the tab that Google has been paying but which is now imperiled, or at least to lend some sort of financial support, and Firefox cares more about paying their bills than about open source.