Why GitHub won

DannyBee
248 replies
1d1h

Actually, Google Code was never trying to win.

It was simply trying to prevent SF from becoming a shitty monoculture that hurt everyone, which it was when Google Code launched. Google was 100% consistent on this from the day it launched to the day it folded. It was not trying to make money, or whatever.

I was there, working on it, when it was 4 of us :)

So writing all these funny things about taste or whatnot is totally beside the point.

We folded it up because we achieved the goal we sought at the time, and didn't see a reason to continue.

People could get a good experience with the competition that now existed, and we would have just ended up cannibalizing the market.

So we chose to exit, and worked with Github/bitbucket/others to provide migration tools.

All of this would have been easy to find out simply by asking, but it appears nobody bothers to actually ask other people things anymore, and I guess that doesn't make as good a story as "we totally destroyed them because they had no taste, so they up and folded".

defen
101 replies
1d1h

Incidentally, this is why I don't use any Google products other than search. It's never clear to me which ones are real products that are trying to win and make money, and which ones are just fun side projects that a multibillion dollar corporation is doing out of the goodness of their hearts, but could get shut down at any time.

0x1ch
34 replies
1d

Every project by Google I was willing to jump platforms for, they killed. So outside of email, maps, and search, sometimes voice, they have nothing left that is worth investing time into. It will disappear within a couple years or less.

sureIy
18 replies
23h2m

Voice? Could easily fold and be sold to Sprint. I’m honestly surprised (and thankful) it’s been working for 15+ years. Completely out of Google character. I hope they forgot about it running on a server under someone’s desk in the basement just below a red stapler.

inanutshellus
3 replies
22h44m

Presumably because it's still providing value. It was originally about mass collecting voice data so they could train their speech-to-text capabilities with various dialects.

Dialects are always in flux and there are plenty of languages out there they haven't conquered, so... I'd guess they're just leaving it running to detect new speech or languages or dialects or... personal data gathering or...?

kortilla
1 replies
11h9m

Youtube has significantly better voice data

aniviacat
0 replies
7h9m

The audio quality of Youtube videos is not representative of the audio quality of someone speaking into their phone.

The speech patterns also differ heavily, with YouTubers (usually) talking in a presentational manner and speaking very clearly.

I think the way people talk in private phone calls, as well as the call audio quality, are much more similar to that of someone speaking to an assistant.

0x1ch
0 replies
19h59m

Man, they collected a lot of prank calls from my younger-self lol.

selimthegrim
2 replies
21h25m

Does Sprint even exist anymore?

bobthepanda
1 replies
19h38m

No, Sprint was folded into T-Mobile.

The only reason that merger was even allowed to happen was because it was extremely clear that Sprint would not have been a going concern into the next year if it had stayed independent.

kelnos
0 replies
19h33m

And as a result, T-Mobile has gotten worse, because now they don't have to try as hard to compete with AT&T and Verizon.

I feel like Sprint failing may have been better for the market; I expect more Sprint customers would have switched to AT&T or Verizon than to T-Mobile, and T-Mobile would still have to fight for customers.

Instead, T-Mo has jacked up prices just like everyone else, reduced nice perks, and their "uncarrier" movement is but a memory.

glenstein
2 replies
22h44m

Same here. I'm equal parts glad and puzzled that it hasn't been killed. I genuinely don't know what I would replace it with.

delichon
1 replies
21h17m

I'd used it since a few years before Google bought GrandCentral in '07. A couple of years ago I moved to voip.ms as part of my de-googling and am happy with it. There are oodles of such providers.

sureIy
0 replies
13h28m

Oodles of providers that give you a permanent US number for free?

buu700
2 replies
14h55m

I've been burned by Google too many times, but I'm definitely thankful for them continuing to maintain Voice, and even actively improving it over the last few years. I've been using it as my primary phone number since the GrandCentral days, so it'd be a pretty big pain to have the rug pulled out from under me at this point.

It would also be pretty shortsighted from a business perspective, IMO. Voice could still be extended to compete with iMessage and WhatsApp, like they sort of tried to do back when they were going to merge it with Hangouts.

sureIy
1 replies
13h29m

compete with iMessage and WhatsApp,

God please no more messaging apps from Google for the love of god.

buu700
0 replies
13h26m

lol, I'm certainly not advocating for that, but I'd rather they release 100 new messaging apps than kill Google Voice.

InvaderFizz
1 replies
19h2m

I wouldn't worry about voice. They offer it as a paid feature of GSuite to business customers. It's not going anywhere.

tracker1
0 replies
52m

I think there are some executives at Google/Alphabet who use GV so much that they couldn't kill it if they wanted to. I used to have a VoIP device that worked with my GV number, and Google extended the life of the API it used by several years; I've since stopped using that device, but the service was still working for it.

I do miss when GV and SMS were integrated with Hangouts though... it was imo a much better experience than what you get today with all the broken-out apps.

saurik
0 replies
21h24m

Hush now, don't remind them of it!! ;P

In all seriousness, though: if I were a user of Google Voice, I'd be seriously concerned that they would shut it down in a way that caused me to lose access to my phone number (even if only indirectly, as happened recently with their divestiture of Google Domains, or whatever their registrar was called)... you are much braver than I ;P.

paradox460
0 replies
18h30m

Sprint is T-Mobile now

no_wizard
8 replies
22h59m

Fiber is also still running, somehow. I don’t think they’re expanding much if at all but I’m shocked it hasn’t been sold off.

Frankly, I'm just as shocked they didn't go full throttle with it either, because talk about a data-selling gold mine with all that traffic.

While I'm on that subject, it could have been a real opportunity for them to push fiber + YouTube TV as a package. Google isn't good at making original content, but at some point they could have made a software + services play that makes such a package more palatable and user friendly; imagine subscribing to a YouTube channel and it becoming a channel in the TV app, for instance. A lot of people watch channels like this as it is.

cymor
4 replies
21h51m

They've been expanding again, and are now offering 8Gbps in my area. I've been very happy with it, and I'm still only paying $70/month.

manmal
3 replies
20h40m

That's absolutely amazing. Besides hosting at home, do you feel any difference vs say a 1Gbps line? Surely most servers won't saturate this when downloading or browsing?

kstrauser
2 replies
20h12m

Our ISP (Sonic) offers 10Gb uncapped for $50/mo.

I don't feel any difference over our previous 1G service, other than it never lags, even if multiple kids are streaming video and I have a backup running. The biggest difference is that it's half the price of the 1G service that ran over AT&T's fiber.

nickstinemates
1 replies
19h33m

I really wish Sonic would expand out to West Marin. My only options are Comcast and Starlink.

kstrauser
0 replies
19h15m

Ouch. I was ecstatic when they started offering service in my neighborhood in Alameda.

runevault
0 replies
21h35m

Google Fiber is actually working on expanding to where I live, I keep getting ads on Youtube for it.

kelnos
0 replies
19h32m

talk about a data selling gold mine with all that traffic

IIRC, Google doesn't sell user data. But it's plenty valuable to them all locked up.

otteromkram
5 replies
22h55m

Search?

DuckDuckGo is great for most purposes.

The best product Google has is Maps. That's about it.

Dwedit
3 replies
22h48m

DuckDuckGo is literally Bing but with location-based ads added to the bottom of your search results.

Retric
2 replies
20h11m

When did that change? I haven’t tested it in a while, but in the past DuckDuckGo indexed some stuff that wasn’t on Bing.

I've been using Google and Duck for different kinds of searches these days, because neither is universally better, but I might try Bing again.

Edit: NM, Bing's UI is still annoying. It slides stuff in from the side when you scroll down.

latexr
1 replies
18h12m

When did that change?

It didn’t. The person you’re replying to is either exaggerating or uninformed.

Maybe give Ecosia a try. The interface is decent and I’ve been pleasantly surprised by the results.

pb7
0 replies
2h13m

You're right, it didn't change. It was always just a thin wrapper around Bing.

Arisaka1
0 replies
9h49m

The best product Google has is Maps. That's about it.

Are we just going to pretend that Chrome and Android don't exist just like that, or do I have a different definition of "has"?

richardlblair
15 replies
1d

Unrelated to the thread - do you use any email providers with a custom domain? If so, would you suggest them? Who are they?

softfalcon
2 replies
1d

Fastmail has been solid for the last several years. I would recommend it.

queuebert
0 replies
23h26m

Fastmail is just about perfect. Feels like email 30 years ago but with a spam filter.

datadrivenangel
0 replies
23h4m

Also a happy fastmail paying customer. The aliases are really good.

kstrauser
1 replies
20h9m

Apple hosts custom domains at no extra charge if you have iCloud+. I've used it for a couple years now to host my family's domain and a couple of hobby ones for myself.

pasc1878
0 replies
9h20m

I would probably still be on Apple Mail and using its generated email aliases, except it went through a stage of not dealing with various back-bounce spam, so I switched to Fastmail and have had no issues.

Either is probably a good choice now.

umbra07
0 replies
21h38m

I really, really like Purelymail: great pricing, good customer service, reliability, and documentation.

Other popular options are Fastmail, Migadu, and MXroute.

pertique
0 replies
23h9m

I agree with a lot of the other options, but I'd be remiss if I didn't mention one that isn't always obvious.

With all the Big Corp asterisks, Microsoft Business Basic can be a pretty great deal at 6 USD/month. Solid reliability, aliases, (too many) config options, 1TB of OneDrive storage, cloud MS apps, etc.

n_plus_1_acc
0 replies
19h56m

uberspace because of SSH access

maeil
0 replies
15h17m

Zoho Mail. Cheapest, no nonsense, very straightforward UI, all the settings you'd ever wish for, support replies quickly even on the cheapest tier.

lbotos
0 replies
20h48m

I've used Fast Mail, liked it a lot, but I think at the time the pricing wasn't the best.

I then used some other platform that was quite "old school" that is recommended here. The mail admin was very opinionated, which led to some mail being blocked. That wasn't cool.

I'm now with Migadu on the cheapest plan and it's been fine. Had a few outages here and there, but otherwise solid.

I'd happily rec Fastmail or Migadu.

jacooper
0 replies
23h38m

Proton mail

homebrewer
0 replies
23h39m

If you need something cheap and are willing to deal with a tiny company, have a look at <https://purelymail.com>. I've been happy with them for two years, never had any problems with delivery, and they support infinite domains/aliases, and custom Sieve rules. But do not use it if you need 99.999999% SLAs or anything like that, because again -- it's a one-man show.

dasil003
0 replies
1d

Fastmail

dgellow
13 replies
23h47m

That's the big question mark I have for Flutter. It looks like a pretty nice platform from the outside, but I cannot see Google NOT killing it.

kyrra
6 replies
22h21m

Googler, opinions are my own.

You have to look at what teams at Google are using Flutter. Any dev tool Google officially releases tends to be funded by some product that actually likes/uses it.

Current list from https://flutter.dev/showcase: GPay, Earth, Ads (this is a big one), and others.

There are a lot of teams using it, which is why it's still getting so much support. If you see Google apps moving away from it, then it's time to start looking for an alternative.

It's also why AngularDart still exists and is getting updated. There are large projects that use it, so they will keep supporting it.

troupo
3 replies
15h25m

Ads (this is a big one),

How is Ads using Flutter? It likely doesn't, or uses it for some largely irrelevant things like mobile SDK integrations

pjmlp
1 replies
3h59m

Ads is the reason Dart is still around; they saved the team after the project folded. Having already migrated from GWT to AngularDart, they weren't into doing yet another rewrite.

troupo
0 replies
3h28m

Ads is the reason Dart is still around, they saved the team after the project folded,

Wasn't it Analytics which had Dart? But I do remember something about Ads saving it.

they weren't into doing yet another rewrite.

It all depends on politics within the company. For example: YouTube was hastily rewritten using alpha/beta versions of Polymer targeting Custom Elements v0. The moment they did that, v0 was deprecated.

So within the next 4 years they rewrote YouTube again with a Polymer version targeting Custom Elements v1. The moment they did that, Polymer was deprecated and replaced with Lit.

Even though they haven't rewritten YouTube again, they've now spent time integrating Wiz, an internal Google framework (that also got merged with Angular).

The costs of rewrites don't matter at Google as much as promotions.

saagarjha
0 replies
12h20m

I think Ads has a mobile app for people who run them?

scarface_74
0 replies
19h22m

And none of Google’s flagship cross platform apps use Flutter.

runevault
0 replies
21h33m

The fact that Pay and Ads both use it, along with YouTube Create even, is a pretty good sign, because if they have a non-trivial codebase of Flutter/Dart app(s), then killing it would impact all those teams doing important work. I've debated trying Flutter/Dart a few times, and this makes me feel more willing to try it.

saghm
5 replies
22h46m

I assumed Flutter is open source; if they do kill it off, is there a reason not to expect the community to fork and maintain it? Presumably they'd have to rebrand it without Google giving permission for the name, but that alone doesn't seem like enough to stop it from existing in some form.

clhodapp
3 replies
22h36m

The base ROI of Flutter to Google isn't all that clear because it's relatively complex to maintain. Worse, it requires maintaining Dart, which appears to be a dead-end language in terms of adoption.

If Flutter and Dart were donated to the community, they would most likely slowly die because no one outside of Google gets enough benefit from them to justify maintaining an entire programming language.

nox101
2 replies
22h21m

It's worse: Flutter actually works against Google's own interests. Webpages that render text into a canvas are not as easily indexable as webpages that emit HTML, and that's exactly what sites made with Flutter do. They aren't SEO friendly.

You could suggest using AI to OCR the canvas, but even that would be subpar, because most sites that use HTML provide multiple screens' worth of info, while sites that render to a canvas render only what's visible; the rest of the data is internal. You'd not only need the AI to successfully OCR the text, you'd need it to interact with the page to get it to render what's not currently displayed.

timschmidt
0 replies
18h50m

Accessibility interfaces are ideal for this situation, allowing an LLM to interact with a GUI strictly through text. Unfortunately, they're often an afterthought in implementations.

pjmlp
0 replies
3h57m

The same can be said of GWT.

kelnos
0 replies
19h25m

is there a reason to not to expect the community to fork and maintain it?

I think you should ask the opposite question instead, and this goes for any project that is corporate "owned"/sponsored like Flutter: why should we expect the community to maintain it if the company abandons it?

Are there any non-Googlers with commit access to the project? Do non-Googlers comprise a decent percentage of contributions? If not, the bar goes up quite a bit for a fork, or for a "peaceful handoff" of the existing project.

shiroiushi
11 replies
14h17m

Google Maps isn't going anywhere. I'd even argue that it's more important than search; it's far more difficult to switch to a competitor.

GMail isn't going anywhere either.

YouTube is a bit less clear: it doesn't really have any competitors, and is extremely popular and widespread, but it surely costs a fortune to keep running.

It'd be interesting to see the financials on these different business units to see which are the healthiest and which aren't.

askvictor
9 replies
12h35m

How is it difficult to switch to a maps competitor? It really depends on how much you use it, but most of my use involves looking up a location, then getting directions to it. There's no cost to switching, except perhaps setting up my commonly used addresses, which are basically in my contacts anyway. I'm sure that there are cases that are harder to switch, but I'd guess that they don't apply to the majority of people.

Gmail, on the other hand, like any email service, is much harder to switch away from, until we get mandated email address portability like was done with phone numbers some years back.

shiroiushi
4 replies
12h15m

How is it difficult to switch to a maps competitor?

but most of my use involved looking up a location

Right, and that's where most of Google Maps' value is: it's really a business directory combined with a navigation system. I'm not going to look up an address for some place, I want to just search for "<business name> near me" and find all the nearby locations and pick one and go there. Even better, Google Maps has reviews built-in so you can see if people hate a certain place.

but I'd guess that they don't apply to the majority of people.

If you think the majority of people keep business names and addresses in a contacts list, you're really out of touch.

Also, GMaps has features competing systems usually don't, depending on locality. Here in Tokyo, it's tied into the public transportation system so it'll tell you which train lines to ride, which platform to use, which car to ride in, which station exit to use, when a train is delayed, etc. It'll even tell you exactly how much each option costs, since different train lines have different costs. Then, inside many buildings, it'll show you level-by-level what business is on what floor.

guappa
1 replies
5h44m

So you are not aware that public transport companies in general have a public API for that data, and that openstreetmap exists?

pjmlp
0 replies
3h56m

"In general" is quite a stretch.

I am aware of plenty of public transport systems in European countries where you'd better print out those PDFs from their website, assuming they have them in the first place.

lolinder
0 replies
5h43m

If you think the majority of people keep business names and addresses in a contacts list, you're really out of touch.

This is not at all what they were saying, and I'm not sure where you got that. What they're saying is that Google doesn't have a monopoly on business location data, so searching for a business on a competitor (especially a large one like Apple, but mostly even OSM) does just work.

askvictor
0 replies
5h49m

Those are all great features, but they are not features which lock you into the platform. If Bing or Apple Maps had the same utility, I could switch to them at the drop of a hat.

erremerre
3 replies
11h27m

It is the only map application that allows you to check public transport (bus/metro/tram) routes with changes, other than (if one exists) a local app for the city.

As far as I know, there is no other map application that does that.

shiroiushi
0 replies
11h6m

Exactly. Here in Tokyo, there is a local app that's tied into public transit just like GMaps, but all it does is tell you how to get from Station X to Station Y. If that's all you want to know, it works quite well. But most people want to know more than that: where is Business A, and what's the fastest/cheapest way to get there from my current location? Which station exit should I use? And oh yeah, how are the reviews on it? And can I see the menu, and place a reservation too?

pasc1878
0 replies
9h23m

Apple Maps does this

lmm
0 replies
8h54m

YouTube is a bit less clear: it doesn't really have any competitors, and is extremely popular and widespread, but it surely costs a fortune to keep running.

There was a brief moment when it looked like Twitch would kill YouTube; then, much like Instagram responding to Snapchat, YouTube took the best parts of Twitch and added them to itself. I'd be amazed if YouTube wasn't profitable now, with the combination of more ads, more subscription/superchat options, and the pandemic-era boom in streaming.

iteratethis
7 replies
21h19m

As for real products, at Google's current size it has become near impossible to launch new ones worth their time.

Google currently has a quarterly revenue of $70-80B.

Imagine an internal team launches a new product to collect $100M in quarterly revenue. An earth-shattering success for any entrepreneur.

For Google... it doesn't even move the needle. It does nothing for the stock, it's not strategic, and it may become a liability later on.

You would need to launch a multi-billion new sub business for it to be of any interest to Google, which is impossibly hard.

nextos
5 replies
19h46m

This is why they should go full conglomerate and spin off companies all the time.

Otherwise, with those expectations, it's impossible to build something good and impactful.

nh2
2 replies
17h31m

What would be the benefit of doing it spun-off vs. in-house when it's still owned by the same company and making the same amount of money?

nextos
0 replies
17h20m

Less red tape, more freedom to operate.

If done inside a conglomerate, it can be the best of both worlds. Access to Google tech and funding, but free from rigid corporate hierarchies, at least at the beginning.

lmm
0 replies
12h41m

Spinning it off limits the downside/risk to the parent company. Basket of options worth more than an option on a basket and all that.

kelnos
1 replies
19h30m

I really thought this was what the plan would be when Alphabet was formed.

chadcatlett
0 replies
14h57m

Google X/X Development LLC is the Google incubator for possible spin-offs as far as I can tell.

johannes1234321
0 replies
19h39m

... which is the reason many large corporations acquire products: only once they are big enough are they relevant.

Issue for Google: they have to be careful that antitrust doesn't block the acquisition for some reason.

grues-dinner
4 replies
22h8m

fun side projects that a multibillion dollar corporation is doing out of the goodness of their hearts

Back then it felt like it was actually just possible that they really were that cool. Noughties Google was really something compared to the staid, implacable incumbents. 1GB (with a G!) of email storage that would grow forever! Google Earth! YouTube! Live long enough to see yourself become the villain indeed.

ThrowawayR2
2 replies
20h27m

Google's leadership made their intentions clear with the purchase of the much reviled DoubleClick in 2007. They didn't become villains; it was always all about the money, just like everyone else.

grues-dinner
1 replies
19h50m

I think at least at first it was genuinely not villainous. The highly technical founders, I think, did mean it with "Don't Be Evil". PageRank wasn't designed from the start to be a universal panopticon. After all, the global crash hadn't happened, rates weren't zero yet, no one had smartphones, social media wasn't a word, mobile meant J2ME and Symbian, and the now-familiar all-pervasive Silicon Valley financialisation MBA cancer was yet to come. That said, the realisation that "all the data" wasn't just about search index data and book scans did clearly hit for them (or, if that was the plan since 1998, surface) by the mid 00s.

DoubleClick was the year after Google Code. They had a good few years of being considered cool and, well, Not Evil. Google Earth was 2005; Gmail, Scholar and Books were 2004; Reader and Gtalk (RIP x 2) 2005; Patents in 2006, plus dropping $1.65 billion (which sounds trivial now that everyone and their mum is worth billions for whatever dog-walking app, but not then) on YouTube that year, even though it was only a year old. Mad times, you could barely sign up for a beta before another one landed. The search engine itself was of course revolutionary and peerless. And you could use it on your WAP phone until something called the "iPhone" happened.

For those of us who were young and, yes, naive, and who weren't paying attention during the first dot-com crash in those newspapers adults read while we played (8-year-olds in the 90s rarely used IRC to get the lowdown on Silicon Valley), it seemed like it was possibly a new age. "Going public" in 2004 wasn't yet another obvious "oh here we go again" moment, because they were among the first of their generation, with the previous generation mostly fizzling before pervasive internet access (Amazon made it, but it was a slower burn).

Chrome and Android were 2008, and I remember first hearing around then the phrase "Google Mothership". Though I never stopped using Firefox (and Opera; I don't remember when I switched to Firefox), Chrome was undeniably a technical coup at the time. Being cool and shiny and actually good at being a browser, while kicking that evil Microsoft in the teeth, helped too. It took time to burn through that goodwill. Even today, very many otherwise FOSSy people won't move from Chrome.

timschmidt
0 replies
18h48m

This made me nostalgic for the low bandwidth, no javascript, almost all text gmail interface. It felt so snappy.

rideontime
0 replies
14h4m

It was actually fun to go poke around and see what new sites they were cooking up. I'd forgotten.

bitpush
3 replies
1d1h

Business strategy is more than just launch product, make money. Companies operate in a dynamic space, and if you play any game (basketball, chess ..) you know that moving forward at all cost is not how you play the game. Sometimes you side-step, sometimes you take a step back, sometimes you sacrifice your piece.

If you expect your team to just go-go-go, you might gain something in the short term, but you'll fail miserably in the long term.

chrisandchris
1 replies
1d

That is totally fine, but Google is a case where they go into new business units and fairly often kill those units quite soon.

It's not like they're doing cereals and now they're doing other cereals, so you can fall back to their previous cereals. You always have to find a new supplier, or else just start buying bread.

adamc
0 replies
23h55m

Yeah, there are so many examples. It's one of the reasons I was unwilling to jump on board Google Stadia... I kept thinking "let's wait and see how it does". Particularly since you had to buy non-transferable game licenses.

And, of course, they shut it down. At this point, Google is going to have to be in a space for at least 5 years and clearly be highly successful before I would even think about using them.

scarface_74
0 replies
19h19m

And I worked at AWS Professional Services and sat in on sales calls all the time where one of our talking points was "a company couldn't trust Google not to abandon a service they were going to be dependent on".

Was that partially FUD? Probably. But there were plenty of services sales could point to.

sixo
2 replies
20h24m

a simple heuristic: does it make Google a boatload of money? if yes, it's safe

rubinelli
1 replies
20h20m

Anything besides ads, GCP, and Apps in that bucket?

progmetaldev
0 replies
18h46m

I won't disagree with you, but can you put ads in the product, and have them be front and center? That also counts as ads.

tracker1
0 replies
51m

I'm still dealing with the fallout of the domains selloff.

paradox460
0 replies
18h30m

I don't even use search anymore. Kagi schools it

magicalist
0 replies
1d1h

Eh, when it happened the world was ready. It felt like a different internet back then, and it was pretty great for a company to say that everyone was already on github anyways, so let's all go there (Microsoft followed almost the exact same timeline with Codeplex, and really who cares).

More Google style would be: shortly after shutting down Google Code, starting up a new code hosting project; migrating it several times to slightly incompatible frontends requiring developer changes; shutting down that project; and shortly afterwards starting up a new code hosting project...

jiveturkey
0 replies
22h33m

mentioning the sunset of google products, in general, in any thread about some specific google product, is a kind of godwin's law.

because of ordering by vote, we can't see how quickly it happened in this case. in godwin's law the likelihood increases by length of discussion, implying a slow/linear kind of probability, but for google product sunset i would argue that the likelihood ramps up very aggressively from the start.

i hereby dub this `jiveturkey's law`.

like godwin's law, the subthread becomes a distraction. unlike godwin's law, the sunset distraction is always a legitimate point. it's just that it has become tired.

schacon
50 replies
1d1h

I'm not sure what "SF" means in this context. San Francisco? I can't figure out what you want to say Google Code was for exactly. If Google launches a major project, I find it hard to believe that it's just for fun.

blktiger
26 replies
1d1h

It's short for SourceForge, which is technically still around but a shadow of its former self.

PaulHoule
18 replies
1d1h

It was an early example of the "enshittification" phenomenon. It was a particularly bad example of advertising and other spammy distractions, because sites for developers have the lowest CPM of anything except maybe anime fan sites.

It is super hard to break through a two-sided market, but it is possible when a competitor has given up entirely on competing, which might have happened in the SourceForge case because the money situation was so dire they couldn't afford to invest in it.

giantrobot
8 replies
1d

A small SourceForge retrospective for those not around at the time:

This post's overview of contributing to Open Source is largely correct. You'd get the source tarball for a project, make some changes, and then e-mail the author/maintainer with a patch. Despite the post's claim, Open Source existed long before 1998.

Rarely did Internet randos have any access to a project's VCS. A lot of "projects" (really just a program written by a single person) didn't even have meaningful version control; running CVS or RCS was a skill unto itself. There was also the issue that a lot of Open Source was written by students and hosted on school servers or an old Linux box in a dorm.

SourceForge came along riding the first Internet bubble. They let a lot of small FOSS projects go legit by giving them a project homepage without a .edu domain or a tilde in it. Projects also got a managed VCS (CVS at first, then Subversion later), contact e-mail addresses, forums, and other bits that made the lives of Linux distro and BSD ports maintainers much easier. They also had a number of mirror sites, which enabled a level of high availability most projects could never have had previously.

Then SourceForge's enshittification began as the bubble money ran out. The free tier of features was cut back, and then they started bundling AdWare into Windows installers. SourceForge would literally repackage a Windows installer to install the FOSS application plus some bullshit AdWare; IIRC a browser toolbar was a major one.

For FOSS projects bundled by package managers, which pull from the official upstream source, the AdWare wasn't much of a problem. But SourceForge was the distribution channel for a significant amount of Windows FOSS apps like VLC, Miranda IM, and a bunch of P2P apps, which were impacted by the AdWare bundling at various points.

A GitHub founder patting themselves on the back for the success of GitHub is sort of funny because GitHub followed a similar track to SourceForge but got bought by Microsoft instead of a company coasting on VC money. I can easily imagine a world where an independent GitHub enshittified had they not been bought by a money fountain.

jakub_g
2 replies
23h55m

GitHub followed a similar track to SourceForge

Can you provide an example?

giantrobot
1 replies
18h48m

GitHub offered a better experience than the existing offerings. They then scaled massively, to the point where it was expensive to run without a lot of good options for monetization. Thankfully for GitHub, they got bought by Microsoft and not, say, Verizon or GameStop (which owns the corpse of SourceForge for unfathomable reasons).

GitHub could have easily enshittified in an effort to make money had they not been bought by someone with deep pockets.

rascul
0 replies
18h7m

I thought GameStop sold SourceForge off over a decade ago and that it has changed hands a couple times since.

samatman
1 replies
17h55m

Open Source existed long before 1998

It did not. Free software did. The term "open source" was coined by Christine Peterson at a meeting in January of 1998, as Netscape was contemplating releasing their source code as free software. The Open Source Initiative was founded a month later, and forked the Debian Free Software Guidelines, written by one of the OSI founders, Bruce Perens. This was a deliberate marketing exercise, both to avoid the unfortunate "as in beer" connotations of free software, and to distance the concept from the FSF and Richard Stallman.

All of this is well documented.

account42
0 replies
4h50m

Does something only exist once it is named? Free software is also open source software, even if that name was only coined later on.

progmetaldev
1 replies
16h49m

GitHub was not on the same track as SourceForge, and I would hazard they were in a completely different world than the one SourceForge developed in. For instance, GitHub is far less likely to host an executable for any software, which is where you're going to get bundled installers with AdWare or malware. I know that GitHub allows installers to be uploaded, but if we're going to compare the time period before Microsoft purchased GitHub, I really don't think this is fair. I understand the history of not trusting Microsoft, and even as someone who is deeply involved in using GitHub and Microsoft software and features, I can understand a level of distrust. Everything you said about SourceForge is correct, so I don't mean to put down your entire comment here.

I believe GitHub's underlying use of the Git SCM, as well as the interface that allowed web users to look at "pull requests" as a concept was the real value in GitHub, far before hosting binaries or attracting the attention of Microsoft. The attraction to Microsoft was the ability to pull in the growing network of git users, the popularity of GitHub to a growing number of developers, and the ability to integrate the service into their developer offerings (this was the key that would have made the other two "positives" worthless to them).

I think with any tool or technology you should have an "out", in case another corporation/company takes over and doesn't align with your values. Being stuck on SourceForge, Google Code, GitHub, Bitbucket, etc. is a recipe for being put out to pasture because you couldn't adapt and realize that there is a huge world out there, and tools and tech come and go. Always have an alternative for whatever you do, because things change too quickly; plus, you get another point of view on solving problems (if that's your thing, and you aren't just developing for the money, which is fine if you can admit it to yourself). The fact that you are able to dive back in time with SourceForge tells me you are one of those people who have been into technology since before the dot-com bust, but probably got burned by Microsoft in some form. I'm not defending Microsoft for their past practices, only coming at this from what they have done with GitHub to this point. Hopefully I'm not wrong, but I do have a plan in place in case I am, and I think that's the most important thing in software.

giantrobot
0 replies
15h27m

I don't think GitHub's situation is completely analogous to SourceForge's. You're right that GitHub doesn't have a huge moat, by virtue of the way git works. I think Microsoft realizes that; no one necessarily loves GitHub so much that they'd not jump ship if GitHub became too user hostile.

To be clear, I'm not trying to be down on GitHub here. They made a good product and a very good alternative to SourceForge. I think they just got lucky getting bought by Microsoft when they did. By 2018 I think they'd gotten to the point where their costs would have required them to start chasing revenue.

PaulHoule
0 replies
23h24m

I agree w/ the analysis in the article and yours. “Good taste” for them must have been influenced by “let’s not be SourceForge”.

Part of the enshittification story is the tragedy of the non-profitable site that has a large user base. Here is a recent example

https://news.ycombinator.com/item?id=41463734

Potentially investments in a site could have huge leverage because of the existing user base.

I (and most of my team) lost our jobs at one of the most well-loved open-access publishing sites in the world because of a complex chain of events rooted in the site not having a sustainable funding source, despite the fact that it was fantastically cheap to run if you divided the budget by the number of daily users. Fortunately they figured it all out, and the site is still here; if you work in physics, math or CS, you probably used it today.

Still it was painful to see “new shiny” projects that had 4x the funding but 1% the user count at most, or to estimate we could save users $500M a year with a $500k budget.

Thus you can overthrow SourceForge but cannot overthrow something profitable and terrible such as Facebook, Match.com or the “blob” of review sites that dominate Google, see

https://news.ycombinator.com/item?id=41488857

eric-hu
6 replies
1d1h

Didn’t they also start packaging malware into binary downloads?

progmetaldev
3 replies
17h50m

Was it actual malware? I thought it was just automatically checked software like the Ask.com toolbar for Internet Explorer that came along with the Oracle Java JRE. Maybe I was just more careful than most, but I have been using FileZilla for many years and never had any of these issues, as long as I paid attention to the installer and what was included.

eric-hu
2 replies
17h23m

You got a chuckle out of me with such a specific reference that I know I got auto-installed too. I don't know, personally, but I do recall reading accounts of malware from them from around that era.

I suppose I would call what I saw "Grayware" [1], which is debatably not malware (but debatably is, too). It was enough of a smell for me to stop using their site, though. I'd actually just forego software I was seeking out from them instead.

[1] https://en.wikipedia.org/wiki/Malware#Grayware

progmetaldev
1 replies
16h0m

I don't deny that what they did was definitely not on the "up and up", for sure. Dark patterns, and I hadn't heard the term "grayware" before, but it definitely fits! I was only lucky because I actually watch installers and tend to mess with install locations, strictly when it comes to Windows software. I would prefer to be on a Linux distro for work, but it's honestly just easier for me to stick with Windows, because I have to help so many other people, and I know I would eventually lose my (current) knowledge of what can go wrong with Windows. Over time, I would be out of touch.

eric-hu
0 replies
8h50m

I only learned the term grayware from this thread. I wasn't sure precisely what did and didn't qualify as malware, so I checked Wikipedia and found it there.

As for the Windows fear, I understand that. That was me 10+ years ago, and I _am_ less knowledgable at helping people with Windows these days. I can still figure some things out by searching, and "I couldn't figure it out after 5-10 minutes" has turned out to be an acceptable answer too sometimes :)

actionfromafar
0 replies
1d

By then they were already desperate from the previous enshittification.

PaulHoule
0 replies
1d1h

Yep.

3eb7988a1663
1 replies
1d

  ...lowest CPM of anything except maybe anime fan sites.
I am not in ads, so could you expand on this? Why are anime sites low value vs other niches? I would naively expect anime to have huge numbers of under-20 fans who are more receptive to merchandise advertising.

PaulHoule
0 replies
1d

They pirate or already subscribe to Crunchyroll. Many sites are nowhere near brand safe (one reason Danbooru is as good as it is is that it's not even brand safe for porn ads). Some of them buy figurines, but they already know where to get them. Good Smile would rather invest in 4chan than buy ads on western anime sites. I have no idea what it is like in Japan.

I am skeptical of claims that "advertisers want to target young people because they have a lifetime of consumption ahead of them". Maybe that is true of Procter & Gamble (who expect to still be selling Dawn and Tide 50 years from now), but few advertisers are thinking ahead more than one quarter, if that, particularly in the internet age where an ad click can be tracked to a transaction.

People today will say all the ads on TV target oldsters because only oldsters watch TV, but circa 2000 there was no streaming, and watching TV made me think "I want to die before I get old" because there were so many ads for drugs, embarrassing health conditions, and personal injury lawyers, and relatively little for things you would spend your own money on, because not a lot of people in the audience had money to spend, particularly after paying the cable bill.

cruffle_duffle
4 replies
22h52m

"shadow of its former self"

Was there ever a point in time when it wasn't something that basically sucked? For some reason there are still some widely used ham radio packages hosted on SourceForge, and it annoys me greatly. When you click the big green "Download" button for the project you get... a .dll file. Why? Because the actual release artifact is some other zip file, and for some reason it doesn't deserve the "Big Green Download" button.

SF has always been this bad. Their core data model just doesn't jibe with how people actually interact with open source projects.

... and for that matter, didn't they stir up some controversy a long while ago for tampering with project artifacts and adding extra "stuff" to them? (spyware / nagware / **ware?)

toast0
0 replies
21h44m

Was there ever a point in time where it wasn't something that basically sucked?

Yeah, when it launched it was cool and hip. A free public CVS server to host your cool open source project was cool. It probably went downhill as the ad market fell apart post-dot-com, and the only way to get revenue was big green download buttons.

progmetaldev
0 replies
18h9m

Yes, they used to be great for open source projects. They did get wrapped up in controversy when another company took over and started including other software in the installers, if you weren't careful to uncheck the optional (and unrelated) software. There is still great software hosted there, like FileZilla if you use a Windows environment. FileZilla did have the optional software installs for about a year or so, but as long as you paid attention, it was easy to get around (you just had to pay attention, but that's not an excuse for what they did).

mnau
0 replies
17h28m

Yes, they were cool once upon a time. It was the place to be: you didn't have to host your own CVS server (no git back then; hell, even SVN was released a few years after SF). It was like GeoCities.

It looks almost impossible today, but launching a service was really hard and expensive back then. It cost a lot of money/effort just in software. All that stuff you can just download now and it actually works? No way man, it didn't exist yet.

That is why the LAMP stack was so great back then: it was free, working, reasonably low-maintenance, and super easy to set up.

account42
0 replies
5h4m

It's been a while since I bothered with SF but AFAIR the maintainer can select which file is the default download that you get through the big button.

philipkglass
1 replies
1d

There is still some code hosted on SourceForge that has no other public source. This is unsettling because I don't know how long SourceForge will continue operating and Wayback Machine captures of SF pages don't include tarballs. Download backups yourself whenever you find something like this.

I'm contributing to someone's software that started as an academic project. The current version is on GitHub with history back to 2014 but early releases back to 2008 (from before the author started using version control) are on SF in the form release_1.0.tgz, release_1.1.tgz, etc. I stumbled on these old versions this weekend while looking for related material. Once I decompressed them I found that they contained notes and old code that really helps to understand the current project's evolution and structure.

LegionMammal978
0 replies
21h25m

Yeah, what especially irks me with SourceForge is the common habit of projects regularly deleting all outdated releases (due to some per-project size limit? or just not to clutter up the list?). In old projects with messy releases, it can be very hard to piece together exactly which revisions went into a version "x.y.z" that everyone else depended on, except by actually looking into the released files. If those files don't get archived anywhere, they just get lost to the ether. (At least, short of a manhunt for anyone with the files in an ancient backup at the bottom of the sea.)

nine_k
12 replies
1d1h

The fact that the letters "SF" may need explanation in the context of code hosting says how thoroughly the job has been done. A number of good alternatives exist; there's no monoculture (even though some market domination is definitely in place, now by a mysterious GH).

digging
9 replies
1d1h

A number of good alternatives exist, there's no monoculture

That doesn't sound true to me at all, except maybe in some very small niches. I've used Bitbucket at exactly one job; I've found Codeberg, but no project I've used was actually hosted there; and literally everything else I see or use is on Github.

boredtofears
3 replies
23h17m

I've used Bitbucket at almost every job I've had. I suspect its usage is much higher at private companies than people realize; if you've already bought into Atlassian for Jira or Confluence, it makes Bitbucket an obvious selection.

ikety
2 replies
23h6m

Why is that? The Jira GitHub integration is nice and simple.

boredtofears
1 replies
22h42m

Why wouldn’t it be? Simpler to use first party integration and have centralized user management. Bitbucket works just fine.

consteval
0 replies
21h57m

Bitbucket is also just good - legitimately. I prefer the UI for a lot of stuff.

nine_k
2 replies
1d

GitLab is relatively more widely represented, but of the projects I encounter, about 2-3% are on GitLab. I encountered projects on Codeberg, too, and even on sr.ht.

A bunch of larger projects have a mirror on GitHub for easier access.

BTW there's launchpad.net, which is often overlooked, but it's vital for Ubuntu-specific projects.

At paid day jobs, I had to use BitBucket at least twice, and I miss better code review tools, like Phabricator.

GitHub definitely dominates the market, partly due to the network effects, but I don't think they have a lot of moat. If something goes badly enough wrong there, there will be plenty of viable alternatives with an easy to trivial migration path.

rapnie
0 replies
10h58m

GitHub definitely dominates the market, partly due to the network effects, but I don't think they have a lot of moat. If something goes badly enough wrong there, there will be plenty of viable alternatives with an easy to trivial migration path.

Their moat is a billion development tool vendors that have "integrate with Github" as a must-have and expected functionality.

account42
0 replies
4h46m

BTW there's launchpad.net which is often overlooked, bit it's vital for Ubuntu-specific projects.

It's overlooked because in true Canonical fashion they went hard in on their not-invented-here-syndrome VCS that nobody asked for or wanted. That and also the integration with Ubuntu and nothing else.

sangnoir
0 replies
1d

A decent number of larger open source projects self-host.

mnau
0 replies
17h18m

Everything has at least a mirror on GitHub, but quite a lot of projects are either on GitLab (e.g. KiCad) or self-host (freedesktop).

There is also a lot of stuff on Gitee (China), but due to the language barrier, it's hard to judge.

shadowgovt
1 replies
1d

It reminds me of how Stackoverflow won so successfully that to even know about the old "expert sex change" joke is to thoroughly date oneself in modern conversation.

Brian_K_White
0 replies
22h10m

Earlier today I said that "what your github stars say about you" site was slashdotted. No one reacted so maybe I'll write about it on my LJ.

tsm
2 replies
1d1h

SF is SourceForge, which at the time effectively had a monopoly (and also sucked)

tadfisher
1 replies
1d1h

It still sucks, but it sucked then too.

DannyBee
0 replies
1d

I miss Mitch Hedberg

schacon
1 replies
1d1h

I now realize that it's SourceForge. :)

breck
0 replies
1d1h

I thought he was talking about SpaceForce. Wait until we get to 2050: SpaceForce develops a really shitty monoculture. That's why I came back.

wafflemaker
0 replies
1d1h

This was in reply to a post talking about SourceForge. I had the same problem :)

noitpmeder
0 replies
1d1h

SourceForge

kemayo
0 replies
1d1h

Sourceforge.

ipsi
0 replies
1d1h

SourceForge, probably.

crop_rotation
0 replies
1d1h

I think it means SourceForge.

Brian_K_White
22 replies
21h50m

You sound like you're proud of this work and this plan and this sequence of events.

code.google going away, without the excuse that Google itself was going away, after I had started to rely on it and link to it in docs and scripts all over the place, is what taught me to never depend on Google for anything.

If Google had said "The purpose of this service is an academic goal of Google's, not to serve users' needs. This service will be shut off as soon as Google's academic purpose is met," I would not have used it.

But Google did not say that. Google presented the service as one whose purpose was to be useful to users. And only because of that, we used it.

Do you see the essential problem here? Effectively, Google harnessed users for its own purposes, without their consent, by means of deception. The free-ness of the service the users received doesn't even count as a fair trade, because the transaction was based on one party misinforming the other.

So thanks for all your work making the world a better place.

procrastitron
11 replies
20h20m

code.google going away, without the excuse that google itself was going away, after I had started to rely on it and link to it in docs and scripts all over the place

It didn't go away, though. It got archived and that archive is still up and running today. Those links you put all over the place should still be working.

If google had said "The purpose of this service is an academic goal of Googles, not to serve users needs. This service will be shut off as soon as Googles academic purpose is met." I would not have used it.

That's not an accurate representation of what DannyBee said. Moreover, what DannyBee did say is in line with what Google itself said was its goal when the service launched: https://support.google.com/code/answer/56511

"One of our goals is to encourage healthy, productive open source communities. Developers can always benefit from more choices in project hosting."

Effectively, Google harnessed users for it's own purposes without their consent by means of deception.

This does not appear to be a good faith argument.

None of what DannyBee said in their comment aligns with that interpretation. Neither does that interpretation line up with Google's publicly stated goals when they launched Google Code.

justinclift
4 replies
18h12m

It didn't go away, though. It got archived ...

"It got archived" means it went away for actual use. i.e. in not-just-read-only fashion

nl
3 replies
17h27m

But that's ok! It's easy to switch to a new code host; it's hard to change all the links on the internet when they rot.

Putting a service like code.google into read-only mode is pretty much the ideal outcome for a discontinued service.

Google should be praised for how they behaved here.

bronson
2 replies
14h49m

Coding is a solitary activity? Switching everyone to a new environment is hard.

Also, "Google sunset their project really well" is damning with faint praise.

shiroiushi
1 replies
14h34m

Switching everyone to a new environment is hard.

Sure, but this is the danger you get when you rely on an outside vendor's service for anything. If you don't want to deal with this danger, then you should never, ever use an external vendor's service for anything; you should only use your own self-hosted solutions.

Of course, Google does have a worse track record than some when it comes to their services being EOLed if they aren't search, Maps, etc., but still, this can happen with anything: it can be shut down, or bought out by a competitor, etc.

Also, "Google sunset their project really well" is damning with faint praise.

I don't think so in this case. I'd say Google has done a poor job of sunsetting other projects of theirs, but if this one actually keeps all the links alive albeit in read-only mode, that's really a lot better than most other EOLed or shut-down (like due to bankruptcy) services (from Google or anyone else), where it just disappears one day.

Dylan16807
0 replies
11h30m

Of course, Google does have a worse track record

Yes, that's the point of the above comments. The repeated lesson is that there's always a risk of shutdown, but don't trust google in particular to keep services running.

1vuio0pswjnm7
4 replies
9h13m

"It didn't go away, though. It got archived and that archive is still up and running today. Those links you put all over the place should still be working."

I think this is misrepresenting what the commenter stated. He appears to have stated the project hosting service "went away". This fits with the context of the OP which is comparing project hosting services, e.g., Google Code, Sourceforge, Github.

If the context was software archives, e.g., software mirrors, instead of project hosting, then we could indeed claim "Google Code" still exists. However, the context is project hosting. And no one can upload new projects or revisions to Google Code anymore. Google Code project hosting did in fact "go away":

https://codesite-archive.appspot.com/archive/about

The old https://code.google.com/p/projectname URLs need to be redirected to https://code.google.com/p/archive/projectname

Google then redirects these /archive URLs to storage.googleapis.com

Too much indirection

https://code.google.com/p/projectname becomes

https://storage.googleapis.com/download/storage/v1/b/google-...

Downloads

https://storage.googleapis.com/download/storage/v1/b/google-...

scott_w
3 replies
8h14m

He made that claim but then conflated the service going away with "those links being gone", which isn't true.

I'm going to defend Google on this. They don't need to maintain products forever but this case is a good way to shut down a service. Allow access to existing projects but make clear that active projects need to go somewhere else. The commenter can be upset that they can't use Google Code as a product but they shouldn't misrepresent the situation by saying the code is inaccessible. I checked a university project I created 15 years ago and it's still there. The commenter is objectively incorrect.

Too much indirection

I don't think this is a valid criticism. The web is designed explicitly to do this. You can still access the code, that's good enough.

rerdavies
2 replies
4h24m

No, it's not good enough. The service went away.

scott_w
1 replies
4h6m

You're free to expect for-profit businesses to fully support a free product forever. Just as they're free to decide they don't want to do that.

vagrantJin
0 replies
3h32m

That's funny, no business ever does anything for free.

People/businesses who do stuff for free, ask for donations.

dminik
0 replies
11h25m

There is a difference between

Actually, Google Code was never trying to win.

It was simply trying to prevent SF from becoming a shitty monoculture ... . Google was 100% consistent on this ...

And

One of our goals is to encourage healthy, productive open source communities. Developers can always benefit from more choices in project hosting.

These are not the same. One of these makes it out to be the singular goal. The other does not.

hathawsh
6 replies
18h48m

This is a repeatable pattern:

  1. A well-known major company sees that people are relying on something broken and it's hindering progress.
  2. The company decides to compete with that thing, but it's not part of their mission, so they make it free.
  3. Because the new thing is free, and run by a major company, lots of people come to depend on it, even though it's intentionally not the best.
  4. Another company builds more competition and works to actually be the best.
  5. The major company sees that their original objective of replacing the bad version is fulfilled, so they sunset the thing.
  6. People who came to depend on the thing feel betrayed.
This is why we should all be cautious about any non-core free service from major companies.

gary_0
5 replies
15h6m

Is GitHub a core service of Microsoft?

kagevf
1 replies
14h50m

And, how will the existence of source control management in both GitHub and Azure DevOps be reconciled?

tracker1
0 replies
55m

They seem to be sharing a lot of the resources as pragmatically as possible. GH Actions and DevOps workflows are really similar, and afaik run on the same clusters of hardware.

There's also some pretty decent integration features for DevOps projects with source control in GitHub for that matter iirc. Not to mention the potential upsell to/from GH Enterprise and/or Azure DevOps.

I can imagine at some point, GH Enterprise and Azure DevOps may become more similar and share more infrastructure over time.

throwup238
0 replies
14h32m

Microsoft has over six thousand repos on Github including flagships like Typescript and VSCode. For all intents and purposes, it is a core service.

Microsoft is a different beast because so much of their revenue is B2B. Continuity of operations is kind of their whole deal. Google Workspace is the equivalent Google product and they're much less likely to dump a service, unlike the rest of Google.

Xylakant
0 replies
10h43m

Github is free for a large class of users, but it's also an enterprise product that feeds off the funnel created by the free userbase. Almost every developer knows github by now, knows how to use it and integrating your own source control management with the place where all of your open source dependencies live is a significant lock-in effect. And while I don't know the numbers for Github, I'd expect that GH itself is profitable or at least could be profitable.

Lutger
0 replies
11h18m

Very close actually. Their strategy has always been to build essential developer tools. Developers, developers, developers.

So I think it is core to the way Microsoft expands and holds market share. And that market has changed: it no longer wants Windows-only tools, so they change with it. Microsoft culture has always been kind of pragmatic. They were a walled garden for developers when they could get away with it (and still are for some), but now they have opened up a bit out of need.

thunky
1 replies
5h38m

I hate to break it to you but GitHub is going to shut down too. Everything does.

BigParm
0 replies
1h29m

People are just now catching on that Google services always fold and leave you hanging. Your comment is insightful. You're ahead of the curve predicting that eventually non-Google services will screw you too.

What's the solution? Is the future self-hosted? Making it accessible for non-technical people?

the_gipsy
0 replies
3h17m

Google harnessed users for its own purposes without their consent by means of deception

Every profitable company on earth, ever.

shadowgovt
13 replies
1d

The only real tragedy here is that Google really did have best-of-industry semantic search integrated into their code searching tools, something that nobody has been able to replicate.

GitHub is great, but it's absolute ass for search. To the point where for any nontrivial question I have to pull down the repo and use command-line tooling on it.

jerjerjer
6 replies
23h58m

New GitHub full text search [1] is amazing. It is so good that for me it often replaces StackOverflow - I just use it to see how some API function is being used. Especially useful if you're searching for an example with a specific argument value.

[1] https://cs.github.com/
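
For example (illustrative queries only; the function and argument values are made up, but the qualifier and regex forms are GitHub's documented code search syntax):

  # exact phrases plus a language filter
  "requests.get" "timeout=10" language:python
  # the new code search also supports regex between slashes
  /subprocess\.run\([^)]*check=True/ language:python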

jiveturkey
3 replies
22h23m

have you used google's internal code search though? the link you posted is amazing in its performance, for sure. but once you are in some repo and doing what most of us call "code search", github drops off in utility vs google's internal tooling pretty quickly.

i'm only remarking on this because of the context in the parent you are replying to, whom i agree with. local tooling is better than what github provides. as a standalone comment i would simply upvote you.

rty32
1 replies
20h16m

Chances are that a random stranger on the Internet has not used Google's internal code search. Even if that person has, it would be useful to provide the context for others to understand.

saagarjha
0 replies
12h9m

I've used both. Google's code search is usually better if you know what you're looking for. It's less so if you need to do something that involves cross-language references (e.g. code file that references a translation string).

upcoming-sesame
1 replies
21h38m

Why is this not the default?

NoahKAndrews
0 replies
20h50m

I believe it is nowadays. For a while it was in beta.

dmoy
5 replies
1d

Do you mean the non-semantic indexing, which covered most of Google Code? Like grep-style search, but no real semantic data?

Or are you talking about the few repos that had semantic indexing via Kythe (chromium, android, etc)? We never got that working for generic random open repos, primarily because it requires so much integration with the build system. A series of three or four separate people on Kythe tried various experimentation for cheaply-enough hooking Kythe into arbitrary open repos, but we all failed.

shadowgovt
2 replies
21h14m

I'm talking about Kythe, and learning that it ran into issues generalizing it for non-Google-controlled APIs explains a lot of the history I thought I knew!

dmoy
1 replies
20h14m

Yea we never had it for even all Google controlled repos, just the ones that would work with us to get compilation units from their build system.

I was the last one to try (and fail) at getting arbitrary repos to extract and index in Kythe. We never found a good solution to get the set of particular insanity that is Kythe extraction working with random repos, each with their own separate insane build configs.

shadowgovt
0 replies
1h20m

It almost makes me wonder if the right approach (had Google been willing to invest in it) would have been to wed Kythe and Bazel to solve that "insane build configs" problem.

"Okay, you want generic search? Great. Here's the specific build toolchain that works with it. We'll get around to other build toolchains... Eventually maybe."

Would have been a great synergy opportunity to widen adoption of Bazel.

dmoy
0 replies
19h19m

Yea it's still there, that is backed by Kythe.

JohnBooty
13 replies
1d

    All of this would have been easy to find out simply by asking
I'm not a journalist, but in an ideal scenario, how would somebody have known that you were one of the key members of the project?

It's not like Google (or anybody else) makes this easy to know. And call me jaded, but something tells me official Google PR channels would not have been really helpful for this.

And also - are most engineers in your sort of position even free to remark on such projects w.r.t. NDAs, etc?

DannyBee
6 replies
1d

"I'm not a journalist, but in an ideal scenario, how would somebody have known that you were one of the key members of the project?"

It's not about asking me, it's about not asserting things you don't know.

Instead of saying "people did x for y reason" when you have literally no data on x or y, you could say "I don't know why x happened, i only know about z". Or if it's super important you try to put something there, beforehand, you could say "hey does anyone know why x happened? I'm working on a blog post and want to get it right".

Then, someone like me who saw it could happily email you or whatever and say "hey, here's the real story on x".

Or not, in which case you can leave it at "i don't know".

The right answer is not to just assert random things you make up in your head and force people to correct you. I'm aware of the old adage of basically "just put wrong stuff out there and someone will correct you", but i generally think that's super poor form when it's about *other people or things and their motivations*. I care less when it's about "why is the sky blue".

In this case, it also happens that there are plenty of on-record interviews and other things where what i said, was said back in the day.

So a little spelunking would have told them the answer anyway.

softfalcon
1 replies
1d

This explains so much about modern media, news, and story-telling. It's easier to make up a plausible narrative that supports your story than simply admitting you don't know.

You can see how as the article develops, they go from being "uncertain what made GitHub succeed" to definitively being sure about why it succeeded. It doesn't surprise me that details were glossed over as the story rose to the ultimate crescendo of "GitHub dominates".

This is how a good tale is spun and the people lap it up. What's a good tale without a bit of embellishment? (said every bard since antiquity)

guappa
0 replies
5h39m

You think you can ask google and get an honest reply?

shadowgovt
1 replies
1d

To be a bit more generous: I think from Scott Chacon's point of view, "They had no taste and we beat them in the market" is a fair way to hold the elephant. Lacking the Google-internal perspective, it's a reasonable conclusion from the signal he has. I don't get the sense from this post that he's trying to publish a doctoral thesis on the historical situation in the industry; he's providing some primary-source testimony from his point of view.

DannyBee
0 replies
1d

I guess i'm going to disagree with you.

He's not just providing primary source testimony from his point of view, he's trying to pretend he has primary source testimony on what others were doing as well.

If he left out the parts where he has no data (or said i don't know), it would have IMHO been a better post, and actually primary source testimony.

You also don't run into the Gell-Mann amnesia problem this way.

To each their own, of course.

wendyshu
0 replies
21h44m

Yeah but you said "nobody bothers to actually ask other people things anymore" and I don't think it's reasonable to expect someone to ask about this when the probability of getting an answer is so low.

HappMacDonald
0 replies
5h35m

I'd imagine that if you can say "I don't know why x happened" then you can also save some breath and say nothing at all, and there are billions of folks doing that right now.

Putting out a general inquiry of "why did x happen?" also has a lot of stigma attached to it in RTFM, LMGTFY internet culture. The result will not be a cloud of other people interested in the answer upvoting the question to make it visible to folks who might actually know. Snide comebacks notwithstanding, the question will languish forgotten in a dusty corner of whichever forum it was asked in.

But bold assertions based on plausible conjecture? That can earn upvotes, drive engagement, draw ad clicks, and sometimes even prompt corrections from actual experts.

Certainly not an ideal situation but this does appear to be where we're at.

SoftTalker
2 replies
1d

The way it used to work is that tech journalists (or sports journalists, or any other type) had contacts in the industry. If those people were not directly involved, they probably could at least suggest someone else who might know. Leads were followed up, and eventually the writer got the story.

I'm not sure how it works now; cynically I would suggest that the writer asks an LLM to write the story, gets a rehash of Wikipedia and other sources, and maybe makes some attempts at firsthand verification.

krisoft
1 replies
18h31m

That is neat, but Scott Chacon is not a journalist, does not act like a journalist and what you are reading is not tech journalism.

You are reading the personal diary of someone with a personal connection to a topic and complaining that it is not up to the standards of professional journalism.

SoftTalker
0 replies
15h24m

I'm complaining about nothing, here.

mattnewton
0 replies
23h55m

Journalists are supposed to investigate, not speculate because finding an email is too hard

jaredklewis
0 replies
23h40m

Well, finding, vetting, and getting comments from sources is like half of journalism. If you can't or won't do that, whatever you are doing is probably not journalism. It's just an editorial, think-piece, or whatever.

dlisboa
0 replies
1d

Maybe a cofounder of GitHub has the reach and network to ask for the e-mail of someone who worked on the Google Code team. A journalist might not, that's true.

Just flat out saying they had no taste in product development, however, is a bit of trash talking for no reason.

BSDobelix
13 replies
1d1h

So wait, you tried to prevent SF from becoming a "shitty monoculture"?

First: That sounds completely unlike Google

Second: Now you have GH as the "shitty monoculture" (owned by MS, which ignores your license for Co-pilot)

Third: >>We folded it up because we achieved the goal we sought at the time, and didn't see a reason to continue.

Yeah, ok, that sounds like Google: tries to enter another market just to hurt it, then folds ;)

DannyBee
8 replies
1d1h

This was 2006 Google, which did stuff semi-altruistically all the time.

At that point, SF was serving malware and stuff. It was really not a great time.

Github became a monoculture years later when others folded. Google code was shut down in 2016. Github wasn't quite a monoculture then.

I also said, back in 2014, that it might be necessary to do something like google code again in 5-10 years: https://news.ycombinator.com/item?id=8605689

10 years later, here we are i guess :)

Though i think what i said then still holds - Github is not anywhere near as bad or unreliable as SF was.

DannyBee
3 replies
1d1h

I mean, that's just when they did it fairly deliberately. Regardless, I think you would be hard pressed to argue SF was a great hosting environment when Google Code launched, which was the point.

BSDobelix
2 replies
1d1h

hard pressed to argue SF was a great hosting environment when Google Code launched

But SF had FTP, websites, SVN hosting and i think even a wiki, so you can hardly compare it with Google Code... and hey, at least they open-sourced their "forge":

https://allura.apache.org/

IDK, i don't have such bad memories of SF; even today people serve big files over SF because of GH limits.

ndiddy
1 replies
1d

SourceForge was originally open source, but they later closed it. GNU Savannah (https://savannah.gnu.org/) runs on a fork of the last open version of SourceForge.

BSDobelix
0 replies
23h23m

SourceForge was originally open source

True; after Slashdot got bought, they also served malware AFTER the takeover (2013), and now look at that year:

Allura graduated from incubation with the Apache Software Foundation in March >>2013

https://en.wikipedia.org/wiki/Apache_Allura

Google Code was in 2006, right?

naniwaduni
1 replies
1d1h

They had a bad name for the download pages being ad-infested even before they bundled the malware in the installers.

(And yes, fake download buttons on a site serving binary downloads went exactly where you'd expect.)

BSDobelix
0 replies
1d

fake download buttons

Yes, and today AdSense (Google) has taken the crown as the biggest deployer of scam ads.

And really, i don't think that's true before they were sold; ads, sure, but scam/malware stuff? I don't think so... at least i can't remember any.

jiveturkey
0 replies
22h16m

I do wish there were enough incentive to have a strong commercial gerrit offering. There are some very good ideas in gerrit, and it would have strong differentiation vs github-styled offerings.

Not just because I like gerrit, but because the github monoculture is wearing on me.

remexre
3 replies
1d1h

For a comparison on the scale of harm from the monoculture, recall that SourceForge was bundling malware with downloads, and still has a full page of ads when you download from it.

If I recall correctly, SVN was also more popular than Git at the time, so migrating hosts was a lot more painful than now...

bluGill
1 replies
1d

SVN's model is what everyone is using. Sure, you use git, but almost nobody is using the distributed parts - they all sync to a central server (github). SVN just couldn't get user management or merges right - those should be solvable problems but somehow were not. (I don't know enough about SVN to speculate on why they didn't get there.)

rswail
0 replies
11h10m

The main reason is that SVN's model of branching/tagging is based on directories in the working directory, whereas git's model (and to a certain extent, that of commercial VCSs like ClearCase and Perforce) is that branching/tagging applies to the entire repository and is not tied to the file tree structure.

This is a fundamental difference and the reason that git's model works much better when branching/merging.
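
A minimal sketch of the difference (paths and branch names are placeholders):

  # SVN: a "branch" is a cheap copy of a directory tree inside the repo
  svn copy ^/trunk ^/branches/feature-x -m "create feature-x"
  svn switch ^/branches/feature-x   # repoint the working copy at it
  # Git: a branch is a movable pointer to a commit, independent of the file tree
  git branch feature-x
  git switch feature-x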

BSDobelix
0 replies
1d1h

And then you force people to change from google code to something else just to prove a point, since people then were unable to set up an SVN server ;)

Even today you find dead links from google code repos.

shawabawa3
6 replies
1d1h

It's a bit of a cop out to say "we were never trying to win"

If you were never trying to win, that's a product failure

You should have been trying to win, you should have built a strong competitor to GitHub and you shouldn't have let it rot until it was shut down

The world would have been a better place if Google code tried to be as good as GitHub

SoftTalker
1 replies
1d

It was similar with Chrome. Internet Explorer was the monoculture browser and stagnating. Google had things they wanted to do on the web but needed better browsers. The original goal was to introduce competition in the browser space so that all browsers would get better. They may have changed goals along the way, but that was the original stated goal. In the end they killed IE and now they are the monoculture outside of Safari.

a57721
0 replies
11h39m

When Chrome was released, Internet Explorer was not the monoculture browser, and 1/3 of the users had Firefox installed.

DannyBee
1 replies
1d1h

It's a bit of a cop out to say "we were never trying to win"

It's literally not? We had a goal from the beginning - create enough competition to either force SF to become better (at that time it was infinite ads and malware), or to let someone else win.

You should have been trying to win, you should have built a strong competitor to GitHub and you shouldn't have let it rot until it was shut down

That's your goal, not mine (or at the time, Google). Feel free to do it!

You don't like what we had as a goal - that's okay. It doesn't mean we either failed, or had the wrong goal. We just had one you don't happen to like.

The world would have been a better place if Google code tried to be as good as GitHub

One of the things to ask before you either start something or keep doing something is "who actually wants you to win?" If the answer is "nobody", it might not make any sense to do.

It's not obvious in 2016 anyone would have wanted us to win. By then, Google had lived long enough to see itself become a villain. There was reasonable competition in the space.

I don't believe we would have really served people well to keep going.

93po
0 replies
17h54m

It sucks you're getting so much anti-Google sentiment when you're not at all attached to the reasons google sort of sucks.

ulbu
0 replies
1d1h

If you were never trying to win, that's a product failure.

what?

sanderjd
0 replies
1d1h

Different groups of people have different goals. Not every group of people has "winning" a market as their primary goal.

meiraleal
4 replies
1d1h

  > Actually, Google Code was never trying to win.

  > It was simply trying to prevent SF from becoming a shitty monoculture that hurt everyone
Being a Google insider might make one completely out of touch with reality. Google Video was trying to prevent Youtube from becoming a shitty monoculture that hurt everyone, too? That one clearly failed, then.

omoikane
1 replies
19h5m

Being a Google insider comes with the gift of being able to see the real rationales behind many products, and also the curse that nobody outside will believe you.

This is perhaps true of all big companies, but Google also seems to adopt a more passive PR strategy and doesn't try too hard to explain things, and it's just that much more difficult to understand Google when everyone else is louder.

saagarjha
0 replies
12h8m

I hope you never have to work for Apple ;)

rtpg
0 replies
17h12m

different projects have different objectives? The guy literally worked on the project! There were 4 people at its peak! Why would you think that would be the source of a major initiative?

Though there probably is some deeper critique about how you have a pretty amazing service running with "just" 4 people and aren't able to turn that into something useful beyond that objective. Innovator's Dilemma I guess.

BSDobelix
0 replies
1d

Google Video was created to prevent Youtube from becoming a shitty monoculture, too?

Like Google+ and all the other attempts:

https://killedbygoogle.com/

Google is actually the good guy to prevent monopolies, we just don't understand them ;)

jonathanyc
4 replies
1d

It was not trying to make money, or whatever

If Google Code succeeded, it’s hard to imagine that Google would not have tried to monetize it someday.

This also reminds me of Google’s (also initial) position on Chrome vis-a-vis Firefox: create a product “not trying to make money, or whatever” but just to limit the market share of a competitor.

The less flattering term for this in the context of anticompetitive behavior is “dumping”: https://en.wikipedia.org/wiki/Dumping_(pricing_policy)

bryanlarsen
1 replies
1d

It seems highly likely that a successful Google Code would be used as an onramp to Google Cloud. IOW, indirect monetization so it likely would still have a generous free component.

chubot
0 replies
18h59m

Yeah exactly, it is extremely easy to imagine this, because when Github was acquired by Microsoft in 2018, Diane Greene (who headed Google Cloud at the time) commented on this

It sounds like Google would have paid some amount of billions for Github, but not the amount that Microsoft paid

I personally don't think Google would have tried to monetize Google Code, even if it had 10x or even 50x the users that it eventually had. (And I say that having worked at Google, and on Google Code briefly!)

I think it made more sense as complementary to another product (which in some sense explains a big problem with developing products at Google)

---

https://www.cnbc.com/2018/07/24/google-cloud-ceo-diane-green...

CNBC previously reported that Google was looking at buying GitHub, but Greene wouldn’t confirm the report.

“I think the only thing I’ve said is that I wouldn’t have minded having them,” said Greene.

someguydave
0 replies
1h36m

Relatedly, it is hard to understand how operating Google Code in a manner “never trying to win”, “not trying to make money, or whatever” and “just to prevent … a monoculture” was in the best interests of Google the corporation and its shareholders

DannyBee
0 replies
1d

"If Google Code succeeded, it’s hard to imagine that Google would not have tried to monetize it someday."

Google code did succeed in that sense. It had hundreds of thousands of 30-day active projects, and some insane market share of developers.

I don't honestly remember if it was even shrinking when we decided to stop taking in new projects.

I doubt we would have monetized it directly (IE sell an enterprise version) - the entire market for development tools is fairly small.

In 2022 it was ~5 billion dollars, and future estimates keep getting revised downwards :).

CAGR has been about 10-14% in practice, sometimes less.

I don't remember if it's still true, but most of that 5 billion dollars was going to Atlassian (80% at one point).

Now, if you project backwards to 2006, and compare it to other markets google could be competing in, you can imagine even if you got 100% of this segment it would not have actually made you a ton directly.

Indirectly, eh, my guess is you still make more off the goodwill than most other things.

It's actually fairly rare to make any significant money at development tools directly.

Nowadays, the main source even seems to be trying to sell AI and productivity, rather than tools.

Jyaif
2 replies
22h16m

We folded it up because we achieved the goal we sought at the time, and didn't see a reason to continue.

In 2018, MS bought Github for 7B.

Google Code started to be shut down mid-2015. In 2015 it wasn't clear yet that it would be valuable for Google to host the world's code?

rty32
1 replies
20h10m

Well, it is hard/very distasteful to put ads on a source code hosting website, so this likely isn't aligned with Google's interest. No, I am not joking.

guappa
0 replies
5h30m

github shows me ads for their other tools, conferences and whatever all the time. And my company is a paying customer.

osmsucks
1 replies
23h43m

Actually, Google Code was never trying to win.

Herein lies the tragedy. Google could've offered, even sold, its internal development experience (code hosting, indexing and searching, code reviews, build farms, etc...) which is and was amazing, but it decided that it wasn't worth doing and let GitHub eat its lunch.

hanwenn
0 replies
22h5m

Developer infrastructure at google reported into cloud from 2013 to 2019, and we (i was there) tried to do exactly that: building products for gcp customers based on our experience with building internal developer tools. It was largely a disaster. The one product I was involved with (git hosting and code review) had to build an MVP product to attract entry-level GCP customers, but also keep our service running for large existing internal customers, who were servicing billion+ users and continuously growing their load. When Thomas Kurian took over GCP, he put all the dev products on ice and moved the internal tooling group out of cloud.

theanonymousone
0 replies
11h2m

This is a very bold statement, isn't it? So Google says the credit for making SourceForge (very deservedly) irrelevant is theirs and not GitHub's (or SourceForge's own)?

As a complete nobody, why can't I think that in every example of a product launch, if it wins it wins, and if it fails, I can claim it was never intended to win?

svnt
0 replies
23h42m

I had this theory that generations raised on the internet and exposed to it from birth would be the most humble generations ever, because we all look for ways to be uniquely valuable, and it became nearly impossible to be egotistical when faced with the entirety of even just a mature youtube platform.

Instead what we got was higher degrees of selective attention, and very elaborate and obscure flip-cup tricks.

skybrian
0 replies
1d

I don’t see a contradiction; it’s all part of the story.

Understanding your (Google’s) motivations explains why Google Code didn’t improve as much. It doesn’t contradict that Github had better UI, or their explanation of their motivation to build a better UI.

olalonde
0 replies
16h53m

To be fair, "lacking taste" and "not trying to win" are not mutually exclusive. You could argue they are respectively the proximate and ultimate cause of GitHub's win.

hintymad
0 replies
1d

It was simply trying to prevent SF from becoming a shitty monoculture that hurt everyone

Initially I thought SF means San Francisco, and I thought "Wow, what kind of monoculture can be prevented by Google Code", and then I realized that SF meant Source Forge.

calmbonsai
0 replies
17h27m

Props for serving as primary source material. One of the reasons I, basically, no longer trust "tech" journalism.

Aside from too many hit/bait/paid-for pieces, writers have simply gotten "lazy" as they're no longer incentivized to "get it right"--just "get it out".

Granted, this post is, essentially, meant to be a whitepaper for their product offering, but c'mon guys, you had the references to reach out, but were lazy for...reasons?!

AWS also has a history of some of these "built-to-marketize" products (e.g. CodeCommit), but at least there's a "solid core" of a stack to re-build these niche services from scratch.

What's Google's "reliable core" anymore, aside from Compute Engine and Search? Don't get me started on Cloud SQL.

authorfly
0 replies
1h47m

I was there, working on it, when it was 4 of us :)

raises eyebrow

When there were 4 of you?

So about $800k a year for that time period?

Just out of interest?

ThrowawayR2
0 replies
18h1m

"Actually, Google Code was never trying to win."

Wasn't Google reported among the bidders for GitHub?

https://www.cnbc.com/2018/06/05/github-interest-from-google-...

Maybe Google Code itself was never trying to win, but tendering a significant offer at the auction suggests Google was trying to win something.

nerdix
32 replies
1d1h

GitHub won because Git won. It was obvious by the late 00s that some DVCS was going to upend Subversion (and more niche VCSs like TFS). It ended up a two-horse race between Git and Mercurial. GitHub bet on Git. Bitbucket bet on Mercurial.

Git took the early lead and never looked back. And GitHub's competitors were too slow to embrace Git. So GitHub dominated developer mindshare.

It seems strange now but there was a period of time during the late 00s and early 10s when developers were pretty passionate about their choice of DVCS.

BeetleB
9 replies
22h53m

GitHub won because Git won.

Sorry, but Git won because Github won. Lots of people loved (and still use) Mercurial. It lacked the network effect because Github didn't support it.

GitHub bet on Git. Bitbucket bet on Mercurial.

Bitbucket didn't lose because of Mercurial. They lost because Github had a better product (in terms of sharing code, etc). It also didn't help that Bitbucket was neglected by Atlassian after its acquisition in 2010.

It seems strange now but there was a period of time during the late 00s and early 10s when developers were pretty passionate about their choice of DVCS.

Sorry buddy, but there are still plenty of us Mercurial users. Maybe, just maybe, even dozens!

(Seriously, I use Mercurial for all my projects).

bornfreddy
4 replies
22h20m

As someone who used both git and hg, I must say I'm sorry git won. Its chrome sucks (though less than it did) and the naming is confusing as hell. Still, if everyone uses git, and you have to use BitBucket for hosting instead of GitHub/Lab... Nah, not worth it. Kudos to you for sticking with it!

Izkata
2 replies
21h41m

As someone who tried out both git and hg around 2012 with only svn experience, I found hg confusing and git easy to understand.

Unfortunately it's been so long since then I don't remember exactly what it was that confused me. Something around how they handle branches.

nottorp
1 replies
11h29m

I've only recently started to use mercurial in earnest for one project (legacy reasons). It's branches for me too, though my experience with it is limited.

I don't like how every time you pull someone else's changes you end up, by default, in a state that is probably similar to git's detached HEAD. With git, most of the time you are on a named branch: you know where you are and you pull/push stuff out of said named branch. With mercurial some of the branches are unnamed and it's still confusing to me why I'd want that. Perhaps the original designers didn't like having private local-only named branches, I don't know.

This may just be an artefact of my very limited experience with hg though.
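
For reference, the flow that creates those unnamed heads (a sketch; this is plain hg usage, not specific to any project):

  hg pull                # may add a second, anonymous head on the same branch
  hg heads               # list the heads you now have
  hg merge               # fold the other head into your working copy
  hg commit -m "merge"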

BeetleB
0 replies
3h3m

When not sharing with others, bookmarks are the way to go - not branches. Mercurial bookmarks act more like git's branches. I think they've now made it so that you can share them too, but since no one else at work uses mercurial, I don't have experience with distributed bookmarks.

BeetleB
0 replies
22h8m

Still, if everyone uses git, and you have to use BitBucket for hosting instead of GitHub/Lab

Isn't this supporting my point? That a barrier to use mercurial was that people preferred Github over Bitbucket?

Kudos to you for sticking with it!

It's simply a lot easier to use than Git! Kudos to whoever suffers through the latter!

dewarrn1
3 replies
21h45m

Me, I picked Python and Mercurial for primary language and DVCS, respectively: one of those worked out really well. I still miss hg and have never really gotten the hang of git.

Regarding Mercurial, would you happen to have recommendations for a GitHub/Bitbucket-like service that still works with hg?

bsder
1 replies
21h32m

Regarding Mercurial, would you happen to have recommendations for a GitHub/Bitbucket-like service that still works with hg?

Use "jujutsu" (jj).

It's the goodness of Mercurial but works in the crappy world that Git has bestowed upon us.

I made the switch from Mercurial because it's just getting too hard to fight the git monoculture. :(

dewarrn1
0 replies
3h42m

This looks cool; I might wait for 1.0, but the idea of git underneath and something better on top is appealing.

BeetleB
0 replies
20h24m

If you just want an online repository, go with Sourcehut (https://sourcehut.org/)

beAbU
7 replies
7h32m

Are Git and GitHub really a DVCS though?

Have you ever checked out code directly from a colleague's machine? GitHub is very central-looking from where I'm standing, and the differences between Git and SVN are very academic and do not really apply in practice any more.

GitHub allowing forks of repos to request PRs between one another is probably the only DVCS thing about all this. But this model does not apply to orgs hosting their proprietary code on GH, where the developers don't have their own forks of their employer's code repos. I'm pretty sure it would have been possible to replicate pull requests with SVN on GitHub in some alternative reality.

infecto
5 replies
6h40m

Git is a DVCS though. Just because GitHub exists does not exclude Git from the category of a DVCS. You get a local copy of the entire history with Git which is what pushes it into that category, nothing to do with GitHub. SVN is centralized in the sense that you are not grabbing the entire copy of the repo locally. Not academic differences.

beAbU
4 replies
5h51m

It's been a hot minute since I've used SVN at work, but in my last job where it was SVN, each dev checked out the entire repository locally. Even though you /could/ check out a section of the repo, it made no sense to do that, because you need the entire codebase to run locally. Branching was still a mess though, and Git has really innovated in this space. We used to all dev on `develop` branch, and we'd daily pull from the server, fix merge conflicts locally, and then push up to the server. On releases our lead dev would merge dev with master and run off a build.

I still maintain the differences are academic: even though Git is a DVCS (and I agree it is) and it is possible to use it as one, GitHub is the de facto standard and everyone uses it for work and OSS. So I posit we are actually using Git as a CVCS, and any argument that Git is better than SVN because it's a DVCS is moot, because nobody is using Git's distributed features anyway.

infecto
1 replies
4h48m

I think we are missing something here, would like to be corrected if wrong.

Git is a DVCS because when you clone/pull a repo it includes the entire working history of the repo. That's why it's distributed: you could pull a repo from somewhere and never need to touch that source again. Has very little to do with Github.

SVN I have not used in recent history, but historically your local copy did not include the full change history of the repo and relied on an SVN server for that information.

I actually don't quite follow your arguments, because while yes, we tend to set up Git so that it is "centralized", the distinction is not about Github but that your local working copy is everything.
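
A quick way to see the difference (a sketch; the URLs are placeholders):

  # a git clone carries the full history; these need no further network access
  git clone https://example.com/project.git && cd project
  git log --oneline | wc -l      # every revision is already local
  # an svn working copy holds only the checked-out revision;
  # history queries go back to the server
  svn log https://example.com/svn/project/trunk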

Snild
0 replies
2h45m

I think it was a misunderstanding based on different views of what "the whole repo" means -- all the files or all the history.

It quite nicely demonstrated the difference in philosophies, albeit accidentally. :)

kbolino
0 replies
4h37m

So much has gotten better thanks to distributed VCS that I think this perspective is a bit like a fish in water.

Every commit is identified by a globally unique content-addressable hash instead of a locally unique or centrally managed revision number. This means two people on opposite sides of the globe can work on the same project with no risk that they will think the same revision identifies different code, nor that they must coordinate with a distant server constantly to ensure consistency.

Moreover, core VCS operations like committing and branching require no server interaction at all. Server interaction is a choice that happens when synchronization is desired, not a mandatory part of every VCS command. "Commit early and commit often" could never happen with CVS or SVN on a large or geographically distributed team. And, of course, you can continue working on a cloned Git repo even if the server goes down.
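
Both points are easy to demonstrate in any clone (a sketch; the branch name is arbitrary):

  # commits are content-addressed objects; the hash is derived from the content
  git rev-parse HEAD        # the current commit's SHA-1
  git cat-file -p HEAD      # the object that hash addresses: tree, parents, message
  # committing and branching are purely local; no server is contacted
  git switch -c scratch
  git commit --allow-empty -m "made offline"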

Finally, forking a repository is still common even in the age of GitHub dominance. In fact, GitHub natively understands Git's internal graph structure, and makes forking and pulling across forks pretty painless. Yes, those forks may all be hosted on GitHub, but there can be far more dynamic collaboration between forks than was ever possible on say SourceForge.

So sure, everybody working on the same code may have the same GitHub repository as their origin more often than not, but we are still miles ahead of the world of non-DVCS.

It's probably worth noting too that even the canonical example of Git in practice, Linux, is essentially "centralized" in the same way. Linus Torvalds's clone is "the" Linux kernel, any clone that differs from his is either not up-to-date or intentionally divergent and thus unofficial. A lot of work gets merged first in other people's clones (with some hierarchy) but Linux also has tens of thousands of contributors compared to the average Git repository's handful or less.

immibis
0 replies
1h14m

You only checked out the latest version of each file in the entire repository. You did not check out the entire repository, like you do in Git.

snovymgodym
0 replies
3h37m

Git is still a DVCS, even if today it's not being used in the way it was designed to be used by Linus and co.

The key distinguishing characteristic is the fact that every git checkout contains the full repo history and metadata. This means a consistent network connection to the master server isn't necessary. In fact, it means that the concept of a "master server" itself isn't necessary. With Git, you only need to connect to other servers when you pull down changes or when you want to push them back up to the remote repository. You can happily commit, branch, revert, check out older revisions, etc. on just your local checkout without needing to care about what's going on with the remote server. Even if you treat your remote repo on GitHub as your "master", it's still a far cry from the way that centralized VCS works.

If you've never worked with true centralized VCS, it's easy to take this for granted. Working offline with a system like Perforce or SVN is technically possible but considerably more involved, and most people avoid doing it because it puts you far off of the beaten path of how those systems are typically used. It basically involves you having to run a local server for a while, and then later painfully merging/reconciling your changes with the master. It's far more tedious than doing the equivalent work in Git.

Now, it's important to note that Git's notion of "every checkout contains all the repo data" doesn't work well if the repo contents become too large. It's for that reason that things like sparse checkouts, git-lfs, and VFS for Git exist. These sorts of extensions do turn Git into something of a hybrid VCS system, in between a true centralized and a true decentralized system.

If you want to understand more, here's a great tech talk by Linus himself from 2007. It's of note because in 2007 DVCS was very new on the scene, and basically everyone at the time was using centralized VCS like SVN, CVS, Perforce, ClearCase, etc.

https://www.youtube.com/watch?v=MjIPv8a0hU8

nine_k
5 replies
1d1h

Not just that. They invented "pull requests" and offered (initially minimal) code review tools. This made contributing in the open much easier, and making small contributions vastly easier.

Something like git had to take over svn / cvs / rcs. It could have been Perforce, it could have been BitKeeper, which apparently pioneered the approach. But it had to be open-source, or at least free. Git won not just because it was technically superior; it also won because it was at the same time free software.

fweimer
2 replies
1d

Pull requests predate Git. The kernel developers used them in the Bitkeeper days:

    I exported this a patch and then imported onto a clone of Marcelo's
    tree, so it appears as a single cset where the changes that got un-done
    never happened.  I've done some sanity tests on it, and will test it
    some more tomorrow.  Take a look at it and let me know if I missed
    anything.  When Andy is happy with it I'll leave it to him to re-issue a
    pull request from Marcelo.
https://lore.kernel.org/linux-acpi/BF1FE1855350A0479097B3A0D...

I do not know to what extent Bitkeeper had browser-based workflows. Moving cross-repository merges away from the command line may actually have been innovative, but of course of little interest to kernel developers.

schacon
0 replies
1d

That's interesting. I know BK had "pulls", but iirc it didn't have a "request-pull" command, so clearly the "pull" terminology came from BK and the "request" part came from how people talked about it in email.

I actually just shot a video showing how BitKeeper was used. I'll post that and a blog post on our GitButler blog soon.
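
For reference, git's own version of the workflow (a sketch; the tag, URL and branch are placeholders):

  # summarize the changes between v1.0 and main, published at the given URL,
  # formatted to paste into an email to the maintainer
  git request-pull v1.0 https://example.com/my-tree.git main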

bluGill
0 replies
1d

Mercurial also supported pull requests. The unique thing about github was an easy central place to do them from, and ensuring they didn't get lost. Once you have a github account you can fork a project, make a change and pull-request it in a few minutes. Emailing a patch isn't hard, but with github you don't have to look up what address to email it to; if you just open a pull request it typically goes to the right place the first time.

irunmyownemail
1 replies
20h18m

I remember we used a tool, I think it was Gerrit, before I'd heard of GitHub or Pull Requests. It worked with patches, which is also how we used to share code: through email. GitHub won because it had a cleaner UI and a likable name.

dijit
0 replies
4h10m

I found Gerrit recently.

I love it so much, I hate how the other code review systems kinda suck in comparison but people prefer them.

I guess it's proof that features and shiny are more important than a good idea.

dboreham
4 replies
1d1h

Yes, article seems to miss this. I believe (at the time, and still) that git won because the cost to host the server side of it is orders of magnitude lower than the competitors (svn, perforce, etc). All those other revision control systems ended up with a big server cost that couldn't justify a free hosting service. Plus git provided a reasonable (but still not great) solution to "decentralized development", which none of the others attempted to do.

bluGill
1 replies
1d

Why didn't mercurial win then? There were almost a dozen other distributed version control systems built in those early days, most of which I cannot remember, but all had the same distributed ideas behind them and should have been as easy to host (some easier).

troutwine
0 replies
21h2m

At my university, performance. The CS department was clued into Linux development but also the Haskell world, so darcs use among students was high. Our underpowered lab machines and personal devices struggled with darcs for reasons I no longer remember, and a group of us made use of mercurial for an OS project and had a rough go of it as the patch sets got more and more convoluted. Back in those days the core was C, but a lot of the logic was Python, which struggled on the memory-constrained devices available. One of us learned about git trying to get into Linux kernel work, told the rest of us, and my memory is that it was just comically fast. I spent a tedious weekend converting all my projects to git and never looked back, myself.

Some years later Facebook did a lot of work to improve the speed of mercurial but the ship had sailed. Interesting idea though.

schacon
0 replies
1d

I'm curious how you come to this conclusion. GitHub has always had fairly insane hosting problem sets. When someone clones the Linux repo, that's like 5G in one go. The full clone issues and the problems of a few edge case repos create sometimes crazy hosting costs and scaling problems. Most centralized systems only have to deal with one working tree or one delta at a time. There is not much that goes over the wire in centralized systems in general, comparatively.

aseipp
0 replies
22h22m

Multiple other distributed version control systems in the 2000s had support for easy hosting. Darcs was actually the best in this era, IMO, because it was far simpler than both Hg and Git -- a Darcs repository was just a directory, and it supported HTTP as the primary pull/patch sharing mechanic. So, you could just put any repository in any public directory on a web server and pull over HTTP. Done. This was working back in like 2006 as the primary method of use.
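
A sketch of what that looked like ("hosting" was just a directory on a plain web server; the URL is a placeholder):

  # the _darcs metadata directory in the served folder is all clients need
  darcs get http://example.com/myrepo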

In any case, the premise is still wrong because, as mentioned elsewhere, the distribution of repository sizes and their compute requirements is not smooth or homogeneous. The cost of hosting one popular mirror of the Linux kernel, or a project like Rails, for 1 year is equivalent to hosting 10,000 small projects for 100 years, in either SVN or Git. The whole comparison is flawed unless this dynamic is taken into account. GitHub in 2024 still has to carve out special restrictions and exemptions for certain repositories because of this (the Chromium mirror for example gets extended size limits other repos can't have.)

Git also lacked a lot of techniques to improve clones or repo sizes of big repos until fairly late in its life (shallow + partial clones) because 99% of the time their answer was "make more repositories", and the data model still just falls over fast once you start throwing nearly any raw binary data in a repository at any reasonable clip (not GiB, low hundreds of MiB, and it doesn't become totally unusable but degrades pretty badly). This is why "Git is really fast" is a bit of a loaded statement. It's very fast, at some specific things. It's rather slow and inefficient at several others.

globular-toast
1 replies
12h4m

Git also massively benefitted from GitHub. Do you know a single person who even knows you can use git without a "forge" like GitHub, let alone knows how to or actually does it?

It's hard to remember but there was a time when git was resisted. When I first started to use it, a lot of people were saying you don't need it, you only want to use it because it's hipster and the kernel uses it, but you're not the kernel etc. It's exactly the same as k8s is all these years later (the tide seems finally turning on k8s, though).

Without GitHub (or something else), git would have remained a weird kernel thing. But, equally, without git GitHub would have had no raison d'être. It's a symbiotic relationship. GitHub completed the picture and together they won.

ajford
0 replies
1h10m

I taught my research group Git version control in college. It was part of a "new student/researcher onboarding" series that we put all the new grad students and undergrads through. But we were in Radio Astronomy, so there was a lot of data processing and modeling stuff that required being comfortable within a remote ssh session and the basics of Linux/bash/python. I know it was already being used in Radio Astronomy (at least in the sub-field of Pulsar Astronomy) at the time and was part of the reason I didn't get pushback when I proposed making sure our group was trained up on using it.

We switched to Git as a whole in early 2009 since it was already a better experience than SVN at the time. Could be off by a year or two, given how long ago this was and the fact that I was working with the group through the end of high school in 2007-2008.

We only added GitHub to our training later in 2011-2013 era, but we ran our own bare git repos on our department servers until then. And students/groups were responsible for setting up their own repos for their research projects (with assistance/guidance to ensure security on the server).

Last job also made use of our own internal bare repos, admittedly mirrors of our private GH projects, and our stack pulled from that mirror to ensure we always had an instance that was not dependent on an external vendor.

Current role also makes use of bare git repos for similar reasons.

I think the knowledge is there and plenty people do it, it's just not news/blog worthy anymore. It's not new or groundbreaking so it gets little attention.

bsder
0 replies
21h21m

Git took the early lead and never looked back. And GitHub's competitors were too slow to embrace Git. So GitHub dominated developer mindshare.

And Mercurial spent an enormous amount of effort going after Windows users and basically got absolutely nothing for it.

In my opinion, this was what really hurt Mercurial. Nobody in Windows-land was going to use anything other than the official Microsoft garbage. Consequently, every ounce of effort spent on Windows was effort completely wasted that could have been spent competing with Git/Github.

teqsun
19 replies
1d1h

As a "younger" programmer it always shocks me how things like git were only created in 2005. It feels so ubiquitous and the way it functions has the "feeling" of something created in the 80s or 90s to me.

eterm
10 replies
1d1h

Subversion (svn) was absolutely fine before git. Before that, there was CVS but that really was painful.

Svn gets a lot of hate for things it doesn't deserve, even this article talks about "checking out" and the difficulty of branching, but that doesn't track with subversion.

Branching in subversion was just as easy as in git; it had cheap, shallow branches. You could branch largely without overhead, although unlike git it was a server-side operation. (Imagine it like git branch with auto-push to remote.)
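
Concretely, a branch was a single URL-to-URL copy committed straight on the server, no working copy required (a sketch; the URLs are placeholders):

  svn copy https://example.com/svn/repo/trunk \
           https://example.com/svn/repo/branches/feature-x \
           -m "branch for feature-x work"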

Most software also automatically checked out files as you modified them, and it was a local operation; there wasn't any locking or contention on that. It was the older CVS/SourceSafe-style systems that did that.

I still maintain that most workplaces with less than, say, 10 devs, would be better off with subversion rather than git, if not for the fact that most of the world now works on git.

Subversion solves problems with less mental overhead than git, but it's not worth doing anything non-standard, because everyone now knows git and has learned to put up with the worse developer user experience, to the point where people will argue that git doesn't have bad UX, because they've internalised the pain.

Before subversion there was CVS and Visual Source Safe. These are much older. These solved a problem of source control, but were based on the concept of locking and modifying files.

You'd "checkout" a file, which would lock the file for modification of all other users. It was a bit like using a global locking file repository but with a change history.

It was as painful as you might imagine. You'd need to know how to fix the issue where someone would go on holiday having checked out a critical file: https://support.microsoft.com/en-us/topic/5d5fa596-eb9c-d2b5...

Or more routinely, you'd get someone angrily asking who had such-and-such file checked out.

HerrMonnezza
1 replies
22h19m

One thing that Git and the other DVCS's massively improved over Subversion is that commits are local, and you only need to talk to a remote endpoint when you push/pull. In Subversion, every commit would require uploading changes to the repository server, which encouraged larger commits (to amortize the overhead).

thraxil
0 replies
10h28m

Yeah, this was huge at the time. Laptops were good but Wifi wasn't as ubiquitous. If you wanted to work on some code while you were traveling or at a cafe or something, you'd at best be working without version control, making ".backup" files and directories and stuff. With DVCSes you could branch, commit, etc. as much as you wanted, then sync up when you got back somewhere with good internet.

thruway516
0 replies
15h35m

This was not my recollection. The big thing about git over subversion at the time (at least before everyone started putting their repos up on github with pull requests and all) is that it was truly distributed, i.e. everyone maintained their copy of a repo and no repo was the "master" or privileged source of truth. And merging changes was/is relatively seamless, with fine-grained control over what you want to merge into your copy, that svn simply didn't provide.

Svn, on the other hand, has a client/server architecture. Although you could have multiple servers, it was kind of pointless as keeping them in sync was more trouble than it was worth. For most workflows there was the server or master repo, and your local copy is not under source control. And if that master/server should go offline for any reason, you were not able to "check in" code. I remember this being such a pain point, because if the master was offline for a significant amount of time you essentially had no way to track changes you made and it would all just be one big commit at the end (which was also a major pain for the administrator/repo maintainer, who would have to merge in a whole bunch of big breaking changes all at once).

Maybe git vs mercurial was a close fight with no immediately obvious winner, but subversion's days were pretty much numbered once git showed up.

rbetts
0 replies
6h33m

Git clients became popular interfaces to SVN. This is how the organizations I was at moved from svn to git -- we/devs started preferring git locally and eventually migrated the backends to match.

ndiddy
0 replies
21h25m

SVN is fine as long as you don't have multiple people editing the same file at the same time. In that case, generally one person gets his work overwritten. Committing on SVN is basically the equivalent of "git push --force".

marcosdumay
0 replies
1d1h

CVS was absolutely not oriented around locking files.

It was about merge and conflict resolution like SVN or Git.

VSS was oriented around locking. And also broke all the time. Oh, and also lost data... And oh, it was also the expensive one used by everybody that kept saying "you get what you pay for".

keybored
0 replies
21h55m

I never had the time (thankfully) to get good at Subversion. But now that I’ve “internalised [the pain]” of a DVCS I could never go back to a centralized VCS. Interacting with a server just to check the log? Either promiscuously share everything I do with the server or layer some kind of second-order VCS on top of Subversion just to get the privilege of local and private-to-me development? Perish the thought.

fanf2
0 replies
1d

Subversion didn’t get working merge support until years after git. Like CVS it basically required trunk-based development. Feature branches were not supported. You needed a separate checkout for any work in progress. You could not checkpoint your work with a commit before updating to the latest head. Every update is a forced rebase. It sucked.

cruffle_duffle
0 replies
22h31m

Subversion was great up until your working directory somehow got corrupted. Then you'd be in some kind of personal hell cleaning it up.

And honestly, it was always a pain in the ass setting up "the server". Unlike with git, you needed a server/service running 24/7 to check your code into. Which was always a pain in the ass at home... you needed to keep some stupid Subversion service running somewhere. And you'd have to go back into that service and remember how to create new projects every time you got a wild hair up your ass and wanted to create a new thing.

Git you just do "git init" and boom, you have a whole complete version control system all to yourself with no external dependency whatsoever.
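For contrast, a hedged sketch of both setups (paths and names hypothetical):

    # git: a complete, self-contained repository in one command
    git init myproject

    # svn: create a repository, then keep a service running for it
    svnadmin create /srv/svn/myproject
    svnserve -d -r /srv/svn                  # the daemon that must stay up
    svn checkout svn://localhost/myproject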

That being said, TortoiseSVN was the best GUI for version control ever.

aplusbi
0 replies
23h26m

Branching in Subversion was fine, but merging was quite painful (at least at the time I was using it, around 2008ish). From my recollection, SVN didn't try to figure out the base commit for a merge - you had to do that manually. I remember having a document keeping track of when I branched so that I could merge in commits later.
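For anyone who missed that era, a sketch of what it looked like (URL and revision numbers hypothetical); before SVN grew merge tracking, you supplied the range yourself:

    # svn: merge a hand-tracked revision range from the branch
    svn merge -r 1200:1450 http://svn.example.com/repo/branches/feature .

    # git: the merge base is computed automatically
    git merge feature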

And even if I was using it wrong or SVN improved merging later the fact was that common practice at the time was to just commit everything to the main branch, which is a worse (IMO) workflow than the feature-branch workflow common in git.

But you're right, SVN was largely fine and it was better than what preceded it and better than many of its peers.

Edit: Forgot to mention - one of the biggest benefits to git, at least early on, was the ability to use it locally with no server. Prior to git all my personal projects did not use version control because setting up VC was painful. Once git came around it was trivial to use version control for everything.

vehemenz
6 replies
1d1h

As an "older" programmer, I feel the opposite. Git became mainstream very recently, though admittedly it's been a good ten years or more. I sometimes think younger programmers' attitudes toward git are borderline cultish—git or GitHub is not required to do programming—it's just another tool. I half expected something would have replaced it by now.

gmueckl
2 replies
23h44m

I have to agree on the cult aspect. This is unfortunate because better tools exist already today, but lots of people refuse to even entertain that possibility.

teqsun
1 replies
23h2m

I feel about git how I presume vim users feel. Maybe there are better ways, but I've become so accustomed to how it works that I doubt I could easily switch to anything else.

gmueckl
0 replies
22h9m

All the more reason to look beyond your comfort zone now and then!

schacon
0 replies
1d1h

Git's been around for almost 20 years now. I would say fairly dominant for 15 or so.

keybored
0 replies
1d

Git is overrated for a DVCS. But it’s not overrated considering the old-school competition like SVN.

The assumptions of SVN makes it feel like a dinosaur now.

golergka
0 replies
21h39m

Already in 2010-2012 the majority of projects I encountered were using git. The last time I saw an SVN-based project was in 2015, before I migrated it to git.

Cloudef
0 replies
17h26m

Git is great when you use it from the CLI. There is no single good git GUI app. Ideally a GUI git app would give you an interface for rebase and visualize the history, branches and trees. You'd drag commits around and then it would construct a rebase operation for you. In case of conflicts, you can either abort/roll back the operation or open the conflicting files in an editor and fix the conflicts. That's it.

GitHub's desktop app in particular is a mess, because it does not try to be a UI; it's simply a frontend for the CLI, with buttons for the CLI operations basically, and it does everything worse than the CLI.

In hindsight, it's hard for me to take seriously a programmer who can't spend some time learning git; it isn't hard.

imiric
17 replies
1d1h

The celebrity of Linus definitely helped Git win, and GitHub likely benefited from that by the name alone. Many people today mistakenly equate Git and GitHub, and since GH did such a good job of being a friendly interface to Git, to many people it _is_ Git. They made an early bet on Git alone, at a time when many of its competitors were supporting several VCSs. That early traction set the ball rolling, and now everyone developing in public pretty much has to be on it.

Tangentially: it's a pretty sad state of affairs when the most popular OSS hosting service is not only proprietary, but owned by the company who was historically at opposite ends of the OSS movement. A cynic might say that they're at the extend phase of "embrace, extend, extinguish". Though "extinguish" might not be necessary if it can be replaced by "profit" instead.

schacon
10 replies
1d1h

I do go into Linux and Linus in the article in some depth, but even Linus credits the Ruby community to a degree with the explosion in popularity of Git, which is fairly clearly due in large part to GitHub. But, it's certainly a chicken/egg question.

I would also argue that MS is nothing like the company that it was 30 years ago when that philosophy was a thing. The truth today is that, via GitHub, Microsoft hosts the vast majority of the world's open source software, entirely for free.

llm_trw
4 replies
19h0m

I would also argue that MS is nothing like the company that it was 30 years ago when that philosophy was a thing.

This is like saying that a cannibal has stopped eating people because there have been no disappearances in the last two days. Sure, technically correct, I'd still not eat their curry.

kneath
3 replies
14h22m

It is not anything even remotely like that. Fuck, I love programmers.

croes
2 replies
11h42m

Yep, they became worse.

30 years ago your PC was at least your PC; now they shove all kinds of cloud and AI services down users' throats and put ads where they don't belong.

croes
0 replies
7h16m

The head is the same.

nine_k
2 replies
1d

MS have realized that producing the right kind of important open-source software gives even more strength than producing closed-source software. Hence Typescript, VS Code, a few widespread language servers, etc.

bluGill
1 replies
1d

MS has long known developers were critical to their success. For a while they were worried that projects like Linux would take away their market, but it is now clearer to everyone where linux is going and so they don't have to worry as much. (so long as they are not stupid)

nine_k
0 replies
1d

They were smart enough to offer MS SQL Server for Linux, and to support (rather than oppose) Mono and Xamarin early enough.

kelnos
0 replies
17h31m

The truth today is the via GitHub, Microsoft hosts the vast majority of the world's open source software, entirely for free.

To be fair, though, y'all did 90% of the work before the acquisition. MS only hosts the vast majority of the world's open source because they backed up dump trucks full of cash at the houses of the people who actually built that capability.

I would also argue that MS is nothing like the company that it was 30 years ago when that philosophy was a thing.

I don't think I can ever truly trust their motives, though. I will agree that it's a different company in many ways, but their history is still that of a company that, through anti-competitive practices, set personal computing back decades. And worked tirelessly to keep open source at bay where they could.

At this point MS realizes it's more profitable to work alongside open source than against it. If at any point they no longer believe that's the case, you better believe we'll see a reversion to their former behavior.

4gotunameagain
0 replies
12h38m

The truth is that Microsoft trains Copilot using the vast majority of everyone's code, entirely for free.

Corporations like Microsoft don't do charity.

cyberax
1 replies
22h3m

Though "extinguish" might not be necessary if it can be replaced by "profit" instead.

Let me bite: why is this bad?

imiric
0 replies
20h28m

I didn't say it was. If anything, it's preferable to "extinguish". :)

Though a core philosophy behind the OSS movement is creating software for the benefit of humanity, rather than for financial reasons. Not that developers shouldn't profit from their work, but it's ironic that a large corporation that was historically strongly opposed to the movement is now a leader in it. It's understandable to question their motives if you remember the history, regardless of their image today.

JyB
1 replies
1d1h

That's actually interesting. Was there any concern at any point in the early days about supporting other VCSs or being too focused on git?

schacon
0 replies
1d1h

There was concern, actually. We debated a bit the concept of naming the company "GitHub", since "git" is baked into the company name. We worried a little about what happens when the next big VCS thing comes along, not knowing that git was going to be dominant for at least the next 20 years.

DrBazza
0 replies
9h52m

GitHub won not just because of taste, but also because of providence and comms. The whole article is written from the perspective of someone looking out, not in - you can't play a football match and watch it.

For the rest of the world, GitHub came along when blogs and RSS feeds were also close to their zenith. IIRC GitHub used to employ a rather sweary chap who blogged a lot, and he appeared in the feeds of everyone I knew, promoting GitHub.

Whereas, Bitbucket, and FogCreek's kiln had little comparable publicity or comms.

BeetleB
0 replies
22h56m

It really was because of GitHub, and not Linux. If GitHub had had Mercurial support from the get-go, I would expect both to be heavily used today.

AnotherGoodName
17 replies
1d1h

Well Sourceforge literally bundled malware for a while. So everyone had to move.

https://news.ycombinator.com/item?id=31110206

This article is about the open source distribution side, but I will also point out that the number of developers who don't realise your remote GitHub repo can be located on any machine with an ssh connection and nothing more is surprising. As in, people use private GitHub repos thinking that's THE way you work with git. If GitHub was just for open source hosting, I suspect they'd have trouble monetising like SourceForge clearly did, which led to scammy attempts to make money. But they always had this huge usage of private GitHub repos supporting the rest. This must have helped a lot imho.

schacon
9 replies
1d1h

This is not my recollection, at least at the time. I remember meeting with one of the SourceForge founders and being a little star struck. SourceForge was a huge deal at the time and we totally felt like we were the underdogs in that arena. Perhaps later they got more desperate, but in 2008, SourceForge was the 900lb gorilla.

kstrauser
3 replies
1d1h

Who is "we"?

schacon
2 replies
1d1h

Sorry, "we" is GitHub. I'm the author of the article and one of the GH cofounders.

relaxing
0 replies
1d

Well damn. So much for the “Github had better taste” thesis.

Still, SourceForge was a terrible user experience. GitHub was a breath of fresh air.

kstrauser
0 replies
1d1h

Oh! Heh, that makes sense now.

michaelt
2 replies
1d1h

To help our recollections, let's look at Sourceforge's browse page, from back in 2008: https://web.archive.org/web/20081118033645/http://sourceforg...

They did indeed host quite a lot of stuff, and it was undeniably popular as a place to get your binaries hosted free of charge.

But at the same time, is it being used as a source code repository? A lot of those projects don't show the CVS/SVN features. And SourceForge never hosted the biggest and most established projects: Linux and GNU and PHP and Java and Qt and Perl and Python were all doing their own thing. And pretty much every project visible on that page had its own separate website; very few projects were hosted on SourceForge exclusively.

relaxing
0 replies
1d

No, you’d upload source tarballs. Live public access to VCS wasn’t a thing for most projects.

giantrobot
0 replies
23h41m

SourceForge was the upstream source of truth for a huge percentage of small apps bundled by various distros (and BSD ports etc). Even when the upstream maintainers just uploaded the latest tarball to SF and didn't use their hosted VCS, just the hosting was a major boon to all of the tiny teams and individual maintainers of FOSS projects.

AnotherGoodName
1 replies
1d1h

2013 is when the binaries had malware included, although even in 2008 they were guilty of having 5 download buttons, due to excessive and unpoliced inline advertising, with only one of those buttons being the holy grail that linked to the download you actually wanted. Choose wisely.

beAbU
0 replies
7h51m

I completely forgot about the absolute gamble that was "which big green button is the actual download button this time" when using SourceForge and Tucows back in the day.

zargon
0 replies
1d1h

your remote GitHub repo can be located on any machine

It's such an easy mistake that you made it yourself while explaining you don't need GitHub for git repos. :)

marcosdumay
0 replies
1d1h

If GitHub was just for open source hosting I suspect they’d have trouble monetising like sourceforge clearly did

It made it harder to monetize, but it enabled SourceForge to use a huge amount of voluntarily-given bandwidth and saved them a fortune at a time when bandwidth was crazy-expensive.

Bandwidth costs were one of the reasons something like GitHub didn't appear earlier, and why several suddenly popped up out of nowhere around the same time.

jrochkind1
0 replies
3h14m

The malware bundling was long after SourceForge had precipitously declined in popularity, not the original cause of it, no?

huijzer
0 replies
14h37m

I suspect that Microsoft has just accepted losses year after year after year. That's what they do. They are very willing to invest heavily in projects that they think will work out in the long run, see also their OpenAI investments.

dom96
0 replies
19h25m

I distinctly remember that what annoyed me about SourceForge was that it hid the source code behind multiple clicks. GitHub was a breath of fresh air because it made the source code front and center.

arp242
0 replies
23h1m

Sourceforge literally bundled malware for a while. So everyone had to move.

This was after SourceForge hugely declined in popularity.

The correct sequence of events is:

1. SourceForge massively declined in popularity,

2. and then in a desperate attempt to extract cash they started bundling malware.

Not the other way around.

All of this had little to no effect on the migration away from SourceForge, which was already well underway in 2013 when the first controversy started. It may have expedited thing somewhat, but not even sure about that. See for example [1] from 2011, which shows GitHub is already beating SourceForge by quite a margin. I found that article because it's used as a citation for "In response to the DevShare adware, many users and projects migrated to GitHub" on Wikipedia, which is simple flat-out wrong – that DevShare incident didn't happen until 2013 (I have removed that from the Wikipedia page now).

It baffles me how people keep getting the sequence of events wrong on HN.

The reason is simple that SourceForge is just not very good and never was very good. Part of that is because of the ad-driven business model, part of that is that many features were just not done very well. Who actually used the SourceForge issue tracker or VCS browser? Almost no one, because it's crap.

[1]: https://redmonk.com/sogrady/2011/06/02/blackduck-webinar/

Cyberdog
0 replies
13h19m

your remote GitHub repo can be located on any machine with an ssh connection

Technically true, but GitHub provides so many more tools that it's almost silly to do so. Aside from the "hub" aspect of GitHub (it's often the first and only place some people will look for projects they're interested in), you also get the nice web interface for quick code browsing, issue queues, the ability to add and remove people with a simple GUI rather than SSH key management, wikis, email notifications, and so on and so on.

Some of this can be mitigated by using a self-hosted web-based Git tool like GitLab, Gitea, Phorge, etc. But you still lose the "everyone uses it because everyone uses it" factor of GitHub on top of whatever GitHub features the clones may lack.

max_
16 replies
1d1h

There are no real winners in business.

Just people/products that are temporarily on top.

SourceForge was probably "the winner" for some time.

The same will be for GitHub.

Someone just needs to build an actual superior product and provide a service that GitHub will not provide. Then build a sufficient audience.

One such service is an end to end encrypted Git repo service.

Some anarchists I know don't want everyone to know what they are working on.

The same goes for algorithmic trading. I need strong guarantees that my code will not be used to train an LLM that will leak my edge.

I am shocked a superior Git service to GitHub has not been built.

I really liked source hut. But the custodian is a bit arrogant (crypto projects, for instance, are banned).

crop_rotation
4 replies
1d1h

One such service is an end to end encrypted Git repo service. Some anarchists I know don't want everyone to know what they are working on.

I doubt there is a big enough market of anarchists for Github to even bother worrying.

One such service is an end to end encrypted Git repo service.

There are so few people that need this that they can just use client-side tools and store everything that reaches remote servers encrypted.

max_
2 replies
1d1h

I doubt there is a big enough market of anarchists for Github to even bother worrying.

A lot of people writing proprietary codebases would definitely use it.

I don't think a founder wants the startup's codebase to leak via an LLM?

nine_k
0 replies
1d

A ton of proprietary code lives on GitHub, on closed paid repos. A lot of people reasonably think that GitHub's security chops are better than theirs.

But if you care, there is a whole gamut of on-prem solutions, from running bare cgit to fluff like Gitea and GitLab.

Lock up your central repo machine all you want, the code is still checked out to developers' laptops. For more security, don't allow that, and let your devs connect to a server with all necessary tools and access to the code, but without general internet access, for instance.

duped
0 replies
1d

I don't think founders care if parts or the entirety of the codebase leaks, it's not that valuable.

Diti
0 replies
1d1h

It’s already feasible with Keybase (although I wouldn’t trust them any more, because of the Zoom debacle).

kstrauser
2 replies
1d1h

I wish something like Forgejo/Gitea had federated identities so that I could fork a project on the server you're hosting and submit a PR as easily as I can do that if you're hosting it on GitHub today. Everything you're asking for is available today in self-hosted services. I mean, consider that you don't even need a Git server. You can swap code with your pals via SSH/email/whatever right now, today, without the rest of the world even knowing about it.

max_
1 replies
1d1h

Everything you're asking for is available today in self-hosted services

There is a reason why people use hosted Git services: it's not practical for everyone to "self host".

We can run a self hosted Signal app for privacy. But it's neither convenient nor practical for everyone.

kstrauser
0 replies
1d1h

That's true, but if you have unusual requirements that make GitHub impractical, there are other options. Devs can update their origin to point at a shared SSH server and coordinate merges through email or Signal or anything else. I think that's a lot more practical than hoping GitHub adds something like end-to-end encryption, or worrying that they might train their LLMs against private code.
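A minimal sketch of that swap with stock git commands (branch and file names hypothetical):

    # sender: package everything since origin/main as one mailable patch file
    git format-patch --stdout origin/main > feature.patch
    # ...send feature.patch over email, Signal, scp, whatever...

    # recipient: apply it, preserving authorship and commit messages
    git am feature.patch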

AnotherGoodName
2 replies
1d1h

For an end to end encrypted git repo;

git remote add origin ssh://user@host/srv/git/example

Where the host is simply an ssh server you have access to. Encrypt the servers drive itself however you see fit. This is how git is traditionally used btw. GitHub is a third party to the git ecosystem and really there’s little reason to use it for private repos. Just use ssh for the remote connection.
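To complete the sketch (host, path, and branch name hypothetical), the server side is just a bare repository reachable over SSH:

    # on the host: create an empty bare repo to push into
    ssh user@host 'git init --bare /srv/git/example'

    # locally: point origin at it and push (assuming your branch is main)
    git remote add origin ssh://user@host/srv/git/example
    git push -u origin main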

nottorp
0 replies
11h20m

really there’s little reason to use it for private repos

Admin costs? I paid $7/month to GitHub for years for private repos (at the moment private repos are free, so I switched to not paying when the card I was using acted up and I couldn't be bothered to fix it). I'm sure the time I would have spent admining an SSH-based server would have cost more, even at 1 hour/month.

Groxx
0 replies
1d

Generally people mean "E2E Encrypted" as "the hosting service cannot see it". Git-over-SSH does not achieve this, it just encrypts in transit.
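One low-tech way to get the E2E property with stock tools (recipient address hypothetical): treat the host as dumb storage and encrypt client-side.

    git bundle create repo.bundle --all                    # one file, all refs and history
    gpg --encrypt --recipient you@example.com repo.bundle  # encrypt before upload
    # upload repo.bundle.gpg anywhere; the host never sees plaintext
    # restore: gpg --decrypt, then clone from the bundle file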

faangguyindia
1 replies
1d1h

If you don't want your edge to leak, why is your code on GitHub?

Who trusts a private repo on GitHub?

Simply store encrypted files somewhere like Dropbox or another cloud storage solution. (Encrypt before you upload.)

conradkay
0 replies
1d1h

Plenty of large companies do. The risk is much higher that an individual's computer gets compromised, which often holds a lot worse than just source code.

imiric
0 replies
1d1h

It's extremely difficult to unseat the leader with a superior product alone. Once sufficient traction is established, people will flock to where everyone else is, further cementing their position. It also requires monumental fumbles by the leader to actively push people away from the platform. Unfortunately for those who don't like GitHub, it's run by a company with limitless resources to pour into it, or to flat-out buy out its competition. Microsoft has a lot of experience with this.

I really liked source hut.

Sourcehut never has and likely never will be a serious competitor. Its UX and goals are entirely different, and it's directed towards a very niche audience unlike GH.

cruffle_duffle
0 replies
22h36m

It won't be another git service that replaces GitHub. It will be something completely out of left field that replaces git and that method of code collaboration. There are only incremental improvements to be made to git. It will take a brand new hotness with a brand new way of doing things that shakes things up.

Gualdrapo
0 replies
1d1h

Someone just needs to build an actual superior product and provide a service that [...] will not provide. Then build a sufficient audience.

I wish this was true for social media and instant messaging platforms, operating systems...

transpute
6 replies
1d1h

> we won because we started at the right time and we had taste.

2012, https://a16z.com/announcement/github/

  We just invested $100M in GitHub. In addition to the eye-popping number, the investment breaks ground on two fronts:
    It’s the largest investment we’ve ever made.
    It’s the only outside investment GitHub has ever taken.
2018, https://web.archive.org/web/20180604134945/https://a16z.com/...

  Six years ago we invested an “eye-popping” $100 million into GitHub. This was not only a Series A investment and the first institutional money ever raised by the company, but it was also the largest single check we had ever written.. At the time, it had over 3 million Git repositories — a nearly invincible position.. if I ever have to choose between a group of professional business managers or a talented group of passionate developers with amazing product-market fit like GitHub, I am investing in GitHub every time.

schacon
4 replies
1d1h

What is the point you're trying to make here?

transpute
3 replies
1d1h

Did the $100M investment help GitHub win, or had GitHub already won in 2012, with profitability and 3M git repos?

schacon
2 replies
1d

I would argue that GitHub already won in 2012. The investment helped us grow in a different way, but I don't think anyone involved in that deal would have said that we had almost any serious competitive threats at the time, which is to some degree why it was such a great deal.

transpute
1 replies
1d

Did the investment encourage corporate buyers to sign up for Github Enterprise, where corp developers were already using the free product?

schacon
0 replies
1d

That was certainly one of our internal arguments, that the institutional investment would be helpful for large company trust.

mardifoufs
0 replies
22h42m

Bitbucket got funded before GitHub did, and yet GitHub was still bigger before they got any investment.

seveibar
5 replies
1d1h

This article reinforces a lot of my biases around early bets. Taste is so, so important; everyone looks at you weird when you say you're betting on the "niche, tasteful solution" (git) instead of the "common, gross solution" (SVN). GitHub bet on Git and made tasteful choices, and that was a huge propellant for them.

I feel the same way about tscircuit (my current startup); it's a weird bet to create circuit boards with web technologies, nobody really does it, but the ergonomics _feel better_, and I just have to trust my taste!

digging
1 replies
1d1h

Not sure what is even meant by "taste" here; what I see over and over is that convenience wins, where winning is defined as widespread use.

rustyminnow
0 replies
23h58m

The article uses "taste" pretty broadly compared to many folks in the comments. First mention is about the site being pretty. But later he says "We had taste. We cared about the experience" which more aligns with your perspective of convenience.

aidenn0
1 replies
1d1h

I would argue that hg was more tasteful than git at the time GitHub began. The one thing git had going for it was that the most common operations were absurdly fast from the beginning, while hg took a bit of time to catch up.

seveibar
0 replies
22h13m

I agree with this take. I think hg could have overtaken git for a while, but git catered a bit better to the OSS communities and hg catered a bit more to big companies from a DX perspective. Maybe in this case the important thing is knowing that your partners/technologies are aligned with your vision of the future: git has been more open-source-first (I would argue).

nprateem
0 replies
1d1h

It's just survivorship bias. If GitHub hadn't won, no one would be trying to reverse-justify their success.

This bet worked; the Mercurial ones didn't.

devnull3
5 replies
1d1h

The rise of GitHub also coincided with the enshittification of sourceforge.net. Although SF was not git-based at that time, it had the mindshare of a lot of open source projects, and it went completely downhill.

So, a downfall of a potential alternative was also a factor IMO.

Edit: after I commented I realized that SF was already mentioned in other comment

schacon
3 replies
1d1h

I would argue that SF was always pretty shitty, because it focused entirely on advertising. I remember Chris giving a talk comparing the signup process of GitHub and SourceForge. SF had like 8 fields and GitHub had 2. This was because SF wanted to know ad demographic info - where did you hear about us, etc. GitHub just wanted a name and a password. But this was the difference in everything - SF cared about advertisers, not developers. GitHub never thought about what anyone other than the developers using the product wanted.

devnull3
2 replies
1d1h

Agree, but my point is that when faced with a new and better rival, instead of pivoting, SF became even worse and got malware-ised.

Also, SF was based on SVN. They failed to understand and capitalize on a better tech on the market, i.e. git.

devnull3
0 replies
1d

I stand corrected! Thanks!

hadlock
0 replies
1d1h

Sourceforge was always awful to navigate, because it was dependent on ad revenue, not subscriptions. It was trying to compete with consumer-focused things like download.com (remember that?) where the end user just wanted a tarball of the executable, and the host was trying to make money selling ad space on the page where the download link was.

The fact that end users could peek at the folder structure of the source code was a novelty at best

chx
4 replies
1d1h

git won because of empty hype; bzr was far superior in basically every aspect. Much easier to program with, either for plugins or to be embedded; a much saner "hide your development commits" model with log levels; a much saner command line interface. It's just better.

It's not the first thing to be carried by hype instead of careful comparison.

kstrauser
2 replies
1d

That's simply untrue. Bzr was dog slow on repos with lots of history. It had lots of early users and support from hosting services like Launchpad, Savannah, and SourceForge. I'm certain that hype is not why everyone migrated to git. I mean, it's not credible to say the Emacs team stopped using it because it wasn't fashionable.

There were lots of DVCS projects at the time, like arch, darcs, and dcvs. People were running all kinds of experiments to explore the Cambrian explosion of new ideas. Some of them did some things better than git, but git handled most of those things reasonably well and it was fast. We all mostly ended up on git because it was generally the better option. It earned the hype, but the hype followed the adoption, not vice versa.

chx
1 replies
1d

So in exchange for a little speed we are stuck with one of the most user hostile tools out there. That's not the deal I would have wanted to make. The interface is atrocious as some switches change completely what the command does -- this was partially acknowledged and fixed in git switch but there's so much more, it loses work way too easily and some of the concepts are near impossible to grok. (I did learn git eventually but that doesn't mean I like it. It's more of an uneasy truce than a friendship.)

kstrauser
0 replies
1d

It wasn't a little speed. Other options were many times slower. I just renamed a large subdir in a large project. `time git status` took 41ms. That kind of speed lets you add all sorts of interactivity that would be impractical if it were slower. For instance, my shell prompt shows whether the current directory is managed by Git, and if so, whether the status is clean. I would never tolerate my terminal being detectably slowed by such a thing. With git, it's not.

There are a thousand little ways where having tooling be fast enough is make-or-break: if it's not, people don't use it. Git is fast enough for all the most common operations. Other options were not.
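As an illustration of that prompt trick (a sketch; the function and prompt format are my own, not a standard tool):

    # bash: mark the prompt when inside a git work tree, and whether it's dirty
    git_prompt() {
      git rev-parse --is-inside-work-tree >/dev/null 2>&1 || return
      if [ -n "$(git status --porcelain)" ]; then
        printf ' (git:dirty)'
      else
        printf ' (git:clean)'
      fi
    }
    PS1='\w$(git_prompt) \$ '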

schacon
0 replies
1d1h

I think PR and network effects of GitHub definitely played a role in the success of Git over other options like bzr, but you should also remember that bzr had tons of issues. It was slower, there was no index/staging area, there was no rebasing, etc. Mercurial was very good too, but while there were pluses and minuses with all of them, I think there was a _lot_ of careful comparison too. None of them were clearly and in all aspects better than the others.

Bjorkbat
4 replies
1d1h

The idea of Github having a unique "taste" advantage resonates with me a lot. I don't like the fact that Github is using my code to feed Microsoft's AI ambitions, but I dislike Bitbucket and Gitlab more simply on the grounds that they "don't look fun".

It's tricky, because any serious Github competitor would implicitly have to compete by attracting the deep pockets of enterprise clients, who care little for "fun". Getting revenue from solo devs / small teams is an uphill battle, especially if you feel obliged to make your platform open source.

Still, I wish someone would make a Github competitor that's fun and social.

darby_nine
1 replies
1d1h

sourcehut is always worth a mention, though I have never used it in a collaborative environment.

rapnie
0 replies
10h50m

sr.ht doesn't go for "fun" but for brutal minimalism as its main selling point.

wood-porch
0 replies
1d

This. GitHub is a joy to use compared to its competitors. Using Bitbucket at work is frustrating and reminds me of a lot of Microsoft web interfaces; ironic, given that it's GitHub and not Bitbucket that is owned by them now.

bluGill
0 replies
1d

You don't need enterprise clients. Projects like KDE self-host and are enough to keep you around and getting new features if you can get them on board. Plus, enterprises often look at their bottom line and ask if something else is a better value, so if you are "free", some of them will switch to you.

throwaway5752
3 replies
1d1h

I professionally used RCS, CVS, Subversion and Perforce before Git came along. Hell, I was actually at a company that FTP'd its PHP files directly to the production server.

People in the field less than 20 years might not appreciate the magnitude of this change (though, adding my two cents to the author's article, branching in p4 was fine). People may have also dealt with ClearCase (vobs!) or Microsoft Visual SourceSafe.

Git did as much for software development velocity as any other development in recent history.

kstrauser
1 replies
1d1h

That's all true for me, too, although I hadn't used p4. I resisted Git for a little while because I didn't see the massive appeal of a distributed system in an office with a central server. CVS... worked. SVN was a much more pleasant "faster horse". And then I made myself try Git for a week to see what the fuss was all about, and my eyes were opened.

Git is not perfect. There are other products that did/do some things better, or at least more conveniently. But Git was miles ahead of anything else at the time that I could use for free, and after I tasted it, I never wanted to go back to anything else.

throwaway5752
0 replies
1d1h

I was a late adopter, also, and git is definitely not perfect. Mercurial did some things better, and at the time, notably, the forest extension. Git's flexibility is a two-edged sword and the history-rewrite footguns should be harder to use. Git does come close enough to solving a fundamental problem that it will be very, very durable, though. As long as it is used for Linux kernel development I expect it to continue to be the dominant DVCS.

physicsguy
0 replies
1d

God I hated ClearCase, did a migration from it in 2016(!) for a project that had been around since the late 80s. People were really resistant to moving but once it was done were like "Oh wow, it's really fast to create a branch, this means we don't have to have one branch for three months!"

ldayley
3 replies
1d1h

Thank you for sharing this, Scott! He mentions "Taste" throughout the post and this intangible quality makes all the difference in an early-stage winner-take-all market dominance race.

In 2007 I was teaching myself programming and had just started using my first version control tools with Mercurial/Hg after reading Joel Spolsky's blog post/love letter to Mercurial. A year or two later I'd go to user group meetups and hear many echo my praise for Hg while lamenting that all the cool projects were on GitHub (and not Bitbucket). One by one nearly everyone migrated their projects over to git, almost entirely because of the activity at GitHub. I even taught myself git using Scott's website and book at that point!

"Product-market fit" is the MBA name for this now. As Scott elegantly states this is mostly knowing what problem you solve, for whom, and great timing, but it was the "flavor" of the site and community (combined with the clout of linux/android using git) that probably won the hearts and minds and really made it fit with this new market.

Edit: It didn't hurt that this was all happening at the convergence of the transition to cloud computing (particularly Heroku/AWS), "Web 2.0"/public APIs, and a millennial generational wave in college/first jobs-- but that kinda gets covered in the "Timing, plus SourceForge sucked" points

kccqzy
1 replies
21h4m

I learned git first because it was already very popular when I decided to learn it. But when I later learned hg for fun, I realized how much of a better user experience it is:

* After using hg, which doesn't have the concept of an index, I realized I don't miss it and the user experience is better without it. Seriously, even thinking about it is unnecessary mental overhead.

* As someone who modifies history a whole lot, `hg evolve` has superior usability over anything in git. The mere fact that it understands that one commit is the result of amending another commit is powerful. Git doesn't remember it, and I've used way too much `git rebase --onto` (which is a poorer substitute) to be satisfied with this kind of workflow.

* Some people, including the author, say cheap branching is a great feature of git. But what's even better is to eliminate the need to create branches at all. I don't need to use bookmarks in hg and I like it that way.

I sometimes imagine an alternate universe where the founders of GitHub decided instead to found HgHub. I think overall there might be a productivity increase for everyone because hg commands are still more user friendly and people would be stuck less often.
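For reference, the `git rebase --onto` workaround mentioned above looks like this (branch names hypothetical); you have to name the old base yourself, which is exactly the bookkeeping `hg evolve` does for you:

    # replay the commits of feature that sit on old-base onto new-base
    git rebase --onto new-base old-base feature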

ldayley
0 replies
3h40m

imagine an alternate universe where the founders of GitHub decided instead to found HgHub.

I'm reading this as "HugHub" and audibly laughing!

bluGill
0 replies
1d

I still miss hg. I migrated to github years ago because github is a much better workflow, but I miss hg which can answer questions that git cannot.

LtWorf
3 replies
1d1h

Github won because sourceforge was ruined already.

heisenbit
2 replies
21h50m

SF lost its way. My vague memory is that SF, and also CollabNet, were focusing on higher-level functionality and, while that was valuable, neglected the basic code-sharing growth and ease of use that was the rationale for their existence. They moved into higher-level functionality too early.

latchkey
1 replies
17h34m

I worked for CollabNet from pretty early on. CollabNet had one major customer... HP. Everything got developed for a relatively stodgy old customer. They wanted centralized version control. They wanted tons of ACLs. It all was very corporate sales focused.

Github "won" because they were not CollabNet.

heisenbit
0 replies
5h49m

At the time I was in an industry collaboration and we needed a place to work together, and we looked at SF and CollabNet. Internally we also would have benefitted from such a tool, so I even got an onsite demo from SF. It was impressive, but also, as you say, enterprise-focused and very different from their free offering. And that was imho the key reason: there was not one offering but two, with different tech and customers. Focus was lost.

simonw
2 replies
1d1h

I clicked on this link thinking "timing and product quality", so I was satisfied to see that GitHub co-founder Scott Chacon credits it to "GitHub started at the right time" and "GitHub had good taste".

SoftTalker
1 replies
1d1h

git won because linux used git, and the vast majority of open-source code was written for linux. Simple as that. GitHub won because it made remote collaboration on a code base easier than anything else.

bluGill
0 replies
1d

I think that if GitHub hadn't come out, something other than git would have won. While git did have Linus behind it, the others were objectively better in some ways and were working on the areas where they were objectively worse, and eventually those advantages would have got everyone to switch. However, the others never had anything like GitHub - even 20 years later they still aren't trying (rumor is they are not dead).

amtamt
2 replies
1d1h

vi: 1976
GNU Emacs: 1984
BIND: 1986

are (along with too many other projects) from way before Nov 1993, where "The Total Growth of Open Source" graph starts from 0.

schacon
1 replies
1d1h

It all depends on how you're counting. For one, "open source" was not a phrase before 1998, so there is some retrofitting of Free Software projects. But also, there isn't a registry, it's rather difficult to be more than approximate with this. The article is very specific about their methodology, I'm only using one graph as a general example.

amtamt
0 replies
13h21m

From the paper

"The database contains data from January 1990 until May 2007. Of this time horizon, we analyze the time frame from January 1995 to December 2006. We omit data before 1995 because it is too sparse to be useful"

"Large distributions like Debian are counted as one project. Popular projects such as GNU Emacs are counted as projects of their own, little known or obsolete packages such as the Zoo archive utility are ignored"

So even though the methodology is "very specific", it seems very incomplete/inaccurate/selective. Even the Linux kernel, as per their source, started in 2005 (https://openhub.net/p/linux).

Source: https://www.researchgate.net/publication/45813632_The_Total_...

tanepiper
1 replies
1d1h

Around about that time, I was working on a Mercurial frontend, https://github.com/tanepiper/hgfront - it was around the time GitHub was starting to pick up, and BitBucket also appeared around then (we spoke to the original developer at the time but nothing came of it). Funnily enough, we also built a Gist-like tool that had inline commenting, forking and formatting (https://github.com/tanepiper/pastemonkey).

I always wonder what would have happened if we had a dedicated team to make something of it, but in the end git won over hg anyway so likely a moot point.

Edit: there's a low-quality video of the early interface we worked on - https://youtu.be/NARcsoPp4F8

schacon
0 replies
1d1h

Fun fact, I (original author), wrote the original version of Gist. That was my first project at GitHub. Gist #1 is my claim to fame: https://gist.github.com/schacon/1

sunshowers
1 replies
23h35m

They never cared about the developer workflow.

Man, given how terrible GitHub's developer workflow is in 2024... there is still no first-class support for stacked diffs, something that Phabricator had a decade ago and mailing list workflows have been doing for a very long time.

I personally treat GH as a system that has to be hacked around with tools like spr [1], not a paragon of good developer workflows.

[1] my fork with Jujutsu support: https://github.com/sunshowers/spr

juped
0 replies
12h6m

You can't even see a commit graph (no, the insane glacial-js "network" tab doesn't count). You can see it in bitbucket for heaven's sake. The basic data structure of git, invisible. On a GUI.

sirspacey
1 replies
22h54m

I’m a little stunned by “taste” as the defining factor, but GitHub has certainly brought the industry a long way!

Whenever I’ve asked for help using GitHub (usually because I’m getting back into coding) the dev helping me out stumbles, forgets, and is confused often. What’s surprising is that’s true no matter how senior they are.

GitHub did a ton to smooth out dev workflows, for sure, but there’s something almost intensely counter-intuitive about how it works and how easy it is to miss a step.

I’d assume good product taste is reasonably indexed to “intuitive to use” but GitHub doesn’t seem to achieve that bar.

What’s an example of GitHub’s good taste that I’m missing?

echelon
0 replies
22h53m

Have you used SourceForge or SVN? Or sent a zip of files named "v23_backup_(copy).zip" to other engineers?

Compared to everything that came before, Github may as well have been Nirvana.

cromulent
1 replies
8h19m

The guy who reverse-engineered the Bitkeeper protocol, triggering the creation of git, is the same guy who created rsync - Andrew Tridgell.

amir734jj
1 replies
19h42m

I work at Microsoft, so I write a lot of pipelines and interact a lot with git.

This is my own opinion:

- GitHub looks nice, but the PR merge flow between forks is still bad

- GitLab CI is so much more intuitive than GitHub CI, and there is a lot of existing code that you can copy/paste, especially if you use Terraform, AWS, or K8s

- I am biased, but AzDevops both looks the most intuitive and its pipeline system is the best

TheRealPomax
0 replies
16h47m

Want to expand on that second point? GitHub Actions has more prebuilt workflows than I can shake a stick at; no copy-pasting anything, you just say "uses: actions/whatever@v123" in your own YAML, configured with some "with" statements.
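For instance, a minimal workflow in that style (a sketch; the Node steps are an arbitrary illustration, and the file would live under .github/workflows/):

    name: CI
    on: [push]
    jobs:
      test:
        runs-on: ubuntu-latest
        steps:
          - uses: actions/checkout@v4        # prebuilt action: fetch the repo
          - uses: actions/setup-node@v4      # prebuilt action, configured via "with"
            with:
              node-version: '20'
          - run: npm test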

SenHeng
1 replies
1d1h

I used both GitHub and BitBucket during the early days. There was no comparison. GitHub was simply nice to use. The UX was phenomenal for its time and made sense. BitBucket was horrible but my then employer wouldn’t pay for hosting and GitHub didn’t provide free private hosting.

One of my biggest gripes was that switching back and forth between code view and editor mode would wipe whatever you had written, so you'd better have them in separate tabs. Also, be sure not to press the backspace key outside a text window.

tootie
0 replies
23h48m

Idk, I loved BitBucket and I loved Mercurial. It was much easier to use and had native JIRA integration. I always thought (and still do) that github looks too cute and not very serious.

vander_elst
0 replies
22h15m

They might have taste, but they still don't have IPv6. Sorry for the rant, but I'm always baffled that they haven't switched yet. Does anyone have insight into the challenges they are facing?

trelane
0 replies
4h38m

LWN linked to a summary of the origin of git: https://lwn.net/Articles/974914/

They also provided a set of links to LWN articles from that era.

thom
0 replies
8h14m

I wonder if there's an alternative universe where Fog Creek pushed Kiln - their hosted Mercurial product - harder and exploited the mindshare of Stack Overflow somehow. Perhaps if they'd tried to get open source projects and their maintainers onto a shared platform to manage both code and (potentially paid) support they would have earned a mention here.

thepra
0 replies
14h34m

I don't know about that; it was a dead platform for my projects by the time US government policies blocked the accounts and projects of some Middle East developers.

Since then I'm happy self hosting Gitea. GitHub is still a decent place to contribute to others projects.

sylware
0 replies
20h55m

Well, I can login and use core functions on github with a noscript/basic (x)html browser... gitlab... well...

sergiotapia
0 replies
1d1h

I still miss Codeplex from microsoft ;) it was a really beautiful website

scelerat
0 replies
22h52m

The ease of creating, deleting and merging branches and patches is what sold me on git over subversion (which itself was a vast improvement over the only VCS I had experienced up until that point, CVS). When the author describes the literal jaw-dropping demos, I remember having a similar reaction.

righthand
0 replies
14h48m

It’s also never to late to use a different vcs. Especially if you have no one who’s interested in your code, like me. “Winning” isn’t everything as they say.

red_admiral
0 replies
10h4m

I remember the days when on sourceforge, you sometimes had to find a small link as opposed to the big download button that gave you an installer bundled with "offers". As far as I know this is something SF added on top of the binaries the author was trying to distribute.

That left a market opportunity for something better. I think that _might_ have had something to do with it.

pmarreck
0 replies
4h25m

And they STILL don't have IPv6 support >..<

physicsguy
0 replies
1d

Sourceforge was horrible to use. GitHub was widely used, but it only really reached proper dominance I think when it started offering free closed-source repositories to people that weren't paying them, which was 2019. Until then it was pretty common in my experience for people to use Bitbucket for Git for private stuff.

mproud
0 replies
14h28m

its

mikemitchelldev
0 replies
22h42m

I didn't realize Scott Chacon was a founder of Github. Did they all cash out equally?

masa331
0 replies
9h11m

Surely GitHub is the most popular git hosting nowadays, but fortunately there are good alternatives like Gitea for those who don't want to give Microsoft free access to any code you host on GitHub. Spinning up a new instance can be done in one afternoon and is not complicated.

lkrubner
0 replies
10h27m

Github won in part because git won. And git won because, for complex sociological factors, the software engineers were able to argue that their needs were more important than the needs of other parts of the companies for which they worked.

For a counter-point (which I've made many times before) from 2005 to 2012 we used Subversion. The important thing about Subversion was that it was fun to use, and it was simple, so everyone in the organization enjoyed using it: the graphic designers, the product visionaries, the financial controllers, the operations people, the artists and musicians, the CEO, the CMO, etc. And we threw everything into Subversion: docs about marketing, rough drafts of advertising copy, new artwork, new design ideas, todo lists, software code, etc.

The whole company lived in Subversion and Subversion unified every part of the company. Indeed, many products that grew up later, after 2010 and especially after 2014, grew up because companies turned away from Subversion. Google Sheets became a common way to share spreadsheets, but Google Sheets wasn't necessary back when all spreadsheets lived in Subversion and everyone in the company used Subversion. Likewise, Google Docs. Likewise some design tools. Arguably stuff like Miro would now have a smaller market niche if companies still used Subversion.

At some point between 2008 and 2015 most companies switched over to git. The thing about git is that it is complex and therefore only software engineers can use it. Using git shattered the idea of having a central version control for everything in the company.

Software engineers made several arguments in favor of git.

A somewhat silly argument was that software developers, at corporations, needed the ability to do decentralized development. I'm sure this actually happens somewhere, but I have not seen it. At every company that I've worked, the code is as centralized as it was when we used Subversion.

A stronger argument in favor of git was that branches were expensive to work with in Subversion (cheap to create, but painful to merge and track) but cheap in git. I believe this is the main reason that software developers preferred git over Subversion. For my part, during the years that we used Subversion, we almost never used branches; mostly we just developed separate code and then merged it back to main. Our devops guy typically ran 20 or 30 test servers for us, so we could test our changes on some machine that we "owned". For work that would take several weeks, before being merged back to main, we sometimes did set up a branch, and other times we created a new Subversion repo. Starting new repos was reasonably cheap and easy with Subversion, so that was one way to go when some work would take weeks or months of effort. But as ever, with any version control system, merge conflicts become more serious the longer you are away from the main branch, so we tried to avoid the kind of side projects that would take several weeks. Instead, we thought carefully about how to do such work in smaller batches, or how to spin off the work into a separate app, with its own repo.

A few times we had a side project that lasted several months and so we would save it (every day, once a day) to the main branch in Subversion, just to have it in Subversion, and then we would immediately save the "real" main branch as the next version, so it was as if the main branch was still the same main branch as before, unchanged, but in-between versions 984 and 986 there was a version 985 that had the other project that was being worked on. This also worked for us perfectly well.

The point is that the system worked reasonably well, and we built fairly complex software. We also deployed changes several times a day, something which is still rare now, in 2024, at most companies, despite extravagant investments in complex devops setups. I read a study last week that suggested only 18% of companies could deploy multiple times a day. But we were doing that back in 2009.

The non-technical people, the artists and product visionaries and CFOs and and CMOs, would often use folders, when they wanted to track variations of an idea. That was one of the advantages of having something as simple as Subversion: the whole team could work with idioms that they understood. Folders will always be popular with non-technical people.

But software developers preferred git, and they made the argument that they needed cheap branches, needed to run the software in whole, locally on their machines, with multiple variations and easy switching between branches, and needed a smooth path through the CI/CD tools towards deployment to production.

I've two criticisms with this argument:

1. software developers never took seriously how much they were damaging the companies they worked for when they ended the era of unified version control.

2. When using Subversion, we still had reasonably good systems for deployment. For awhile we used Capistrano scripts, and later (after 2010) I wrote some custom deployment code in Jenkins. The whole system could be made to work and it was much simpler than most CI/CD systems that I see now. Simplicity had benefits. In particular, it was possible to hire a junior level engineer and within 6 months have them understand the whole system. That is no longer possible, as devops has become complex, and has evolved into its own specialty. And while there are certainly a few large companies that need the complexity of modern devops, I've seen very few cases myself. I mostly see just the opposite: small startups that get overwhelmed with the cost of implementing modern devops "best practices", small startups that would benefit if they went back to the simplicity that we had 10 to 15 years ago.

langsoul-com
0 replies
8h55m

They had the timing, the name and the reputation boost.

Though I wonder if Bitbucket would have won if it had GitHub's name.

keybored
0 replies
1d

Why GitHub Actually Won

How GitHub _actually_ became the dominant force it is today, from one of it's cofounders.

Being at the very center of phenomena like this can certainly leave you with blind spots, but unlike these youngsters, I was actually there. Hell, I wrote the book.

Downvote all you want for being “non-substantive” but for some reason I can’t voluntarily tolerate such a density of well-actually phrasing. It’s grating.

It also seems to be everywhere these days but maybe I’m too attuned to it.

josefrichter
0 replies
2h7m

I am seriously worried about the Microsoft acquisition, because products acquired by MS almost never end up well.

jillesvangurp
0 replies
11h1m

I think the analysis is largely correct. But not entirely. My take on this is that 1) Github fixed the one problem that Git had: terrible UX. Github made it more like subversion. Enough so to consider switching. Indeed a lot of small companies treat it like a central repository with commit rights for everyone. 2) It fixed a big problem OSS projects had: it was very hard to contribute to them.

The first reason was why Git became interesting, the second one is why it won.

Prior to Github the way to contribute to OSS projects was a protracted process of engaging with very busy people via mailing lists, issue trackers, and what not and jumping through a lot of hoops to get your patches considered, scrutinized, and maybe merged. If you got good at this, you might eventually earn commit privileges against some remote, centralized repository. This actively discouraged committing stuff. OSS was somewhat elitist. Most programmers never contributed a single line of OSS code or even considered doing so.

Github changed all that. You could trivially fork any project and start tinkering with it. And then you could contribute your changes back with a simple button push: create pull request. It actively encouraged the notion. And lots of people did.

GitHub enabled a bunch of kids that were into Ruby to rapidly scale a huge OSS community that otherwise would not have existed. That success was later replicated by the JavaScript community, which pretty much bootstrapped on GitHub as well. What did those two communities have in common? Young people who were mostly not that sophisticated with their command line tooling. This crowd was never going to be exchanging patches via some mailing list, like the Linux crowd still does today. But they could fork and create pull requests. And they did. Both communities had a wild growth of projects. And some of them got big.

GitHub gave them a platform to share code, so they all used it. And the rest is just exponential growth. GitHub rapidly became the one place to share code. Even projects with their own repositories got forked there, because it was just easier. A lot of those projects eventually gave up on their own central infrastructure; accepting contributions via GitHub was easier. In 2005 Git was new and obscure; very elitist. In 2008 GitHub popped up. By 2012 it hosted most of the OSS community. Game over by around 2010, I would guesstimate. By 2015 even the most conservative shops were either using it or at least considering it.

jarule
0 replies
1d

Why GitHub was started. To ease ass pain.

izietto
0 replies
9h42m

I think GitHub won because its UI/UX is the best among competitors. I don't mean it's perfect, just that it's the best among competitors.

gsliepen
0 replies
1d1h

Another big advantage of Git for sites like GitHub is that you are never putting all your eggs in one basket. You have your local copy of a project's entire history. GitHub is merely a mirror. Sure, some features have been sprinkled on top, like pull requests and an issue tracker, but those are not the most critical part. If GitHub goes down you can move your whole Git history to another site like GitLab or sourcehut, or just self-host it; you could even start doing that right now with minimal effort. This was never the case with CVS and Subversion.
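Moving really is just a couple of commands (a sketch; the URLs are placeholders):

    # Make a complete bare copy: every branch and tag
    git clone --mirror https://github.com/example/project.git
    cd project.git
    # Push all refs to the new home
    git push --mirror https://gitlab.com/example/project.git

The sticky part is the metadata (issues, pull requests), not the history itself.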

codr7
0 replies
18h18m

Because it was a lot better than the alternatives and free?

I introduced Subversion at my first job; before that we were sharing updates over FTP and heavily coordinating who worked on what.

Subversion was definitely a step up, but branching/merging was very problematic and time consuming.

I still find Git confusing as hell sometimes, and would guess most developers use like 50% of the features tops; without GitHub or something similar it wouldn't have gone anywhere.
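For anyone who never used it: a Subversion branch was a server-side copy, and before svn 1.5 you tracked merges by hand (a sketch; the URLs and revision numbers are made up):

    # Branching = copying a directory on the server
    svn copy https://svn.example.com/repo/trunk \
             https://svn.example.com/repo/branches/feature -m "Start feature branch"
    # Merging back = remembering the right revision range yourself
    svn merge -r 100:140 https://svn.example.com/repo/branches/feature .

Compare that to "git checkout -b feature" and "git merge feature", both local and instant.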

chubot
0 replies
18h40m

The fundamentally interesting thing that I think showed in everything that we did was that we built for ourselves. We had taste. We cared about the experience.

The "taste" thing is weird, making an analogy with Apple vs. Microsoft, who explicitly competed at many times

As DannyBee said, there was never any Google vs. Github competition, because Google's intention was never to build something like Github

In fact, one thing he didn't mention is that the URL, as I recall, was

   code.google.com/hosting/
not

   code.google.com/   # this was something ELSE, developer API docs for maps, etc.
SPECIFICALLY because management didn't want anyone to think it was a product. It was a place to put Google's own open source projects, and an alternative to SourceForge. (There was also this idea of discouraging odd or non-OSS licenses, which was maybe misguided)

That is, the whole project didn't even deserve its own Google subdomain!!! (according to management)

(I worked on Google Code for around 18 months)

---

However, if I try to "steel man" the argument, it's absolutely true that we didn't use it ourselves. The "normal" Google tools were used to build Google Code, and we did remark upon that at the time: we don't dogfood it, and dogfooding gives you a better product.

But it was a non-starter for reasons that have to do with Google's developer and server infrastructure (and IMO are related to why Google had a hard time iterating on new products in general)

I also think GitHub did a lot of hard work on the front end, and Google famously does not have a strong front-end culture (IMO because complex front ends weren't necessary for the original breakout product of search, unlike, say, Facebook)

below43
0 replies
16h11m

cough. its, not it's.

asnyder
0 replies
21h10m

Personally I really liked darcs; it always felt more natural and intuitive to me.

Though fortunately it was compatible with, and natively convertible to, git, which made the git takeover mostly painless.

At the time it felt like GitHub, and the rapid tooling and integrations developed in response, cemented git's rise and the downfall of everything else, including darcs.

ThinkBeat
0 replies
18h56m

A commercial proprietary platform that came around at the right time to cash in on the open source movement, and was then acquired by Microsoft for $7.5 billion¹ in 2018.

Becoming a jewel for one of the world's most profitable proprietary software and platform companies.

And GitHub is still super popular, evolving more and more into a social network, where users work for free to increase the value of the platform and promote it for the chance at more GitHub stars.

With the common echo chamber: "Windows is bad, eww, proprietary bloated spy software and evil Office. Open source and Linux is goood." and then

"I have 3000 stars on GitHub now, check out my repos."

Did Microsoft predict the value of having trivial access to all those codebases for training software models?

The kind of insight they have with Microsoft GitHub and Microsoft LinkedIn is quite something.

¹ In stock

INTPenis
0 replies
9h49m

I was there from the ground floor, and GitLab.com failed miserably at SEO.

There are ancient threads about it on their issue tracker that go nowhere for years and years. It's almost as if they were trying to sabotage themselves.

SEO was hugely important because when you searched for something, GitHub projects actually came up in Google; GitLab.com projects did not. Even if there was an interesting project there, nobody would ever find it.

So I'm not surprised GitHub became synonymous with the online Git forge.