
The Rise and Fall of Silicon Graphics

martinpw
124 replies
1d

Whenever this topic comes up there are always comments saying that SGI was taken by surprise by cheap hardware and if only they had seen it coming they could have prepared for it and managed it.

I was there around 97 (?) and remember everyone in the company being asked to read the book "The Innovator's Dilemma", which described exactly this situation - a high-end company being overtaken by worse but cheaper competitors that improve year by year until they take the entire market. The point being that the company was extremely aware of what was happening. It was not taken by surprise. But in spite of that, it was still unable to respond.

mrandish
28 replies
22h55m

You highlight one of the most interesting (and perhaps least understood) things about the key Innovator's Dilemma insight. Even if senior management have read the Innovator's Dilemma books, know they are being catastrophically disrupted and are desperately trying to respond - it's still incredibly difficult to actually do.

Not only are virtually all organizational processes and incentives fundamentally aligned against effectively responding, the best practices, patterns and skill sets of most managers at virtually every level are also counter to what they must do to effectively respond. Having been a serial tech startup founder for a couple of decades, I then sold one of my startups to a valley tech giant and ended up on the senior leadership team there for a decade. I'd read Innovator's Dilemma in the 90s and I've now seen it play out from both sides, so I've thought about it a lot. My key takeaway is that an incumbent's lack of effective response to disruption isn't necessarily due to a lack of awareness, conviction or errors in execution. Sure, there are many examples where that's the case, but the perverse thing about I.D. is that it can be nearly impossible for the incumbent to effectively respond - even if they recognize the challenge early, commit fully to responding and then do everything within their power perfectly.

I've even spent time sort of "theory crafting" how a big incumbent could try to "harden" themselves in advance against potential disruption. The fundamental challenge is that you end up having to devote resources and create structures which actually make the big incumbent less good at being a big incumbent far in advance of the disruptive threat appearing. It's hard enough to start hardcore, destructive chemo treatment when you actually have cancer. Starting chemo while you're still perfectly healthy and there's literally no evidence of the threat seems crazy. It looks like management incompetence and could arguably be illegal in a publicly traded company ("best efforts to maximize/preserve shareholder value" etc).

rbanffy
15 replies
20h18m

I think SGI failed to understand that there was a point where desktop PCs would be good enough to replace dedicated workstations. Continuing to make hardware that's much better than the best PCs wasn't going to save them after PCs crossed the good-enough line - whatever they had would be relegated to increasingly rarefied niches - the same way IBM now only makes POWER and mainframes - there is no point in making PCs, or even POWER workstations, for them anymore, as the margins would be too narrow.

SGI could double down on their servers and supercomputers, which they did for a while, but without entry-level options their product lines become the domain of legacy clients who are too afraid (or too smart) to port to cheaper platforms. And being legacy in a highly dynamic segment like HPC is a recipe for disaster. IBM survived because their IBMi (the descendant of the AS/400) and mainframe lines are very well defended by systems that are too risky to move, tied to hardware that's not that much more expensive than a similarly capable cluster of generic and less capable machines. As the market was being disrupted from under them, they retreated up and still defend their hill very effectively.

The other move they could have made was to shift downwards, towards the PC, and pull the rug from under their own workstation line. By the time Microsoft acquired Softimage and had it ported to NT, it was already too late for SGI to even try that move, as NT had solidified into a viable competitor in the visual computing segment, running on good-enough machines much, much cheaper than anything SGI had.

mrandish
7 replies
19h28m

I think your analysis of the shifting technology landscape is largely on target. However, I'm not convinced that the true root of SGI's failure was the technology. Clearly their tech did need to evolve significantly for them to remain competitive, but that's a transition which many companies successfully make. Even though SGI chose not to evolve the tech soon enough, fast enough or far enough, I suspect they still would have failed to survive that time period due to an even more fundamental root cause: their entire corporate structure wasn't suited to the new competitive environment. While the "desktop transition" was most obviously apparent in the technology, I think the worst part for SGI was that the desktop shift changed the fundamental economics to higher volumes at lower costs.

SGI had invested in building significant strengths and competency in its sales and distribution structure. This was one of their key competitive moats. Unfortunately, not only did the shift in economics make this strength irrelevant, it turned it into a fundamental weakness. All that workstation-centric sales, distribution, service and support infrastructure dramatically weighed down their payroll and opex. This was fine as long as they could count on the higher margins of their existing business. While it's easy to say they should "just lay off all those people and relaunch as a desktop company", that can't be done in one quarter or even one year. It requires fundamentally different structures, processes, systems and skill sets. Hiring, training and integrating all that while paying for massive layoffs and shutting down offices, warehouses etc takes time and costs a lot of money.

Worse, once their existing workstation customers saw SGI shutting down the company they had bought workstations and service contracts from in order to become a different kind of company entirely, sales revenue would have taken an overnight nosedive. SGI's stock would also have tanked far more immediately than it did, as the fickle stock market investors sold stock they'd bought because SGI offered a specific risk/return expectation which had just become much more "risk" and much less "return" (at least in the near term). In making such a dramatic move SGI would have effectively dumped much of their current quarterly revenue and the value of one of their core strengths - all at the same moment - turning them into one of their emerging startup competitors with all of a startup's disadvantages (no big ongoing revenue streams, no big cash pile (or high stock valuation to leverage for cash)) yet none of a startup's strengths (nimble, lower-paid staff and more patient venture investors).

The point of my earlier post was mainly that a true disruptive market shift is nearly impossible for a large, established incumbent to successfully survive because they basically have to turn into someone else almost overnight. How can a champion sumo wrestler survive a shift so dramatic that their sport quickly turns into a track meet? Even seeing it coming doesn't help. How does one even prepare for such a shift, since losing mass turns you into a bad sumo wrestler long before you even start being a viable sprinter? As Christensen observed, such disruptions are often enabled by technology, but the actual cause of incumbent death is often the shift turning an incumbent's own strengths into weaknesses almost overnight.

panick21_
6 replies
17h27m

I don't think you need to shut down that distribution. Because fundamentally, existing customers often continue to buy the existing product at existing prices, as long as they get a faster product. This was something that Gordon Bell pointed out. Existing customers are set up to work at that level of expense. Expensive graphics workstations and rendering servers and so on still exist. Sure, it would go down eventually, as all business does in the long run.

The real failure is not picking up new business along the way. With the N64 they showed they could design a consumer product. But outside of that they were in no future consoles. 3dfx and ArtX both came from former SGI. You don't need to stop selling workstations just because you make chips for consoles and other such devices. Nvidia barely survived and might not have if not for consoles. There are other markets where their expertise could have been applied.

Surviving changes like that often requires finding other markets. And then, when it comes to making hard choices, you need to cut the part of the business that is unprofitable. But this is really hard to do, and in some ways it goes against the 90s US corporate philosophy of focusing only on the 'core' business. DEC, for example, sold profitable business units that might have been helpful to have. DEC had a printer business and had the potential for a database business. Oracle calls RDB one of its best acquisitions.

vidarh
5 replies
11h50m

With the N64 they showed they could design a consumer product. But outside of that they were in no future consoles. 3dfx and ArtX both came from former SGI. You don't need to stop selling workstations just because you make chips for consoles and other such devices.

Commodore tried for a play where their newest-generation chipset - still much lower end - would have scaled (with the same chips) from being a full low-end computer, console, or set-top box (it had a PA-RISC core on chip, so it could run standalone), to a high-end graphics card for PCs, to the chipset for a higher-end Amiga, all at the same time.

They ran out of money - short maybe a few million - before being able to test if that would have worked as a way to widen their market.

I wish we'd gotten to see how that would have unfolded, though Commodore's deep management dysfunction probably still would have made it fail.

rbanffy
4 replies
6h37m

Legend says Commodore was at some point approached by Sun, which was willing to OEM their Amiga 3000/UX machines as entry-level Unix workstations and sell them through their distribution channels. The same accounts say Commodore management refused (maybe because the deal had a non-compete clause).

They also missed an (earlier) boat with the Commodore 900 workstation, which would run Mark Williams' Coherent OS (a Unix-like system).

pjmlp
2 replies
3h9m

Many Amiga engineers were UNIX heads, and even AmigaDOS/Workbench started as what might be seen as another take on UNIX-like systems with multimedia hardware, after they pivoted away from a games machine.

There are some vintage computer club talks where they dive into this.

rbanffy
1 replies
1h44m

Commodore could have used Coherent as the Amiga OS.

pjmlp
0 replies
54m

On the other hand, thankfully they didn't take that approach, as otherwise the Amiga wouldn't have been the multimedia machine it turned out to be, or had OS design decisions like its use of Libraries, Datatypes and everything being scriptable, which have yet to become mainstream.

With the exception of a few UNIX systems like Irix, Solaris with NeWS, NeXTSTEP, and Apollo, everything else tends to be the same deck of cards reshuffled.

vidarh
0 replies
3h42m

Commodore is one long history of management failures... As a big fan at the time, without insight into how dysfunctional it was, it was thoroughly depressing both to see from the outside, and then reading up on the internal chaos years later. It's a miracle they survived as long as they did, and at the same time there were so many fascinating lost opportunities.

ChuckMcM was at Sun at the time, and mentioned a while back he tried to get Sun to buy Commodore outright:

https://news.ycombinator.com/item?id=39585430 (his later replies in that sub-thread are also worth reading)

giantrobot
5 replies
13h17m

I think SGI failed to understand that there was a point where desktop PCs would be good enough to replace dedicated workstations.

I think the real problem here is that PC workstations, with Windows NT (later Linux) and an Intel chip, could do 90% of what the SGI workstations could for a fraction of the price. By workstation I mean an honest workstation with a high-end graphics card, 10k RPM SCSI drives, and hundreds of megs of RAM.

The price of SGI workstations was "call us", which translates to tens of thousands of dollars per workstation. PC workstations didn't and couldn't replace all uses of SGI workstations. What SGI was not able to handle was the fact that their customers suddenly had a viable option besides paying them tens of thousands of dollars for a workstation. Even their Visual Workstations weren't a real improvement cost-wise, as those were still overpriced compared to competitors' PC workstations.

smackeyacky
4 replies
12h21m

This is my recollection of the era as well. A quality PC like a Compaq was already a good alternative during the Motorola era. In a lot of ways the whole RISC thing was a dead end, as all that happened was SGI, IBM, HP and Sun cannibalising each other's sales.

ARM is the only one left standing from that era, which with hindsight seems so unlikely.

vidarh
1 replies
11h40m

Keep in mind that several of the others survived long past their public visibility. There were MIPS smartphones for a while. Both PPC and MIPS long outsold x86 in number of CPUs - just as low-margin licenses, with the bulk going into embedded uses, like ARM.

ARM had the advantage in that space of starting from the very low end, and being able to squeeze margins instead of being squeezed.

rbanffy
0 replies
6h42m

Don't forget IBM is still selling (and improving) their POWER family, running AIX (which had the low end eaten away by Linux) and IBMi (which is the only minicomputer family still standing). IBMi is extremely well defended as it's not even fully documented (unlike mainframes) and, therefore, doesn't even have a way to be emulated outside IBM. And I would not be surprised if that "secret sauce" were kept in an underground facility under a mountain in the middle of a desert, in a completely offline system behind multiple biometric locks.

ARM survived this long because it had a niche others couldn't break into (and still can't) as the highest performance per watt anywhere.

giantrobot
1 replies
2h36m

I think it would be more accurate to say the RISC workstation built entirely of custom low-volume components was a dead end. Even if SGI had decided to cut their margins to compete on cost, their lowest prices would still be way above a PC workstation. Compaq benefitted from economies of scale that SGI could not match.

RISC architectures live on today. Your house likely has dozens of MIPS chips in various components. You've got more ARM chips for sure but no shortage of other RISC chips in various components.

rbanffy
0 replies
1h41m

"Custom low volume components" is a marketing strategy. They could sell them to competitors and make money off the components. Late in its history SGI owned MIPS and sold their hardware to others. I have a Windows CE laptop running on a MIPS R4000 processor. It runs NetBSD with effort, but with dignity.

panick21_
0 replies
17h52m

IBM's market was also just far larger. SGI's sales were around $3 billion, Sun's were around 3x that, and IBM's were even more. SGI and Sun were Unix-based, so Linux could disrupt them far more easily than IBM's systems.

IBM also came into the game more vertically integrated. Having your own fab is expensive, but if you read about what Sun and SGI had to do in order to get chips, that route also wasn't great.

In the beginning there was a chance that Sun and SGI could have merged; the reason it didn't happen was mostly because leadership couldn't agree on who would lead the company. Between them they duplicated a lot of technology while sitting right next to each other: both doing their own RISC chips, Sun at times doing their own graphics cards, competing in 'low'-priced and mid-priced workstations, incompatible Unix developments, and competing against each other in the large SMP market. If they had been together, things could have been different - a larger install base and more investment into the architecture might have given them a better chance.

pfdietz
2 replies
17h2m

Arguably, the shareholders don't want the management to try to harden the company. If the shareholders wanted to invest in a new approach, they could do that on their own. Rather, the shareholders expect the company to squeeze what they can out of what they are, even if that means winding down the firm.

ghaff
0 replies
3h55m

Right. It's not actually in anyone's interest most of the time to deliberately destroy the company in the short term to have a glimmer of hope of maybe successfully reinventing it in the longer term. Shareholders always have the option of moving on and employees/management can probably collect paychecks for more time while they possibly look elsewhere. Corporate legal continuity is probably overrated.

Pet_Ant
0 replies
10h8m

Companies and management would rather have a slow, controlled descent into the ground than unexpected success. They balance their risk at the portfolio level, not at the company level.

During good times focus on your core and then peter out. Just follow a nice parabolic arc.

A lot of success is just timing, and attempting to fight all threats looks foolish. And remember, for every up-and-comer that "we all saw coming" there were lots that didn't make it. If you waste time fighting those, you've wasted money.

kjkjadksj
2 replies
2h6m

I wonder if any major companies have experimented with an internal "red team" startup? It would be on a level playing field with your upstart competitors, free from the existing hierarchy and structure to develop disruptive tech like any other startup, only you are bankrolling them, which gives them a head-and-shoulders boost relative to any other startup. Eventually you can let this startup grow to be more dominant than the old company and repeat the process with the next disruptive technology.

telesilla
0 replies
1h9m

I've seen this in several mid-size companies, but the trick to success is that you make a start-up in another vertical, so your existing customers aren't dropping their contracts for the cheaper thing you're disrupting yourself with. You're building a new business for the same market that already knows you, so you can capitalize on your brand, using revenue from a dying but profitable model.

JustLurking2022
0 replies
1h13m

As a thought experiment, let's say Facebook did this.

The red team designs a new streamlined look with less click baity posts and fewer ads. Users flock to it, abandoning the existing platform. The new platform isn't monetized as effectively as the old one so revenue drops by billions per year - maybe an order of magnitude more than the new product brings in. Shareholders demand an answer as to what management are doing about it.

There might be some golden path, but that typically relies on finding a new market that doesn't interfere with the existing business (e.g. AWS and Amazon). However, the far easier options are a) shut down the new service or b) increase its monetization, thereby starting the enshittification cycle.

bsder
1 replies
21h49m

"Maximally efficient is minimally robust."

A company that optimizes for efficiency will get stomped flat when the environment changes.

The problem is that there are no incentives in business to optimize for robustness.

roughly
0 replies
20h46m

Well, that’s partially because of the converse: a company that optimizes for robustness will get stomped flat before the environment changes to require robustness. Short term games are bad in the long term, but often good enough in the short term to win before the long term arrives.

throwaway4good
0 replies
21h57m

Everyone has been reading that book since the late 90s.

I remember a talk by Clayton Christensen talking specifically about Intel and how they set up the Celeron division to compete with themselves (based on his advice).

A key property of tech, in economics lingo, is that it consists of "natural monopolies" - all fixed cost and no variable cost.

This creates these winner-take-all games. In this case Intel, SGI and others all knew the rules, and it just ended up with Intel taking the prize and it all becoming Wintel for a decade or so - basically until the smartphone allowed enough capital to be accrued to challenge the old monopoly.

duxup
0 replies
13h23m

Even just changing simple behaviors across a large company can be impossible. I worked at a company with several thousand employees, all required to attend mandatory training on "effective meetings" - several hours of it, spread across thousands of employees. Rule 1 was to have an agenda for the meeting. This was something nobody did at this company, and you would regularly attend meetings unprepared.

After all that training, still nobody did it (ok I did and one other guy). That company couldn't change anything. It was amazing.

They had a project to make a department more proactive than reactive. The solution was to create a lot of bureaucracy around being proactive. As you can imagine, bureaucracy about being proactive was really just institutionalizing ... not being proactive.

I eventually left and work at a smaller company now. It's been refreshing for years now when we can decide "this process doesn't work, let's not do it anymore" and it just happens. Even just new coworker: "I won't be in tomorrow, who do I tell?", me: "you just did, have a great time off" seems revolutionary after being at a big company for so long.

I'm convinced that as the sheer number of humans increases, the friction to making real change in a company increases, and there's not much you can do. Fundamental change to respond to real disruption? Nigh impossible.

datavirtue
0 replies
12h53m

Does this apply to the US response to China's lead in battery and solar production? The Inflation Reduction Act is essentially a protectionist policy for manufacturing in response to Asian and Chinese tech manufacturing. I feel like we are trying to fight our way out of one of these situations. I'm concerned because every time I see something happening that excites me (the massive policy shift from enacting the IRA and the surge in American manufacturing), it usually means something really bad is about to happen.

Kamq
0 replies
15h9m

and could arguably be illegal in a publicly traded company ("best efforts to maximize/preserve shareholder value" etc).

There's no way the shareholders win this lawsuit. Violating fiduciary duty involves doing things you know won't provide value. Long-term planning would be a defense here.

The shareholders could absolutely oust those executives, though. And they may very well do so.

ghaff
17 replies
23h59m

Having worked a long time for a minicomputer company--which actually survived longer than most, mostly because of some storage innovations along with some high-end Unix initiatives--it's really hard. You can't really kick a huge existing business to the curb, or otherwise say we're going to largely start over.

Kodak was not actually in a position to be big in digital. And, of course, the digital camera manufacturers mostly got eclipsed by smartphones anyway a decade or so later.

aurizon
6 replies
22h34m

On the contrary, Kodak was well placed to do well by anticipating 'Moore's Law' as it pertained to sensor pixel density and sensitivity versus film. Film resolution was towards the end of intense development in pixel terms - not much further to go. They had pioneering patents, and ongoing R&D would have enabled a long period of dominance during the transition and to this day!! The board and scientists were asleep on a mountain of cash, and they sold their future for a few crumbs left for shareholders after bankruptcy. Blackberry did much the same with fewer excuses. I met with some board members of Kodak in the 80's and they were like old English gentlemen - long on pomp and procedure, but they wore blinders and a vision bypass - TRIH.

ghaff
3 replies
21h41m

Kodak was essentially a chemical company at one point. They even spun off an actual chemical company. Kodak could probably have played a better hand, even though they did do before-their-time things like PhotoCD. But they could have been Apple or maybe Instagram? That's a stretch.

I'm not a particular Kodak apologist but suggesting that a company should have been able to anticipate and correct for their business collapsing by 90% in a decade or so seems to need a lot of particulars.

xcv123
2 replies
21h27m

But they could have been Apple? That's a stretch.

They could have been a Sony. The iPhone camera sensor is made by Sony.

ghaff
1 replies
21h13m

And Sony has certainly had rough patches too. And that's for a company coming from an electronics manufacturer angle.

Kodak could have spun off a consumer electronics or semiconductor manufacturing company. But it's not clear why that is actually a better model than someone else just spinning up similar entities.

I don't need all the chemical engineers and a lot of other people connected with the old business anyway. And I'm sure not turning them into semiconductor experts.

So you're one of the 10% of employees in HR who snuck through to the other side. Is that really a big deal?

xcv123
0 replies
19h20m

That's right. The chief executives and the HR lady basically get transferred over to a new startup funded with Kodak's money and everyone else is fired.

ianburrell
1 replies
19h32m

Kodak did fine in the transition to digital. They made some popular compact cameras and tried to make DSLRs. They were wiped out by compact cameras being killed by smartphones. The survivors are the old camera makers like Canon and Nikon that have ecosystems. The other big survivor is Sony, which bought a camera company and makes most camera sensors.

Fuji is interesting; they weren't that successful with their first digital cameras, but now have some interesting mirrorless ones. They still make film.

ghaff
0 replies
18h52m

Fujifilm is a much smaller company than Kodak was. They also applied a lot of their expertise in emulsions to medical applications.

And, yes, they have some interesting if somewhat niche cameras.

maire
4 replies
21h44m

Kodak was well aware of what was going to happen. Company culture killed digital photography.

I was at Apple when we worked with engineers from Kodak who were working to change various format standards to allow digital photos. This was in the late 1980s or early 1990s.

ghaff
3 replies
21h34m

But, from the perspective of today, Kodak would have had to basically eclipse Apple.

Even displacing the big Japanese camera manufacturers, who by then had dominated high-end photography, would have required reversing decades of a shift away from high-end cameras like the Retina line.

I don't doubt there was company DNA against digital photography but it's not like non-smartphone photography, especially beyond relatively niche pro/prosumer level, has had such a good run recently either.

nradov
2 replies
20h32m

There is still a lot of business opportunity in supplying image sensors and lenses to smartphones.

chiefgeek
1 replies
20h5m

But it is nowhere near as profitable as the 35mm film system was.

ghaff
0 replies
18h56m

The 35mm system was a huge consumables business down through the food chain. That basically doesn't exist with digital. (Aside from ink jet supplies and I'm not sure how true even that is any longer.)

loloquwowndueo
4 replies
23h50m

Data General?

ghaff
3 replies
23h38m

Yes. CLARiiON eventually enabled a sale to EMC (which arguably saved EMC for a time), and the Unix business (especially NUMA servers) was a sufficient revenue producer for a while to keep the lights on. ThinLiiNe (or whatever the capitalization was) never went anywhere, but neither did a lot of things in the dot-com era.

loloquwowndueo
1 replies
23h29m

I knew it :) thanks for confirming! And for sharing.

ghaff
0 replies
23h17m

I was the PM for a bunch of the minicomputers from the mid-80s on. Then I was PM for the initial Unix AViiONs and later the NUMA servers including being one of the main liaisons with CLARiiON.

panick21_
0 replies
17h11m

Seems like Sun really shot itself in the foot by not buying Data General. A Unix storage business would have fit pretty well with their larger datacenter portfolio, and it would have been a start to a real x86 business.

I just finished reading 'LIFE UNDER THE SUN: My 20-Year Journey at Sun Microsystems', which talks about Sun and the storage business a bit. Sun was never happy with how well they did in storage. Sun was part of defining Fibre Channel.

For some reason that still doesn't make sense to me, they bought StorageTek for an incredible $4 billion, at a time when Sun wasn't exactly flying high. The explanation from the CEO given in the book mentioned above is baffling.

Edit:

Seems they bought:

1999: MAXSTRAT Corporation, a company in Milpitas, California selling Fibre Channel storage servers.

Never heard of them. Not sure how comparable it was.

hintymad
10 replies
21h25m

I remember Clayton Christensen mentioned that Andy Grove invited him to Intel to talk about how to deal with the dilemma, interrupted Christensen while he was talking, and said something like "I know the problem, and I need you to tell me the solution". Similarly, Peter Drucker repeatedly mentioned that one of the biggest challenges in business is "killing the cash cow". Along that line, Netflix's Reed Hastings is really amazing. He somehow managed to kill the DVD business, milking it to build the streaming business, when almost everyone in the industry and some of his lieutenants at Netflix didn't believe him.

specialist
6 replies
19h33m

Yes and:

All these years later, while the innovator's dilemma thesis describes the what, there's still little treatment of the why and how.

I keep wanting someone to account for the roles of investment and finance.

Amazon's innovation was lower cost of capital. They convinced investors to wait for returns. And they got a massive tax holiday. (How could they not succeed?)

Ditto Tesla, with its savvy moves like govt loans, prepurchases, tax incentives, and selling direct.

That cheap capital was necessary, but not sufficient. Both still had to create products customers wanted.

I keep coming back to Apple. How'd Apple avoid the trap, despite their terrible position? My guess is better financial strategy (if that's the right phrase). Apple focused on margins (and monopsony) instead of market share. And then leveraged their war chest to implement monopsony.

microtherion
2 replies
17h59m

Apple also did not hesitate to kill their cash cow, supplanting the iPod with the iPhone (an observation made, I think, by Horace Dediu).

varjag
1 replies
10h25m

The iPod was killed quite late into the iPhone era, when sales had plummeted to unsustainable levels.

Pet_Ant
0 replies
9h55m

By “killed” GP means “cannibalised” not “cancelled”. They out-competed the iPod while it was still hot.

system7rocks
0 replies
17h34m

Apple also recognized that one of its big problems was marketing to the end user, in particular in big-box stores where not many people knew and appreciated what Macs could do. Creating their own stores created innovation and an experience, rather than less-than-knowledgeable staff and price-tag comparisons against the latest Dell PC. That meant letting go of some of the traditional stores and the chain of legacy Mac resellers. Of course, now, you can get Macs at Costco.

panick21_
0 replies
16h55m

Investors only 'wait' if you can show revenue growth and a large market.

Tesla was very capital efficient compared to how hard the task was, raising only just enough to get the next product out. By the time they were making big losses, you could see the Model 3 was going to turn things around once they reached scale. There was always a clear story of "if we do X next, that means Y".

I keep coming back to Apple. How'd Apple avoid the trap?

I think by changing CEO radically at the right time. Whatever was going to work for Apple, it wasn't any of the typical things you would think of. I'm not at all a Jobs fan, but I don't think many people could have turned it around.

Jobs had the ability to come back and radically shift the company. They also had the benefit of 20 years of loyal customers; maybe the biggest asset Apple had was their loyal customer base (or captured market). Personally I can say my dad was a complete Apple fan, he would just buy Apple. He simply didn't care about the higher performance or any of the other things.

Spooky23
0 replies
16h48m

Apple’s limited SKUs let them maximize their economies of scale. When Jobs took over, despite teeny market share, products like iMac and iBook were #1 in shipments, compared to Dell and their 300 models.

froonly
2 replies
19h52m

For a while you could view Netflix online and rent DVDs from them.

RandallBrown
1 replies
19h43m

Oh dang, I thought you still could. Looks like they shut down the DVD rentals about 6 months ago.

chiph
0 replies
18h3m

There are now a couple of smaller disc-by-mail startups who aim to serve that market. I've signed up with DVDInBox out of Florida and they've done a good job thus far.

From what I've seen their biggest current challenge is their mailer. Netflix spent a lot of time designing their signature red envelopes, working with the USPS on ensuring they would survive a trip through the postal sorting machines. DVDInBox has yet to reproduce it - their envelopes sometimes arrive fairly mangled, or have not arrived at all (lost to the bowels of the machines).

foobarian
8 replies
23h59m

I think it's a near impossible situation - the status quo is literally something that should not exist given the new market realities. Pivoting is pretty much asking a company to commit seppuku - asking the layers of leadership to basically replace themselves and quit in many cases. Which is pretty much what happens anyway.

bunderbunder
4 replies
23h39m

And, just like every Unix workstation vendor of the 1990s, they got hit with a perfect storm. They had their hardware being disrupted by x86 really coming into its own as a viable option for higher-end computing at the exact same time that Linux was becoming a serious option for the operating system.

"Literally something that should not exist" is the perfect way of putting it. In 1990, lots of people needed boutique workstation vendors. In 2000, nobody did.

rbanffy
2 replies
20h1m

Even Apple, which became the unlikely last of the Unix workstation vendors, was disrupted by Intel (and moved from PowerPC to x86 for a while). Ironically, Apple is now the very last Unix workstation vendor in existence.

grumpyprole
1 replies
10h57m

Yes, although they have clearly not always been committed to the Mac Pro over the years, it still exists and the pricing is certainly workstation like.

rbanffy
0 replies
6h52m

Every Mac is a Unix workstation these days. I work on a Macbook Pro and spend an inordinate amount of time running Unixy stuff in the terminal.

bunabhucan
0 replies
23h26m

It's worse than that. Instead of "nobody" it was conservative, slow-moving, vendor-locked clients that could convince you to keep selling at those prices ("look at the margins!") instead of focusing on alternatives. I remember $5,000+ PCs being considered "cheap" workstations when those clients finally migrated.

neuralRiot
0 replies
21h19m

Probably the lack of vision is not just failing to turn in the direction of new "products" but also failing to acquire, digest and eliminate those businesses that start to grow, before they're too big. See Microchip for example - how many relatively small semiconductor and technology manufacturers they have already eaten.

ghaff
0 replies
23h41m

And, at some point, what does it matter if the leadership and most of the employees turn over, typically involuntarily?

Is there any significance really to Foot Locker basically being a reorganized Woolworth's as opposed to being a brand-new company?

If you're big enough and have some product lines that still bring in a lot of money and didn't totally collapse like IBM you can sometimes pull it off. But it's hard.

AlbertCory
0 replies
23h17m

Pivoting is pretty much asking a company to commit seppuku

This is conventional wisdom (and thus, usually correct).

However, it's always interesting to look at counterexamples: Beretta, for example (in business for 500 years).

https://www.albertcory.io/lets-do-have-hindsight

or the IBM PC, which cannibalized IBM's business, at least in IBM's mind. Thus, they screwed it up and let Wintel make the real billions. So it worked, until they took your advice and decided that had to stop.

Animats
7 replies
23h32m

They sort of tried. Around then they had a Windows NT machine that cost around US$12,000. But it was too late. The first serious graphics cards for PCs were appearing, from Matrox and others, with prices of a few thousand dollars.

(I tried some early NT graphics cards on a Pentium Pro machine. This was before gamer GPUs; these were pro cards from tiny operations. Fujitsu tried unsuccessfully to get into that business, with a small business unit in Silicon Valley. At one point they loaned me a Fujitsu Sapphire graphics card prototype. When I went back to their office to return it, the office had closed.)

Also, there was a bad real estate deal. SGI owned a lot of land where Google HQ is now. They sold it to Goldman Sachs in a sale and lease-back transaction, selling at the bottom of the market. That land, the area north of US 101 in Mountain View had, and has, a special property tax break. It's the "Shoreline Regional Park Community", set up in 1969. The area used to be a dump. Those hills near Google HQ are piles of trash. So there was a tax deal to get companies to locate there. That made the land especially valuable.

msisk6
6 replies
23h9m

SGI tried its hand at the PC video card business as early as 1990. I was at Autodesk at the time and got one of these to beta test on a DOS 486 running AutoCAD. It was an impressive product. But huge; it took up two full-length ISA slots. And the display drivers were a bit buggy.

rbanffy
2 replies
19h59m

I wish they had ported IRIX to x86. You can make more money by making stuff for Windows, but it won't protect you from market erosion.

nradov
1 replies
18h50m

I doubt that would have helped. Customers didn't love Irix as an OS. They tolerated it in order to use SGI's superior hardware and specialized applications.

Competitors such as Sun did port their proprietary Unix flavors to the x86 PC platform but never achieved traction. It was impossible to compete with free Linux, and lack of device drivers was always an obstacle.

rbanffy
0 replies
6h53m

Kind of. Oxide uses what's left of OpenSolaris as their OS of choice and Oracle still sells Solaris SPARC servers. It's the retreat up - they can defend their hill for a long time that way and, in Oracle's case, they don't even innovate the same way IBM does with POWER and Z.

Y_Y
1 replies
21h29m

Sounds just like Nvidia in 2024

theandrewbailey
0 replies
17h20m

Except Nvidia's modern cards are even bigger.

jiggawatts
6 replies
20h39m

In the late 90s I was in the last year of high school. Silicon Graphics came to do a demo of their hardware for students that were interested in taking a computer science course at university in the following year.

The graphics demos looked like trash, basically just untextured and badly shaded plain colored objects rotating on the screen. For reference I was playing Quake III around the time which had detailed textures and dynamic lighting.

I asked the SGI presenter what one of his Indigo workstations cost. He said $40,000, not including the graphics card! That’s extra.

I laughed in his face and walked out.

dekhn
5 replies
20h16m

In the late 90s, SGI demos were much more impressive than what you describe. It was used by technical folks to do real stuff, with stringent criteria.

More importantly, the things that made Quake III so great were state-of-the-art for gaming. But those things couldn't render lines quickly and well (a mainstay of CAD at the time), or render at very high resolution (which IIRC was 1280x1024 in that era).

Here's what Carmack said about the SGIs a few years before: """SGI Infinite reality: ($100000+) Fill rate from hell. Polygons from hell. If you don’t trip up on state changes, nothing will come within shouting distance of this system. You would expect that.""" SGI was also key for map builds before PCs were capable.

But yes, 1999-2000 was just around the cusp of when SGI went from "amazing" to "meh".

jiggawatts
3 replies
19h6m

If I remember correctly, their cards had a high compute rate even at double precision, but had tiny memory buffers and basically couldn’t do texturing at all in practice.

It turned out that double precision was a mistake that was sold as a “professional” feature. By sharing edges correctly and using the correct rounding modes, single precision provides pixel-perfect rendering. Efficiencies like this allowed the consumer GPUs to run circles around SGI hardware.
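
To make that last point concrete, here's a toy sketch of the general technique (an assumed example, not SGI's or any vendor's actual pipeline): two triangles share an edge, both evaluate the same exact integer edge functions, and a consistent tie-breaking rule decides who owns pixels that land exactly on the edge, so nothing along the seam is drawn twice. Plain integer coordinates are used for brevity; a real rasterizer would first snap single-precision vertices to sub-pixel fixed point.

    #include <stdio.h>

    typedef struct { int x, y; } Pt;

    /* Edge function: twice the signed area of (a, b, c). Exact in integers. */
    static int edge(Pt a, Pt b, Pt c) {
        return (b.x - a.x) * (c.y - a.y) - (b.y - a.y) * (c.x - a.x);
    }

    /* Tie-break rule for pixels lying exactly on an edge: the edge "owns" them
     * if it points upward, or leftward when horizontal. Two triangles traverse
     * a shared edge in opposite directions, so exactly one of them owns it. */
    static int owns_edge(Pt a, Pt b) {
        return (a.y == b.y && b.x < a.x) || (b.y < a.y);
    }

    /* Rasterize a positively wound triangle into an 8x16 character grid. */
    static void raster(Pt v0, Pt v1, Pt v2, char id, char grid[8][16]) {
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 16; x++) {
                Pt p = { x, y };
                int w0 = edge(v1, v2, p);
                int w1 = edge(v2, v0, p);
                int w2 = edge(v0, v1, p);
                int in = (w0 > 0 || (w0 == 0 && owns_edge(v1, v2)))
                      && (w1 > 0 || (w1 == 0 && owns_edge(v2, v0)))
                      && (w2 > 0 || (w2 == 0 && owns_edge(v0, v1)));
                if (in) grid[y][x] = id;
            }
        }
    }

    int main(void) {
        char grid[8][16];
        for (int y = 0; y < 8; y++)
            for (int x = 0; x < 16; x++)
                grid[y][x] = '.';

        /* Two triangles splitting the rectangle (1,1)-(13,7) along its diagonal. */
        Pt a = {1, 1}, b = {13, 7}, c = {1, 7}, d = {13, 1};
        raster(a, b, c, 'A', grid);  /* triangle below the diagonal */
        raster(a, d, b, 'B', grid);  /* triangle above the diagonal */

        /* No pixel is ever claimed by both triangles, including on the seam. */
        for (int y = 0; y < 8; y++)
            printf("%.16s\n", grid[y]);
        return 0;
    }

Running it prints the split rectangle with each pixel owned by at most one triangle; the lattice points along the shared diagonal all end up in 'B', because only one of the two opposite traversals of that edge satisfies the rule.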

dekhn
2 replies
17h43m

They did texturing just fine.

jiggawatts
1 replies
16h51m

Most SGI graphics cards had no texture RAM at all, and had zero texturing capability. At the time there was one card that had 4 MB of texture RAM, but in the same year a typical PC GPU had 16 or 32 MB of shared memory, most of which was used for textures.

A “budget” SGI card cost more than my parents’ car. I bought three different GPUs by that point with my pocket money.

dekhn
0 replies
1h35m

I honestly don't know what you're talking about - I wrote many programs that did high-performance texture mapping on SGIs, and they had both texturing capability and RAM. When you say "SGI card", it sounds like you're talking about something other than an Octane2 or InfiniteReality.

rbanffy
0 replies
19h55m

The curve that maps fucking around to finding out is not linear. By the time you start finding out, it's very hard to stop finding out much more than you would like to.

VelesDude
6 replies
22h46m

It was clear they were trying to do more consumer grade things, just look at the N64. Couldn't get more mainstream than that. Seeing how the graphics market ended up, it looks obvious from here but in the mid 90's it was still the wild west and everybody was throwing mud at the wall seeing what would stick.

I have never really said that they were "taken by surprise", but a part of it felt like (from the outside) management had been a little blinded by their past success and the profit margins from their workstations, combined with no clear path forward for the whole industry. Nvidia could have very easily been just a curiosity of the past, but they managed to strike it lucky standing on the shoulders of others.

If SGI had always been a company that could provide graphics workstations that worked with x86/Windows PCs early on, for example - maybe they would have fared better. They would have gone with the flow of technology at the time rather than fighting uphill, no matter the potential technical brilliance. But being saddled to their MIPS processors and custom OS meant that once people left, they almost never came back. One can have the best tech and still fail.

rbanffy
3 replies
20h3m

provide graphics workstations that worked with x86/Windows PCs

Intergraph started making PCs with high-end graphics at one point, when they abandoned CLIX and gave up on their (Fairchild's, really) Clipper processor. It didn't work for them either. SGI did their own "Visual Workstation" that ran Windows and had a Pentium, but that too was a huge disappointment.

coredog64
2 replies
13h53m

The 320 and 540 had a few nice things going for them: You could option them with the same 1600SW monitor that the higher-end O2 workstations used without having to use the fiddly multilink. SGI paid Microsoft for a HAL license and did a better job with multiprocessor than what you got from installing the vanilla version.

timc3
1 replies
11h58m

They had decent internal bandwidth, allowing them to play back uncompressed real-time standard-definition video, which normal PCs running Windows couldn't do at the time.

rbanffy
0 replies
6h32m

The moment your product runs Windows, it'll compete with thousands of others. Being able to do one single thing better than the others won't help you much unless that one single thing can drive all your sales.

Even for video rendering, if your box is twice as fast, it'll be outcompeted by machines that cost half as much or less. At times when my desktop PC was not fast enough, it was simpler to get another PC and run time-consuming things on it while I did other things on the first.

the_mitsuhiko
1 replies
22h32m

It was clear they were trying to do more consumer grade things, just look at the N64.

Yes, but the team that did that also left SGI, then worked directly with Nintendo on the GameCube and was acquired by ATI. I'm not sure how SGI managed to not support that effort within itself.

cuno
0 replies
20h22m

Nintendo wasn't loyal to the company, it was loyal to the team, so when they decided to leave and form ArtX they took the customer with them... SGI was happy with the Nintendo contract. They earned $1 in additional royalties for every single N64 cartridge sold worldwide. Losing the team was a big blow.

AnimalMuppet
6 replies
23h36m

Someone at SGI wrote a paper/web page/blog post titled "Pecked To Death By Ducks", claiming that x86 chips could never compete with SGI, and claiming to refute all the arguments that they could.

Then Intel introduced dual core (or maybe just two chips in one housing sharing a bus), and that generated a lot of buzz. So he wrote a follow-up titled "Pecked To Death By Ducks With Two Bills".

I don't recall, though, how the timing related to when everyone was asked to read The Innovator's Dilemma. But at least some of the time, there was a pretty deep denial (or at least a pretty deep effort to keep the customers in denial).

MichaelZuo
4 replies
23h3m

That’s really funny for some reason.

HeyLaughingBoy
3 replies
22h48m

IIRC, "Pecked to Death by Ducks" is the title of either a short (nonfiction) story or a book by Gerald Durrell, one of my favorite childhood authors.

jfk13
2 replies
20h34m

I don't recall that one, and I thought I knew Gerald Durrell's work pretty thoroughly. There is a book of that title by Tim Cahill, though; maybe that's what you're remembering?

HeyLaughingBoy
1 replies
19h58m

Yesss! Thank you. I'm embarrassed because I used to have that one, too :)

On the upside, I've never known anyone else who had even heard of Durrell.

bluenose69
0 replies
18h49m

I would have thought that lots of folks had watched the "The Durrells in Corfu" TV series (PBS Masterpiece Theatre) and then did some research on the family.

I don't know whether this is available online, but I can recommend it as a pleasant programme, with lovely scenery, interesting storylines, and engaging actors.

rbanffy
0 replies
19h58m

A bit like Seymour Cray's line about plowing a field with 1024 chickens instead of two oxen.

tambourine_man
3 replies
16h10m

I remember reading how Andy Grove, IIRC, spent two years trying to exit the DRAM business. He would flat out order people to stop working on it and they wouldn’t listen, believe or understand. The amount of inertia for large bodies is remarkable.

Pet_Ant
1 replies
9h39m

And yet there are stories of people ignoring the company direction, continuing work, and saving the company.

See: The Blue LED https://youtu.be/AF8d72mA41M?feature=shared

So it's not clear that the company knows better. Feels like educated guesses but a lot of luck involved.

tambourine_man
0 replies
6h49m

Very true

AtlasBarfed
0 replies
5h1m

Why was exiting the DRAM business such an obvious positive?

Intel is basically at its root PC hardware. Yes, it doesn't dominate the entire industry, but memory is pretty key to a PC and the OS that runs on it.

And fundamentally, it's another fab product, like chipsets.

hintymad
3 replies
18h44m

I guess an interesting question would be whether Nvidia is the SGI of the late 90s or the Intel of the 80s.

leetrout
0 replies
17h17m

I have been comparing them to SGI of the late 90s but the next 18 months will prove me right or wrong as Intel, Apple, Google and AMD try to compete.

geor9e
0 replies
16h38m

SGI was a pretty small 3D workstation company at the bottom of the S&P 500, even at its peak. Microsoft was at least a hundred times as big as SGI. (And Nvidia was even smaller, selling GPUs for both SGI workstations and Windows PCs at the time.) Now Nvidia is a titan at #5 in the S&P. They could certainly be taken down a notch or two as CUDA alternatives mature and people start buying cheaper AMD/Intel/FPGA (Groq, etc.) hardware. But they're the best at what they do, with the best people. They don't really have a doomed business model the way SGI's "take out a second mortgage for the Onyx workstation to make N64 games" business model was doomed. I don't buy Nvidia stock, personally, but I especially wouldn't bet against them long term either.

VelesDude
0 replies
11h11m

I get the vague feeling they are more like Intel in the 80's. Nvidia has had a real talent - partly through their own smart moves and partly through pure luck - for stumbling from one win to another. Seeing them fall into the crypto boom and then straight into the AI boom has been wild to see.

appstorelottery
3 replies
22h32m

I was making crazy money in the dot-com boom and bought a SGI 540 in 1999 (with an SGI monitor).

With money to burn, and SGI being a childhood brand - legends in 3D. Such wonderful memories. 15k on a desktop setup was loose change, though it shows how clueless I was back then. Still, I felt like I'd "arrived".

SGI with Windows NT - lol - I wrote my first OpenGL game in Visual Basic... I've always been somewhat of an outlier ;-) God help me.

The point? My personal experience says something about the strength of the SGI brand - even in the face of what was happening at the time (3DFX and so on - my previous company was one of the few 3DFX api devs - illustrating how clueless I was...)... it all happened so quickly... I'm not surprised SGI couldn't respond - or more importantly understand the strength of Microsoft/OpenGL/DirectX in the boiling pot of 3DFX / Nvidia and the rest... From memory it took three years and SGI was done - shared memory architecture? No longer worth the cost. :-(

Looking back, I was such a kid - a complete fool.

Greybeard advice: bet on the monopoly. Be smart. Brands like SGI are nothing in the face of install base. Think about how crazy it was to spend 15k on a desktop SGI back then... nostalgia is insanity, vanity.

lizknope
1 replies
19h49m

Yeah but it still sounds really cool!

From 1991 when I first saw SunOS I wanted a SPARCstation. I started college in 1993 and the school was full of DEC Ultrix machines, Suns, HP PA-RISC, and a handful of IBM RS/6000 and SGIs.

I just thought DOS/Windows PCs were such garbage. Single user, no preemptive multitasking, no memory protection. Then Linux came out and it changed everything. I bought a PC just to run Linux. My dream of a Unix RISC workstation faded away.

My roommate in 1996 bought a DEC Alpha. Not the cheaper Multia but an Alpha that could run OSF/1 Digital Unix. He actually ran NetBSD on it.

In 1997 I took the computer graphics class and we used SGIs. There was just one lab of them reserved for that class and grad students. I was so excited and it was really cool but I didn't think I could ever afford one. It's still really cool though that you had one.

panick21_
0 replies
16h52m

Linux on RISC-V ... It's happening ... eventually ... probably.

VelesDude
0 replies
10h58m

The one thing that I always have to point out to folks who didn't live through it: the pace of change in everything was so rapid, especially in the graphics space.

It is wild to think that in games for instance, we went from Quake in 1996 running software rendering to Quake 3 requiring a GPU only 3 years later and that becoming the standard in a matter of months.

mcculley
2 replies
10h57m

I worked a lot with SGI hardware and IRIX in the 90s and very early 2000s. I ported a lot of code to and from Linux in that time.

I remember trying to explain to SGI reps that while we loved the SGI hardware, Linux on commodity x86 was the increasingly more effective choice for our projects. We wanted to buy SGI x86 boxes but they were pushing NT.

It was very apparent that SGI salesmen knew which way the wind was blowing, and they were milking the last bits of the commission gravy train.

Even when everyone understands “The Innovator’s Dilemma”, the incentives of salesmen and shareholders can be to optimize for extracting value out of the dying company. If I am a shareholder in a company being out innovated, it might make sense to want them to maximize short term profits while reallocating capital to the innovators instead of trying to improve the dying company.

Pet_Ant
1 replies
10h0m

Seems like the issue is a shift from high-margin, sales-representative-led growth to a low-margin mass market where you don't need salespeople to convince you to buy. Then there is no way to justify the sales team, so they don't support it. The company is beholden to them, and they sabotage the move to the mass market.

mcculley
0 replies
8h35m

That is certainly one case, maybe the most likely. But in another case I was involved with, sales just didn't understand the innovation. They could have potentially made just as much or more money with the new thing, but were too comfortable selling the old thing.

voidmain0001
1 replies
9h43m

Could this eventuality happen to Google, Nvidia, or Apple?

infinite8s
0 replies
1h37m

Why not? Google's only been a powerhouse for the last decade or so.

bombcar
1 replies
18h34m

The way to survive is to eat your own lunch. Be the low-cost competitor and cannibalize your own market.

Otherwise you don’t build the iPhone because you don’t want to lose iPod sales.

VelesDude
0 replies
10h48m

A not so obvious but similar example of this would be Nintendo and the Switch. They had always kept their handhelds and consoles as separate things but eventually decided to just combine them into a single thing. So instead of double dipping on the hardware and potentially the software sales, it just became a single entity. And to say that has worked out well for them is an understatement. It has sold 139 million units and they are still taking their time getting to a successor. It will probably end up the #1 highest-selling games machine of all time, and they did it by going against their old business practices.

blackoil
1 replies
20h9m

I believe having a talented, dictatorial leader at the top may be the only solution. Like Steve Jobs, Bill Gates or Jeff Bezos. Once they believe in a path, they have methods to get it done. The Internet Tidal Wave memo is a good example of it. Zuckerberg is able to invest hundreds of billions on a future he believes in.

Obviously the observation has a confirmation bias.

pixelfarmer
0 replies
11h12m

Companies, through their usual hierarchical structuring, are authoritarian by nature. However, many leaders are not actual leaders, they are just bureaucrats, and that is the reason why things get stuck.

yndoendo
0 replies
23h39m

When statements like "X was better than Y" come up, I always think of "All models are wrong, but some are useful", from statistician George E. P. Box, and rephrase it as "All models are flawed, but some are useful" so that "model" becomes ambiguous - a smartphone, TV, computer, car, programming language, programming design pattern, social media platform, and so on.

Price-point-wise, SGI technology was a financially flawed model for the growing market, even if it was more useful than the flawed performance of the low-cost technology market.

Did anyone at SGI try to simply buy the low-tech products, play with them a bit, and see about slowly integrating their own tech to make that low-tech product just a little better than the competition and cost-effective for the market?

szundi
0 replies
1d

Thanks for this comment, very much appreciated.

hulitu
0 replies
8h25m

The point being that the company was extremely aware of what was happening. It was not taken by surprise. But in spite of that, it was still unable to respond.

They had the best processor on the market, yet they decided to sell Intel and Windows. I really don't understand what they were smoking.

foobiekr
0 replies
20h58m

I was at one of SGI's competitors and we had teams doing cheap HW - ATi and other cards, like IBM's GPUs at the time - and yet the company as a whole was like "ALL GOOD LET'S KEEP BUILDING MASSIVE CUSTOM GRAPHICS SYSTEMS!"

They were as dead as SGI in the same timeframe.

dclowd9901
0 replies
12h14m

Can you guess why? Was it maybe outmoded or outdated sales models that didn’t take reasonable competition into account?

I’m so sick of dancing around this topic. Managers and business types destroy companies. It never stops. Company after company. How many times do we have to pretend this isn’t the case before we see the truth?

Innovators make things worth selling. Business types drop it on the floor.

bigboy12
0 replies
18h1m

Walking the streets of NYC 19 years ago, I saw an SGI Indigo on the street and sadly I just kept walking on. Like wow.

Octokiddie
0 replies
16h28m

It was not taken by surprise. But in spite of that, it was still unable to respond.

This is even a major point of discussion in the book. The incumbents always see it coming a mile away. They can't respond because doing so breaks their business model. Employees can't go to managers and say, "We need to enter this low-margin market for low-end products." Doing so is a good way to get sidelined or fired.

The "dilemma" is that either traditional option, compete with the upstarts or move further up-market, leads to the same outcome - death.

tombert
95 replies
1d1h

There are a few cases in the history of computers where it feels like the world just "chose wrong". One example is the Amiga; the Amiga really was better than anything Apple or Microsoft/IBM was doing at the time, but for market-force reasons that depress me, Commodore isn't the "Apple" of today.

Similarly, it feels like Silicon Graphics is a case where they really should have become more standard. Now, unlike Amiga, they were too expensive to catch on with regular consumers, but I feel like they should have become and stayed the "standard" for workstation computers.

Irix was a really cool OS, and 4Dwm was pretty nice to use and play with. It makes me sad that they were beaten by Apple.

snakeyjake
13 replies
1d

One example is the Amiga; the Amiga really was better than anything Apple or Microsoft/IBM was doing at the time

Amiga was only better 1985-1988.

I still have my original Amiga and A2000. I was an Amiga user for a decade. They were very good. I was platform agnostic, caring only to get work done as quickly and easily as possible so I was also an early Macintosh user as well as Sun and PA-RISC. And yes, I still have all of those dinosaurs too.

By 1987 PC and Mac caught up and never looked back.

But by 1988 the PS/2 with a 386 and VGA was out and the A2000 was shipping with a 7MHz 68000 and ECS.

By 1990 the 486s were on the market and Macs were shipping with faster 030s and could be equipped with NuBUS graphics cards that made Amiga graphics modes look like decelerated CGA.

After the A2000 the writing was on the wall.

Note: my perspective is of someone who has always used computers to do work, with ALMOST no care for video games so all of the blitter magic of Amiga was irrelevant to me. That being said when DOOM came out I bought a PC and rarely used my Amigas again.

What I can confidently assert is that I upgraded my A2000 many times and ran into the absolute configuration nightmare that is the Amiga architecture and the problems with grafting upgrades onto a complex system with multiple tiers of RAM and close OS integration with custom chips.

One more bit of heresy is that I always considered Sun's platform to be superior to SGI's.

logicprog
5 replies
1d

Amiga was only better 1985-1988. By 1987 PC and Mac caught up and never looked back.

Oh indubitably! I don't think even the most committed Amiga fan, even the ones that speculate about alternate histories, would deny that at all.

The thing is, though, that only happened because Commodore essentially decided that, with so much of a head start, it could rest on its laurels and not really innovate or improve anything substantially, instead of constantly pushing forward like all of its competitors did. Eventually the linear or even exponential improvement curves of the other hardware manufacturers outpaced its essentially flat one. So it doesn't seem like IBM PCs and eventually even Macs outpacing the power of Amiga hardware was inevitable or inherent from the start.

If they had instead continued to push their lead (for instance, by actually sticking with the advanced Amiga chips they were working on before the project was canceled and replaced with ECS), I can certainly see them keeping up with other hardware, eventually transitioning from 2D to 3D acceleration chips when that shift happened in the console world, perhaps even making the Amiga line the first workstation line to have GPUs, further cementing their lead while maintaining everything that made the Amiga great.

Speculating even further: as we are currently seeing with the Apple M-series, a computer architecture composed of a ton of custom-made, special-purpose chips is actually an extremely effective way of doing things. What if the Amiga still existed in this day and age and had a head start in that direction, so that a platform with a history of being extremely open, well documented, and extensible had been the first to do this kind of architecture, instead of it being Apple?

Of course, there may have been fundamental technical flaws in the Amiga approach that made it unable to keep up with other hardware even if Commodore had had the will; I have seen some decent arguments to that effect, namely that since it used custom vendor-specific hardware instead of commodity hardware shared by everyone, it couldn't take advantage of cross-vendor compatibility like IBM PCs could, and also couldn't take advantage of economies of scale like Intel could. But who knows!

pjmlp
2 replies
1d

From retrogaming talks by former Commodore engineers, the issues were more political and managerial than purely technical.

logicprog
0 replies
23h46m

That's definitely how it seems to me, which is why I focused on Commodore's poor management decisions first and only mentioned the possible technical issues second.

AnimalMuppet
0 replies
23h18m

That's kind of typical, though, isn't it? When a company falls off, it's almost always not just technical.

panick21_
1 replies
9h16m

The thing with Commodore was that as a company it was just totally dysfunctional. They basically did little useful development between the C64 and the Amiga (the Amiga being mostly not their development). The Amiga didn't sell very well, especially in the US.

The company was going to shit after the Amiga launched; it took a competent manager to save the company and turn the Amiga around into a moderate success.

Commodore didn't really have the money to keep up chip development. They had their own fab, and they would have needed to upgrade that as well, or drop it somehow.

Another example of that is the Acorn Archimedes. Absolutely fucking incredible hardware for the price, crushing everything in price vs. performance. But... it literally launched with a de-novo operating system with zero applications. And it was a small company in Britain.

The dream scenario is for Sun to realize that they should build a low-cost, all-custom-chip device. They had the margin on the higher-end business to support such a development for 2-3 generations and to get most software ported to it. They also had the software skill to build the hardware/software in a way that would allow future upgrades.

logicprog
0 replies
5h36m

Imagining Sun buying Amiga and making it a lower end consumer workstation to pair with its higher end ones, with all the much-needed resources and interesting software that would have brought to the Amiga is a really cool thought experiment!

geophile
3 replies
23h57m

I was similar, not really interested in graphics, just a nice programming environment. PCs had that stupid segmented address space (which was not ignorable at the programming language level), expensive tools, and crappy OSes. My Amiga 2000 had a flat address space, a nice C development environment, and multitasking actually worked. It really was ahead of its time, in combining a workstation-like environment and an affordable price.
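
For anyone who never dealt with real mode: a minimal sketch, in modern standard C, of the 8086 segment:offset arithmetic that made segmentation impossible to ignore. The addresses are just illustrative; this is not period DOS code, and the "far"/"near" pointer keywords of the old compilers aren't shown.

    /* Real-mode 8086: physical address = segment * 16 + offset, so many
       different segment:offset pairs alias the same byte. This aliasing is
       one reason pointer arithmetic and comparison leaked "far" vs "near"
       distinctions into the C dialects of the day. */
    #include <stdio.h>

    static unsigned long phys(unsigned seg, unsigned off)
    {
        return ((unsigned long)seg << 4) + off;   /* segment * 16 + offset */
    }

    int main(void)
    {
        printf("%05lX\n", phys(0xB800, 0x0000));  /* 0xB8000: color text-mode video RAM */
        printf("%05lX\n", phys(0xB000, 0x8000));  /* the same physical byte, different pair */
        return 0;
    }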

snakeyjake
2 replies
23h41m

My Amiga 2000 had a flat address space

Chip ram, fast ram, cpu ram, expansion board ram, or slow ram? Did too much ram force your zorro card into the slooooooooooow ram address space (mine did)? Tough cookies bucko!

Macintosh, pounding on table: "RAM is RAM!"

logicprog
0 replies
23h35m

As someone trying to get into Amiga retrocomputing as a hobby in this day and age, I find keeping all the different types of RAM straight very confusing lol

geophile
0 replies
18h45m

This was a loooooong time ago. I have no clue.

pjmlp
0 replies
1d

It took until a bit after 1990, when 16-bit sound cards, Super VGA screens, and Windows 3.1 became widely adopted, for the PC to outperform the Amiga, especially at European price points.

My first PC was acquired in 1992, and still only had a lousy beeper, on a 386SX.

icedchai
0 replies
23h10m

I think you are mostly right, I just think your timing is off. Those early 386 machines and Mac II systems were very expensive, at least 2 to 3x the cost of an Amiga. The average home user wasn't going to drop $8K on a PS/2 model 80 with a 386/16.

By the early 90's the Amiga just wasn't competitive. The chip set barely evolved since 1985. ECS barely added anything over the original chip set. By around 1992 or 1993, 386 systems with SVGA and Soundblaster cards were cheap. Amiga AGA was too little, too late. Also consider the low end AGA system (Amiga 1200) was totally crippled with only 2 megs of slow "chip" RAM.

I was an Amiga fan until 1993. Started with an A500, then A3000. Eventually I moved on to a 486 clone w/Linux. Later on I had a Sun SparcStation 10 at home, so I agree with you on Sun and SGI.

dylan604
0 replies
23h26m

We kept our A2000 viable longer by adding the CPU board with the 030 chip. We went from 7MHz to somewhere around 40MHz or whatever. It meant that my Lightwave render went from 24 hours per frame to a few hours per frame.

qqtt
12 replies
1d

My main problem with Silicon Graphics (& I have the same problem with Sun Microsystems) is that they just tried to do too much in proprietary hardware and completely resisted standards. Microsoft & IBM "won" because they made computers with actual upgrade paths and operating systems with wide support across those upgrade paths. With SGI/Sun you were very much completely locked in to their hardware/software ecosystem and completely at the mercy of their pricing.

In this case, I think the market "chose right" - and the reason that the cheaper options won is because they were just better for the customer, better upgradability, better compatibility, and better competition among companies inside the ecosystems.

One of the most egregious things I point to when discussing SGI/Sun is how they were both so incredibly resistant to something as simple as the ATX/EATX standard for motherboard form factors. They just had to push their own form factors (which could vary widely from product to product) and allowed almost zero interoperability. This is just one small example but the attitude permeated both companies to the extent that it basically killed them.

thisislife2
7 replies
1d

With SGI/Sun you were very much completely locked in to their hardware/software ecosystem and completely at the mercy of their pricing.

How is that in any way different from Apple today, with its ARM SoCs, soldered SSDs, and an OS that requires "entitlements" from Apple to "unlock" features and develop on?

Gracana
4 replies
1d

You can buy a cheap Mac and easily write programs for it. You don't have to spend $40k on a computer, you don't have to buy a support contract, you don't have to buy developer tools.

fuzztester
3 replies
23h1m

You can buy a cheap Mac and easily write programs for it.

Interesting. How cheap? Never used Macs, only Windows and Unix and Linux.

icedchai
0 replies
21h55m

You can get a Mac Mini for $600-ish. Never get the base model though. (FYI, macOS is Unix.)

fuzztester
0 replies
13h38m

Thanks, guys.

cryptoxchange
0 replies
21h38m

Every time I’ve checked over the last decade (including today), you can buy a mac mini that supports the latest macOS for under $250 on ebay. You can also test your app using github actions for free if your use case fits in the free tier.

There is no way to do this for an IBM z16, which is the kind of vendor lock in that people are saying Apple doesn’t have.

mcculley
0 replies
1d

Are there entitlements or unlockable features other than when talking about App Store distribution?

CountHackulus
0 replies
22h52m

Thanks to web browsers and web apps it's not QUITE as bad of a lock-in nowadays. At least from a general consumer point of view.

dekhn
3 replies
1d

The big exception here is that SGI took IrisGL and made it into OpenGL, which as a standard lasted far longer than SGI. And OpenGL played a critical role in preventing MSFT from taking over the 3D graphics market with Direct3D.
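
As a reminder of what that standard looked like in practice, here is a minimal immediate-mode sketch in the OpenGL 1.x style that descended from IrisGL's begin/end conventions. It assumes a GLUT/freeglut installation (link with -lGL -lglut) and is purely illustrative, not SGI code.

    #include <GL/glut.h>

    /* Draw one colored triangle with the old fixed-function pipeline. */
    static void display(void)
    {
        glClear(GL_COLOR_BUFFER_BIT);
        glBegin(GL_TRIANGLES);
        glColor3f(1.0f, 0.0f, 0.0f); glVertex2f(-0.5f, -0.5f);
        glColor3f(0.0f, 1.0f, 0.0f); glVertex2f( 0.5f, -0.5f);
        glColor3f(0.0f, 0.0f, 1.0f); glVertex2f( 0.0f,  0.5f);
        glEnd();
        glFlush();
    }

    int main(int argc, char **argv)
    {
        glutInit(&argc, argv);
        glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB);
        glutCreateWindow("hello, OpenGL 1.x");
        glutDisplayFunc(display);
        glutMainLoop();
        return 0;
    }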

pjmlp
2 replies
23h56m

Except that OpenGL only mattered thanks to Carmack and id Software's mini-GL drivers.

It hardly matters nowadays for most game developers.

dekhn
1 replies
23h48m

When I say "hardware graphics market" I'm referring to high performance graphics workstations, not gaming. There is a whole multibillion dollar market there (probably much smaller than games, but still quite significant). It's unclear what Carmack's influence on the high performance graphics workstation environment is, because mini-GL left out all the details that mattered to high performance graphics (line rendering would be a good example).

In my opinion, Mesa played a more significant role because it first allowed people to port OpenGL software to run on software-only cheap systems running Linux, and later provided the framework for full OpenGL implementations coupled with hardware acceleration.

Of course, I still greatly enjoyed running Quake on Windows on my 3dfx card with OpenGL.

pjmlp
0 replies
23h42m

Well, put that way, it is a market that runs on Windows with OpenGL/DirectX nowadays, or, if using GNU/Linux, mostly with NVIDIA's proprietary drivers, especially when considering the VFX reference platform.

hn_throwaway_99
12 replies
1d1h

Similarly, it feels like Silicon Graphics is a case where they really should have become more standard. Now, unlike Amiga, they were too expensive to catch on with regular consumers, but I feel like they should have become and stayed the "standard" for workstation computers.

I think you highlighted very correctly there, though, why SGI lost. It turned out there were cheaper options, which while not on par with SGI workstations initially, just improved at a faster rate than SGI and eventually ended up with a much better cost/functionality profile. I feel like SGI just bet wrong. The article talks about how they acquired Cray, which were originally these awesome supercomputers. But it turned out supercomputers essentially got replaced by giant networks of much lower cost PCs.

cmrdporcupine
3 replies
1d

This betting wrong on specialization happened over and over again in the late 70s and 80s. The wave of improvements and price reduction in commodity PC hardware was insane, especially from the late 80s onwards. From Lisp machines to specialized graphics/CAD workstations, to "home computer" microcomputer systems, they all were buried because they mistakenly bet against Moore's law and economies of scale.

In 91 I was a dedicated Atari ST user convinced of the superiority of the 68k architecture, running a UUCP node off my hacked up ST. By the end of 92 I had a grey-box 486 running early releases of Linux and that was that. I used to fantasize over the photos and screenshots of workstations in the pages of UnixWorld and similar magazines... But then I could just dress my cheap 486 up to act like one and it was great.

kazinator
2 replies
23h6m

Atari ST and Intel PC are not distant categories. Both are "'home computer microcomputer' systems". Not all home computer systems can win, just like not all browsers can win, not all spreadsheets can win, not all ways of hooking up keyboards and mice to computers can win, ...

cmrdporcupine
1 replies
22h29m

They were distant on market tier but most importantly on economies of scale. The Intel PC market grew exponentially.

kazinator
0 replies
21h30m

Sure, but the economy of scale came from the success. The first IBM PC was a prototype wire-wrapped by hand on a large perf board.

When you switched to Intel in 1992, PC's had already existed since 1981. PC's didn't wipe out most other home computers overnight.

bunderbunder
3 replies
1d

Hypothesis:

What smaller businesses are using will tend to be what takes over in the future, just due to natural processes. When smaller businesses grow, they generally prefer to fund the concurrent growth of the existing vendors they like using rather than switch to the established "industrial-grade" vendor.

At the same time, larger organizations that can afford to start with the industrial-grade vendors are only as loyal as they are locked in.

01HNNWZ0MV43FF
1 replies
1d

I see the same trend in programming languages. Say a really solid career lasts from about 20 to 60, 40 years long. Say that halfway through your career, 20 years in, you're considered a respectable senior dev who gets to influence what languages companies hire for and build on.

So in 20 years, the current batch of senior devs will be retiring, and the current noobies will have become senior devs.

*Whatever language is easy to learn today will be a big deal in 20 years*

That's how PHP, Python, and JavaScript won. Since JavaScript got so much money poured on it to make it fast, secure, easy, with a big ecosystem, I say JS (or at least TS) will still be a big deal in 20 years.

The latest batch of languages know this, and that's why there are no big minimal languages. Rust comes with a good package manager, unit tester, linter, self-updater, etc., because a language with friction for noobies will simply die off.

One might ask how we got stuck with the languages of script kiddies and custom animated mouse cursors for websites. There's no other way it could turn out, that's just how people learn languages.

chuckadams
0 replies
22h50m

Back in the old days there was a glut of crappy bloated slow software written in BASIC. JS is the BASIC of the 21st century: you can write good software in it, but the low bar to entry means sifting through a lot of dross too.

My take: that’s just fine. Tightly crafted code is not a lost art, and is in fact getting easier to write these days. You’re just not forced into scrabbling for every last byte and cpu cycle anymore just to get acceptable results.

tombert
0 replies
1d

I mean, there are corporations who only sell to very large corporations and have had plenty of success doing so. Stuff like computational fluid dynamics software, for example, has a pretty-finite number of potential clients, and I don't think I could afford a license to ANSYS even if I wanted one [1], since it goes into the tens of thousands of dollars. I don't think there are a ton of startups using it.

But I think you're broadly right.

[1] Yes I know about OpenFOAM, I know I could use that if I really wanted.

tombert
1 replies
1d1h

Yeah, I'm more annoyed about Amiga than SGI. They were priced competitively with Apple and IBM offerings.

I guess it's just kind of impossible to predict the future. I don't think it's an incompetent decision to try and focus entirely on the workstation world; there are lots of businesses that make no attempt to market to consumers, and only market to large companies/organizations, since the way budgeting works with big companies is sort of categorically different than consumer budgets.

But you're absolutely right. Apple and Windows computers just kept getting better and better, faster and faster, and cheaper and cheaper, as did 3D modeling and video editing software for them. I mean, hell, as a 12 year old kid in 2003, I had both Lightwave 3D (student license) and Screenblast Movie Studio (now Vegas) running on my cheap, low-spec desktop computer, and it was running fast enough to be useful (at least for standard definition).

mike_hearn
0 replies
1d

Of course, the reason they got better so fast is volume. There was just way more investment into those platforms. Which means this explanation is somewhat circular: they were successful because they were successful.

I think a more useful explanation is that people rate the value of avoiding vendor lockin extraordinarily high, to the extent that people will happily pick worse technology if there's at least two competing vendors to choose from. The IBM PCs were not good, but for convoluted legal reasons related to screwups by IBM their tech became a competitive ecosystem. Bad for IBM, good for everyone else. Their competitors did not make that "mistake" and so became less preferred.

Microsoft won for a while despite being single vendor because the alternative was UNIX, which was at least sorta multi-vendor at the OS level, except that portability between UNIXen was ropey at best in the 90s and of course you traded software lockin for hardware lockin; not really an improvement. Combined with the much more expensive hardware, lack of gaming and terrible UI toolkits (of which Microsoft was the undisputed master in the 90s) and then later Linux, and that was goodbye to them.

Of course after a decade of the Windows monopoly everyone was looking for a way out and settled on abusing an interactive document format, as it was the nearest thing lying around that was a non-Microsoft specific way to display UI. And browsers were also a competitive ecosystem so a double win. HTML based UIs totally sucked for the end users, but .... multi-vendor is worth more than nice UI, so, it wins.

See also how Android wiped out every other mobile OS except iOS (nobody cares much about lockin for mobile apps, the value of them is just not high enough).

gspencley
1 replies
1d

I still dream of having a Beowulf Cluster of Crays.

One day ...

pjmlp
7 replies
1d

Yes, Irix is one of the few UNIX based OSes that I actually find cool.

hinkley
4 replies
23h50m

It certainly got fewer complaints than HP-UX.

fuzztester
2 replies
22h49m

What were the complaints that HP-UX used to get?

I used it for a while earlier at work, and don't remember many problems with it. One did have to apply OS patches fairly regularly to it, but IIRC, that process was somewhat smooth.

fuzztester
0 replies
13h38m

Okay, got it. :)

pjmlp
0 replies
23h40m

On HP-UX 10, back in 2000, the C compiler version I was using still wasn't fully ANSI C, and needed K&R C function declarations, but hey at least we had containers (HP Vault), and 64 bit file system access.

sys_64738
1 replies
23h5m

People are always passionate about various UNIX systems and their derivatives like Linux. Windows is so utilitarian.

pjmlp
0 replies
21h43m

Outside of Irix, Tru64, Apollo, Solaris with NeWS, and NeXTSTEP, all other UNIXes are pretty meh.

Regarding Windows, some time reading the excellent Windows Internals book series is recommended.

epcoa
7 replies
1d

the Amiga really was better than anything Apple or Microsoft/IBM was doing at the time

At the time. A brief moment in time, and then they had no path forward and were rapidly steamrolled. Nothing was "chosen wrong" in this aspect.

tombert
3 replies
1d

Well, wait, the Amiga had preemptive multitasking way before Apple or Windows got it, like the mid 80s. I don't think Windows got it until Windows NT, and it didn't become mainstream until Windows 95. Macs had bizarre cooperative multitasking that would freeze if you just thought about it funny [1] all the way until OS X.

There's other stuff too; they had better color graphics in the 80s while DOS was still dealing with CGA and EGA, and decent sound hardware. Even by 1990, the video toaster was released, well before it got any port to DOS.

[1] I'm sure it got better, my first exposure to it was System 7 and that thing was an unholy mess. I didn't touch macOS again until OS X.

epcoa
2 replies
1d

Long before Windows 95 there was DOOM and DOOM would not run on an Amiga.

80s while DOS was still dealing with CGA and EGA, and decent sound hardware.

And then the 80s ended. What point did I make that you are contradicting?

Even by 1990, the video toaster was released,

And if you wanted to do CAD? Would you use an Amiga? Probably not. What about desktop publishing? Pointing out that Amiga had carved out a niche (in video editing) when that was the norm back in those days doesn't make any strong comment about the long term superiority or viability of the platform.

Also, I don't buy into the idea that just because a company had something "superior" for a short period of time with no further company direction that they didn't lose fair and square. That Amiga had something cool in the 80s but didn't or couldn't evolve isn't because the market "chose wrong". Commodore as a company was such a piece of shit it made Apple of the 80s look well run. Suffering a few more years with the occasional bomb on System 7 was not a market failure.

Macs had bizarre cooperative multitasking

What was bizarre about it, compared to any other cooperative multitasking system of the time? Also you seem to be fixated on preemptive multitasking to the neglect of things like memory protection.

tombert
1 replies
23h32m

Long before Windows 95 there was DOOM and DOOM would not run on an Amiga.

Yeah fair. I do wonder if a port like the SNES version would have been possible if id would have greenlit it, but that's a "what if" universe. Alien Breed 3D would run on a 1200, but IIRC it ran pretty poorly on that.

And then the 80s ended. What point did I make that you are contradicting?

I mean, yes, VGA cards and Soundblaster cards were around in 1990, but they weren't really standard for several years later.

And if you wanted to do CAD? Would you use an Amiga? Probably not. What about desktop publishing? Pointing out that Amiga had carved out a niche (in video editing) when that was the norm back in those days doesn't make any strong comment about the long term superiority or viability of the platform.

Also fair. I'll acknowledge my view is a bit myopic, since I don't really do CAD or desktop publishing, but I do some occasional video editing, and I do think Amigas were quite impressive on that front. You're right in saying it was a "niche" though.

Commodore as a company was such a piece of shit it made Apple of the 80s look well run.

No argument here. Still think that the hardware was pretty cool though.

What was bizarre about it

I guess "bizarre" was the wrong word. It was just really really unstable, and System 7 would constantly freeze for seemingly no reason and I hated it.

Also you seem to be fixated on preemptive multitasking to the neglect of things like memory protection.

I feel like if Commodore had been competently run, they could have done work to get proper protected memory support, but again that's of course a "what if" universe that we can't really know for sure.

I guess what frustrates me is that it did genuinely feel like Commodore was really ahead of the curve. I think the fact that they had something pretty advanced like preemptive multitasking (edit: fixed typo) in the mid 80s was a solid core to build on, and I do kind of wish it had caught on and iterated. I see no reason why the Amiga couldn't have eventually gotten decent CAD and Desktop publishing software. I think Commodore didn't think they had to keep growing.

icedchai
0 replies
21h4m

The Amiga OS was designed in a way that made protected memory support basically impossible. Message passing was used everywhere. How did it work? One process ("task", technically) sent another a pointer to a small header with arbitrary data attached, which could contain anything, including other pointers. Processes would literally read and write each other's memory.
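
A rough sketch of that pattern in plain C. This is not the real exec.library API (the names here are invented for illustration); the point is that once a "message" is just a raw pointer into the sender's address space, per-task memory protection can't be retrofitted without breaking every existing program.

    #include <stdio.h>

    struct Message {
        struct Message *next;   /* queue link */
        void *payload;          /* raw pointer into the SENDER's memory */
        unsigned length;
    };

    static struct Message *queue;   /* stand-in for a shared message port */

    static void put_msg(struct Message *m) { m->next = queue; queue = m; }
    static struct Message *get_msg(void)
    {
        struct Message *m = queue;
        if (m) queue = m->next;
        return m;
    }

    int main(void)
    {
        char senders_buffer[32] = "hello from task A";
        struct Message m = { 0, senders_buffer, sizeof senders_buffer };

        put_msg(&m);                      /* "task A" sends its message */
        struct Message *got = get_msg();  /* "task B" receives it */
        ((char *)got->payload)[0] = 'H';  /* ...and writes straight into A's memory */
        printf("%s\n", senders_buffer);   /* prints "Hello from task A" */
        return 0;
    }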

sys_64738
1 replies
23h9m

Commodore's story is more about achieving the impossible with 1-2 engineers building each computer. Commodore was a company built around Jack Tramiel, who wanted his widgets to ship in volume to "the masses, not classes". After he left, it became a cash machine funding Irving Gould's lifestyle, with Gould appointing incompetent CEO after incompetent CEO. The miracle is that it staggered on ten years post-Jack.

But the reality is that the Commodore 64 kept Commodore going during most of that period, rather than Amiga sales. It's similar to Apple, where the Apple II kept Apple afloat during the 80s and 90s until Steve returned.

cmrdporcupine
0 replies
22h57m

Times changed though, too, and Tramiel couldn't replicate his C64 success at Atari Corp, despite bringing the same philosophy (and many key engineers) over there.

By the late 80s the "microcomputer" hobby/games market was dead and systems like the ST and Amiga (or Acorn Archimedes, etc.) were anachronisms. You had to be a PC-compat or a Mac or a Unix workstation or you were dead. Commodore and Atari both tried to push themselves into that workstation tier by selling cheaper 68030 machines than Sun, etc, but without success.

logicprog
0 replies
23h31m

they had no path forward

This, I think, is the premise that you and people like me (who think the Amiga could have gone on to do great things) disagree on. Most Amiga fans would say that it totally had a path forward, or at least that there is no evidence it didn't, and that the failure to follow that path was therefore not an inherent technical problem, but a problem of politics and management. Do you have any evidence to the contrary?

jandrese
6 replies
1d

SGI dug their own grave. Not only were the workstations expensive, but they demanded outrageously priced support contracts. This behavior drives people nuts and ensures that they switch to a competitor the instant it becomes an option. Despite the high cost, the support contracts had a pretty lousy reputation as well, with long wait times for repairs from a handful of overworked techs. Even worse, the company turned away from its core competencies to focus on being an also-ran in the PC workstation market.

There was a window in the mid-90s where it would have been possible for SGI to develop a PC 3D accelerator for the consumer market using their GE technology, but nobody in the C-Suite had the stomach to make a new product that would undercut the enormous profit margins on their core product. It's the classic corporate trap. Missing out on the next big thing because you can't see past next quarter's numbers. Imagine basically an N64 on a PCI card for $150 in 1996. The launch versions could be bundled with a fully accelerated version of Quake. The market would have exploded.

grumpyprole
4 replies
1d

The market would have exploded

Absolutely, they could have been where Nvidia is now!

Keyframe
2 replies
1d

I'd argue Nvidia is ex-SGI, and so is ATI. In the beginning it was all their crew.

foobarian
1 replies
23h56m

I wish we could have a debugging view of the universe, draw a diagram with clusters of people labeled with company names, and watch them change over time. :-)

jmtulloss
0 replies
23h10m

This view would certainly explain to people outside of Silicon Valley/ SF why the Bay Area has been so dominant in our industry for so many years.

christkv
0 replies
1d

Or they could have 3dfxed themselves.

cduzz
0 replies
20h36m

Ugh.

Worked at a university in the early 90s.

Maybe irix was okay to use if you were just sitting in front of it doing rando user / graphics things, but administering it was unbearable. The license fees to get OS updates were exorbitant; you'd have to get wacky new licenses to enable NFS or NIS and you'd need new kernels for just about anything.

As far as I could tell they were a cursed company that hated their users. "Here's a pretty thing that does one thing well but is otherwise insane and will ruin you when you need it most."

Good riddance.

HarHarVeryFunny
6 replies
1d

The reason SGI failed, and eventually Sun too, isn't because the world "chose wrong", but because their performance simply did not keep up with x86.

When these RISC-based workstations were initially released their performance, especially at graphics, was well beyond what a PC could do, and justified their high prices. A "workstation" was in a class by itself, and helped establish the RISC mystique.

However, eventually Intel caught up with the performance, at a lower price, and that was pretty much the end. Sun lived on for a while based on their OS and software ecosystem, but eventually that was not enough especially with the advent of Linux, GCC, etc, as a free alternative.

sys_64738
3 replies
23h15m

Sun had the perfect opportunity with Utility Computing around the mid-2000s but when cloud took off we had Oracle buying SUNW. They killed Sun Cloud which had the opportunity to be big, vast, and powered by JAVA hardware.

Sun Microsystems was a company like no other. The last of a dying breed of "family" technology companies.

msisk6
2 replies
23h3m

I was at the MySQL conference when it was announced that Oracle was buying Sun. It just took all the life out of the conference. All the Sun folks were super pissed off. Truly the end of an era.

icedchai
0 replies
21h53m

I remember that time. It felt like Sun was on death's doorstep since the dot-com crash. On the hardware side, the market was flooded with used Sun hardware. On the software side, Linux was "good enough" for most workloads.

hodgesrm
0 replies
20h25m

I was there too. It certainly felt "timed" to maximize the sense of deflation for people working on MySQL. Perhaps it was just coincidence. IIRC Larry Ellison said that the crown jewel in the deal was actually Java.

hinkley
0 replies
23h54m

Sun really struggled to make full use of their multicore systems. That m:n process model is coming back with fibers and libuv, but we have programming primitives and a deeper roster of experienced devs now than we did then. Back then they caused problems with scalability.

There were times when Java ran better on Intel than on Solaris.

cduzz
0 replies
23h13m

Ivan Sutherland described the reason [1] why PCs won a long time ago. Basically a custom tool may do a thing "better" than a general purpose tool for a while, but eventually, because more resources are spent improving the general tool, the generalized tool will be able to do the same thing as the specialty tool, but more flexibly and economically.

[1] http://www.cap-lore.com/Hardware/Wheel.html

mtillman
5 replies
1d

The Amiga couldn't handle the performance requirements of Doom at the time (see the Game Engine Black Book: Doom). Workbench was more fun than Windows, and at least better than the install process of early Linux.

As much as I loved my O2 (my first professional computer), it was underpowered for the time for anything other than texture manipulation. The closed-source nature of that time period and the hardware sales motion meant that you were paying through the nose for compilers on top of already very expensive hardware. The Cray-linked Origin 200s ran the Netscape web server with ease, but that's a lot of hardware in a time period when everything went out of date very quickly (we donated ours!). Irix still looks better than the new macOS UIs IMO, but losing Motif is a small price to pay for far cheaper access to SDKs. Also, Irix was hilariously insecure, due in part to its closed-source nature. https://insecure.org/sploits_irix.html

axpvms
2 replies
1d

Also, Irix was hilariously insecure due in part to its closed source nature.

That was in addition to having three default accounts with well known passwords and a telnet server.

icedchai
1 replies
23h3m

Some versions of IRIX (4.x, maybe?) also defaulted to having X11 authentication disabled. Anyone in the office could "xmelt" your screen... or worse.

nyrikki
0 replies
15h24m

Oracle EBS required X11 authentication to be disabled for root in order to use the compositor for PDF generation, even 10 years later, so it was still pretty common.

Oracle threatened to not support us when I used an unprivileged Xvfb instance instead.

Still stupid but not that uncommon back then.

nyrikki
0 replies
15h28m

We had 6 O200s Cray-linked into three nodes in a CXFS cluster to run an AppleTalk server backed by clarion arrays. While there were serious limits caused by the single metadata server, XVM and CXFS were better than anything provided by Veritas or the other major UNIX vendors of the day.

The Fibre Channel XIO boards were really needed back then for that application, as PCI was still way too slow.

I was sad to learn, when I left that job, that the SGI server was being replaced and the support personnel at SGI were going to lose their jobs too.

downut
0 replies
1d

"... hardware sales motion meant that you were paying through the teeth for compilers..."

For Fortran? My memory is hazy but at NASA NAS a bunch of us were using gcc/g++ starting ~1990. g++ was... an adventure. Building my own (fine!) compiler for free(!) got me hooked on OSS to the point that when Linux/FreeBSD launched I jumped in as fast as I could.

I really loved my various SGI boxen. Magical times. I was a NASA "manager" so had the Macintosh "manager" interface box that I solved by keeping it turned off.

hnhg
3 replies
1d1h

The people that created the Amiga weren't the same people as the ones leading Commodore. Apple's success seems to have been heavily based on the company's leader being very involved in product development and passionate about it.

Along the same lines, there is an alternate timeline where the Sharp X68000 took over the world: https://www.youtube.com/watch?v=OepeiBF5Jnk

randomdata
1 replies
1d

I'm not sure Apple did continue to succeed after its early success. It eventually gave up its name to NeXT, and it was NeXT that found the later success.

samatman
0 replies
23h35m

The standard quip here is that NeXT purchased Apple for negative $400 million.

tombert
0 replies
1d

I've actually seen that video!

Yeah, I think that would also have been a better timeline; I'm just stuck in the anglo-world and thus my knowledge is mostly limited to what was released in the US or Europe.

cladopa
2 replies
23h45m

I never had an Amiga, but I had friends who did. It was superior tech only for a very small period of time.

What happened was Intel: they made great decisions like automating the design of their processors, and this made them grow at an incredible pace. The Amiga depended on a different processor that stagnated.

sys_64738
0 replies
23h6m

Intel never pulled ahead until the Pentium but by then Motorola weren't interested in the 68K series.

KerrAvon
0 replies
23h20m

The 68k CPU lineup at the heart of the Amiga was competitive well into the 90's; the Amiga had run out of juice by 1989. The Amiga was only as good as the custom chips. If Commodore kept investing in R&D for the custom chips, they would have at least remained competitive.

causi
2 replies
1d1h

"Revolutionaries rarely get to live in the societies they created"

I think it's a combination of the fact that the skillset/culture needed to create a paradigm shift isn't the same one needed to compete with others on a playing field you built, and of complacency. It happens over and over. We saw it happen with RIM, and we're watching it happen right now with Prusa Research.

itronitron
1 replies
1d

Both Prusa and SGI are (and were) probably largely unknown to 90% of their potential market. The globally recognized companies tend to spend far more on marketing than anyone in a STEM field would consider remotely reasonable.

fuzztester
0 replies
21h21m

True. In the early to middle days of Java, I read that Sun spent millions of dollars on marketing it, and related stuff around it.

JohnBooty
2 replies
1d

If Amiga really "deserved" to win, I think they wouldn't have been eclipsed by the PC ecosystem in terms of performance.

They leapt out ahead of the competition with an advanced OS, purpose-built for graphics and sound in a way that PCs and Macs weren't.

Which was great. But they weren't really better than the competition. They were just doing something the competition wasn't. And when the competition actually started doing those things they got eclipsed in a hurry.

I wonder if Tesla will suffer the same fate. They were obviously around a decade ahead of the established players when it came to electric cars. But once the other established players actually got serious about electric cars, Tesla largely stopped being special, and upstarts like Lucid and Rivian are neck and neck with them (in terms of compelling products, not sales) as well.

hinkley
0 replies
23h52m

Tesla will also suffer a reverse cult of personality problem.

I don’t know anyone at Rivian so my opinion of them is neutral. Meanwhile Tesla is run by the jackass who ruined twitter.

cduzz
0 replies
1d

"the future is already here, it just isn't evenly distributed."

This means there are products out there with futuristic features that will be seen as requirements for all things going forward and right now those features are niche elements of some product.

The Amiga was a fantastic device but not a general purpose device. Lots of things are fantastic at a niche but not general, and those almost always fail.

Is this also the "worse is better" truism?

bluedino
1 replies
1d

They were destined for eventually dying like the rest of the high end UNIX workstation market. Linux and x86 got better and better every year.

tombert
0 replies
1d

Yeah, and OS X more or less mainstream-ized consumer UNIX as well. It gave you access to the UNIX tools in the command line if you wanted them, had a solid UNIX core, but was a lot cheaper than an SGI and also easy to use.

prpl
0 replies
1d1h

It’s just a lesson in worse is (often) better. If you can do most of the job with something that is either cheaper, easier to build, or easier to iterate on, then it will often overtake a better engineered solution.

knorker
0 replies
1d

Well, for SGI that's like saying the world "chose wrong" that long distance travel is not done by Saturn 5 rockets.

The Saturn 5 was clearly a technical marvel better than any plane, and it'd get you anywhere much faster.

If you spare no expense, you get a better product. Sure. I'm also not surprised that a $100k BMW is more comfortable than a Renault Clio.

ip26
0 replies
1d1h

We've seen again and again that the high end of the computer market can't sustain itself; the mass market outruns it. The result is that the high end works best when leveraging the mass market instead of trying to compete with it.

See the dominance of Threadripper in workstations, which is built on top of mainstream desktop and server parts bin. Or look at the Epyc based supercomputers, rumored to be the only supercomputers to turn a net profit for the suppliers, thanks to leveraging a lot of existing IP.

cameldrv
0 replies
23h8m

I used some SGIs in the mid-late nineties, and they did have cool 3D graphics capabilities. I found 4dwm to be kind of cool but mostly gimmicky and it was really slow on the Indy and O2. Windows 95/NT were much snappier on contemporary hardware.

By '97 or so SGI actually had essentially given up competing when they shut down the team that was developing the successor to InfiniteReality.

In a sense though, Silicon Graphics did become more standard, in that their original 3D framework was Iris GL, which then evolved into OpenGL, which became the main 3D graphics standard for many years.

davepeck
23 replies
1d1h

I was there near the end. First, as a summer intern in 1998, and then in 1999 as a full time engineer on what is now Google's Mountain View campus. SGI had always been a dream company for me. I'd first learned about them in high school; now, right out of college, I'd somehow managed to land a dream job.

SGI's hardware was cutting-edge and exotic. IRIX was killer (sorry Solaris). Cray was a subdivision. My coworkers used emacs, too. They put an O2 on my desk!

The dream didn't last long. Major layoffs hit just a few months after I started full time. I wrote about the experience here: https://davepeck.org/2009/02/11/the-luckiest-bad-luck/

dxbydt
7 replies
22h27m

SGI had always been a dream company

It was a dream company for pretty much every siggraph person at that time. I was in grad school, eagerly awaiting a very popular 3-semester course in computer graphics. It had been devised and taught by a young promising professor who had published some pioneering siggraph papers. I signed up for the course. On the first day of class, the head of the department walked in and said the professor had been recruited by his dream company SGI for an ungodly sum of money to work on some Jewish director’s movie about a dinosaur themepark. I thought ok, whatever, someone else will teach the course. The bastards scrapped the entire 3-semester computer graphics series because there wasn’t anyone else who could teach it. So we had to pick from one of the usual dumb options - databases, OS, Networks, Compilers. Since then I’ve always held a grudge against sgi.

krapp
2 replies
15h54m

to work on some Jewish director’s movie about a dinosaur themepark

I assume you mean Steven Spielberg and one of the Jurassic Park films?

If so, why can't you just say so? Why are you referring to Steven Spielberg, one of the most famous directors of all time, as "some Jewish director?" Do you think people won't recognize the name? I promise people know who Steven Spielberg is.

theonething
0 replies
10h44m

I think the GP was telling their story in the context of that time. It's a technique to help the reader more fully understand the context. I'm almost sure there is a term for this literary technique.

Kamq
0 replies
14h34m

If so, why can't you just say so?

Based on the comment, it sounds like that's the way the head of the department phrased it.

Presumably the department head didn't know the title as it hadn't been released yet.

brcmthrowaway
2 replies
21h51m

Jewish director? Hrmph

Y_Y
1 replies
21h23m

Spielberg had a bar mitzvah, what more do you want?

krylon
0 replies
6h13m

I don't think anyone is disputing that, but why did the department head see a need to point that out?

IntelMiner
0 replies
16h9m

"Jewish director" is an...interesting description

mrpippy
4 replies
1d

What did you work on at SGI during your brief stint?

davepeck
3 replies
1d

MineSet, their data mining and visualization package.

bcantrill
2 replies
12h14m

If I may, can I fact-check a story conveyed to me through a mutual acquaintance of ours? The story was that SGI was trying to sell off MineSet, and needed the team to stick around long enough to sell them off -- so a bonus was to be given after a short period of time (a month maybe?). The bonus was significant enough to get people to at least defer a job search ($10K?), but SGI didn't manage to find a buyer. The check was to hit bank accounts on a particular day; the team waited to hear word that the literal money was in the bank -- and then all quit simultaneously.

Is there at least some truthiness to it? Or has this just become Silicon Valley urban legend in my head?

davepeck
1 replies
3h34m

That rings a bell although fuzzily: as the new kid from school, I was pretty disconnected from the politics of the moment. I do seem to remember that the MineSet team departed en masse, but IIRC that departure roughly coincided with broader layoffs in the org.

(With apologies for reviving 90s IRIX/Solaris snark in my earlier post. :-)

bcantrill
0 replies
35m

Ha -- no worries on the snark; Irix probably was a better system in ~1998, as ZFS, DTrace, Zones, SMF, FMA, etc. were all still in the future...

oaktowner
3 replies
1d

I worked at Google from 2013 to 2020. There were definitely employees (maybe a majority) who assumed that Google would always be the dominant force in technology. Those of us who were a bit older always understood that everything changes in Silicon Valley.

Those buildings represented that change to me. I can remember coming to concerts at the Shoreline in the 90s and looking at those Silicon Graphics buildings: they looked so cool, and they represented the cutting edge of technology (at the time). And yet...it all disappeared.

Same goes for the Sun campus which is where Meta/Facebook is now. Famously, the Facebook entrance sign is literally the same old Sun sign, just turned around! [0]

So I always cautioned co-workers: this too, shall pass. Even Google.

[0] https://www.businessinsider.com/why-suns-logo-is-on-the-back...

dbreunig
2 replies
1d

Meta still has the Silicon Graphics logos on a few glass conference room doors in building 16, I believe. At least they were there in 2012.

Great memento mori.

dbreunig
0 replies
18h30m

I do! Thanks!

lowbloodsugar
1 replies
16h35m

That's funny, because my reaction to the O2 was "oh, this is far too expensive for what it is". I was working on an N64 game, and the other teams were using the Indy devkits while we had PCs with the SN Systems dev kits. The writing was on the wall at that point.

davepeck
0 replies
14h27m

Yeah, the O2 definitely was too expensive for what it was. And while it was the least cool and powerful of the lineup by far, as a recent college grad, it was still the coolest computer I had ever had on my desk. ;-)

ska
0 replies
20h20m

SGI's hardware was cutting-edge and exotic.

This was their downfall, trying to scale out adoption with esoteric hardware.

I remember being quoted $18k-ish for a memory upgrade on an O2 or Origin, the same amount of memory I had just bought for $500 for an Intel Linux box at home.

Sure, it wasn’t apples to apples, but I remember thinking very clearly that this wasn’t going to end well for SGI.

ryandrake
0 replies
1d

I graduated undergrad in 1998 and can confirm that SGI was the company to go to. I felt so jealous of those few guys who had SGI offers, whereas I had to settle for a more generic PC graphics company. History is what it is, but SGI really had a luster that only a handful of companies ever boasted.

e40
0 replies
3h58m

Beg to differ on IRIX. I always hated it as an ISV. Solaris was way better to work with.

alecco
0 replies
23h44m

I had to support an open source library for all major unixes and the Irix compiler was by far the best one. It took years for the rest to catch up. But it took ages to compile with optimizations on. Good times.

vrinsd
14 replies
23h59m

I believe nVidia was started with a lot of SGI's core technology; not "I have a good idea and I can't do it here" and more like "let me just take this stuff, I doubt anyone will notice". I think SGI sued but didn't really pursue the matter because they didn't really see nVidia as a threat. I think Jensen was pivotal in this "technology transfer".

Regarding computing cycles, boom/bust, I recently re-read The Soul of a New Machine and was struck by how much the world has NOT changed. Sure, we're not talking about micro/mini-computers and writing microcoded assembly, but the whole "the market is pivoting and we need to ride this wave" and "work like a dog to meet some almost unobtainium goal" seems to still underpin being an engineer in "tech" today.

formerly_proven
6 replies
22h45m

From my reading SGI was already dead and falling apart by that time. If you look at 3D, SGI had two graphics architectures in the 90s: RealityEngine from 1992 and InfiniteReality from 1996. They never managed to release a follow-up to IR. Similarly everything that came after about 1996-97 was a refresh of a prior product with only marginal changes. And then they went bankrupt in the early 2000s. So SGI had really only a very brief productive period that was over by the second half of the 1990s.

SGI also never had a presence in business critical applications which gave some of the other vendors more momentum (HP-UX/PA-RISC, VMS/Alpha, Solaris/SPARC).

cuno
3 replies
20h35m

I worked at SGI on the next generation (code named Bali) in 1998 (a whole year as an intern) and 1999 (part time while finishing my degree, flying back and forth from Australia). Bali was revolutionary. The goal was realtime RenderMan, and it really would have delivered. I had an absolute blast. I ended up designing the high-speed data paths (shader operations) for the world's first floating point frame buffer (FP16, though we called it S10E5), with the logic on embedded DRAM for maximum floating point throughput. It was light years ahead of its time. But the plug got pulled just as we were taping out. Most of the team ended up at Nvidia or ArtX/ATI. The GPU industry was a small world of engineers back then. We'd have house parties with GPU engineers across all the company names you'd expect, and with beer flowing sometimes maybe a few secrets could eh spill. We had an immersive room to give visual demos and Stephen Hawking came in once pitching for a discount.
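
For the curious, S10E5 is the 1 sign / 5 exponent / 10 mantissa split that later became standard half precision. Below is a hedged decoding sketch in C that assumes the same bias and special-case handling as today's IEEE binary16; the exact conventions of SGI's format may have differed.

    #include <math.h>
    #include <stdio.h>

    /* Decode a 16-bit S10E5 value (1 sign, 5 exponent, 10 mantissa bits). */
    float s10e5_to_float(unsigned short h)
    {
        int sign = (h >> 15) & 0x1;
        int exp  = (h >> 10) & 0x1F;
        int man  = h & 0x3FF;
        float value;

        if (exp == 0)               /* zero or denormal */
            value = ldexpf((float)man, -24);             /* man * 2^-24 */
        else if (exp == 31)         /* infinity or NaN */
            value = man ? NAN : INFINITY;
        else                        /* normal: implicit leading 1, bias 15 */
            value = ldexpf((float)(man | 0x400), exp - 25);
        return sign ? -value : value;
    }

    int main(void)
    {
        printf("%f\n", s10e5_to_float(0x3C00));  /* 0x3C00 encodes 1.0 */
        return 0;
    }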

For team building, we launched potato cannons into NASA Moffett Field, and blew up or melted Sun machines for fun with thermite and explosives. Lots of amazing people and fond memories for a kid getting started.

mrpippy
1 replies
17h33m

Very cool. Was Bali going to be the next high-end architecture after InfiniteReality? (I think IR was code-named “Kona” so the tropical codenames fit)

Why did they cancel it, money running out? It’s sad to think they were close to a new architecture but then just kept selling IR for years (and even sold a FireGL-based “Onyx” by the end).

Also was it a separate team working on the lower-end graphics like VPro/Odyssey?

cuno
0 replies
12h38m

Yes, Bali was the next-gen architecture and incredibly scalable. It consisted of many different chips connected together in a network that could scale. The R chip was so big that existing tools couldn't handle it and people were writing their own tools. As a result it was very expensive to tape out so many hefty chips, and I think that's why, when it came time, and with a financial crisis, upper management pulled the plug.

Yes there were separate teams working on the lower-end graphics.

phonon
0 replies
13h43m

Why was Bali cancelled?

vrinsd
1 replies
21h55m

Well,

Most Hollywood effects were all done on SGI systems before the slow migration to Linux. Renderman, Maya, were all SGI first-party programs.

Also SGI made huge advances in NUMA and machines with dozens of CPUs/processors before most other companies ventured into this space.

But not business critical like IBM CICS or Java.

1. https://en.wikipedia.org/wiki/NUMAlink

2. https://www.cs.ucr.edu/~bhuyan/CS213/2004/numalink.pdf

3. https://cseweb.ucsd.edu/classes/fa12/cse260-b/Lectures/Lec17...

sllabres
0 replies
19h38m

The large Origin servers and the nice indigo workstations at trade fairs with their cool real time visualizations comes into my mind. Also applications like Softimage, the 4Dwm desktop ...

Later, the large Altix NUMA systems with unprecedented core counts (and problems booting due to lock contention ;)

And of course their donation of the XFS filesystem to the linux world!

miohtama
4 replies
23h43m

The Nvidia lawsuit is discussed in the article.

_DeadFred_
3 replies
22h50m

I mean, there's more to it. NVidia literally just took SGI's IP. The only more blatant start was Cisco's, where they straight up stole a university computer.

meekaaku
1 replies
22h18m

where can i read about this cisco thing?

takinola
0 replies
21h21m

Cisco was started by a husband/wife team who were the heads of IT for the Stanford Electrical Engineering School and Business School respectively. Anecdotally, they first developed the technology trying to connect the networks for both schools.

markus_zhang
0 replies
23h14m

I love The Soul of a New Machine too. It was a blast to read. It even made me ponder the possibility of starting over at 40+ and doing something hardware-wise (or very low-level software). Of course, I then found myself drowning under 2 mortgages and dropped the thought.

rongenre
6 replies
1d1h

I played with SGI machines in college and they felt like.. the future. I really hoped they would hire me when I graduated.

Incredible, though, how the relatively cheaper Windows NT machines and 3dfx cards and graphics software just killed them. I was a little sad when I wandered around the campus of an employer in Mountain View and noticed the fading sign that had what was left of the SGI logo.

jandrese
3 replies
1d1h

The awesome old cube logo, or the new "we spent millions of dollars on a professional marketing department to design a new logo" one that is just the initials in a boring font and off center?

I co-oped for SGI onsite in the sales/marketing/support for a major ISP of the day back in the late 90s and the buzz around the office was that the company (at this point experimenting with overpriced Windows NT boxes and generic Linux servers) was experiencing massive brain drain to some brand new startup that was going to make something called a "GeForce" card for cheap PCs that was going to avoid the pitfalls of the then popular Voodoo cards. Apparently the engineers were unhappy with the direction the company was taking under the new leadership and thought that there was still an interest in graphics acceleration.

mrpippy
1 replies
1d

The "sgi" logo was a big step down from the cube, but it was a lot more attractive than the Rackable/Silicon Graphics International "sgi" logo that looked like a cheap knockoff of the previous one.

https://en.wikipedia.org/wiki/Silicon_Graphics_International

theideaofcoffee
0 replies
1d

It really was a letdown when Rackable resurrected SGI and then brought about that ... thing of a logo. It just felt like it hollowed out the brand even more. Even if SGI itself was still making some interesting hardware at the time (namely the Altix 4700, UV, and ICE), the soul just wasn't there anymore.

rongenre
0 replies
17h49m

The awesome cube, which apparently didn't look good when faxed...

technothrasher
0 replies
1d

I played with SGI machines in college and they felt like.. the future.

I had a couple of Indigos that I supported while an undergraduate (I had a student job with the University's Unix group in their computing center), and the SGIs felt to me exactly like the Amiga: really cool, but kind of lopsided. I tended to do most of my work on the SPARCstations and ignore the SGIs unless I specifically wanted to play with the graphics stuff.

I actually still have an Indigo XS24 that I collected at one point over the years. Tried to get it to boot a bit ago but it's dead, unfortunately.

nullindividual
0 replies
21h50m

3Dfx didn't play in the SGI space. But Matrox (for 2D), 3Dlabs (another RIP), Orchard (used a 3Dlabs chip), STB (again, a 3Dlabs chip...), and Diamond (uh... 3Dlabs!) did.

3Dfx grew up in the arcade market. They were always consumer-focused.

Uhhrrr
4 replies
1d

The article doesn't mention the reason for the fall: less good but cheaper competitors. First Sun, then Windows NT and Linux.

cf100clunk
2 replies
23h24m

The article doesn't mention the reason for the fall: less good but cheaper competitors

The article has this: ''As Bob Bishop took the reigns of SGI, things looked dark. AMD announced their 64 bit architecture in October, PC graphics had made massive strides while remaining significantly less expensive than SGI’s offerings, NT was proving to be a solid and less expensive competitor to UNIX, Linux was eating away at traditional UNIX market segments, and Itanium still hadn’t launched.''

I can agree with almost all of that statement but I object to the ''NT was proving to be a solid and less expensive competitor to UNIX'' part as mostly false in any mixed OS environment over which I'd ever been admin.

Uhhrrr
0 replies
19h41m

But that's 1999, and they were already losing money in 1997, and the article doesn't say why. Sun was why.

BirAdam
0 replies
21h3m

Well, do remember the cost of a UNIX license at the time (unless you were using BSD). If you didn’t have thousands of dollars on hand, NT was a good choice.

pipeline_peak
0 replies
22h52m

Look harder

ChrisMarshallNY
4 replies
1d1h

I remember having a Personal Iris at the company I worked at, and, later, an Indigo. We never used them. I think they were really there to impress the visitors (they were in our showroom).

I remember the colors as being very different from the photos, though.

The Personal Iris was a deep purplish-brown, and the Indigo was ... indigo.

Jim Clark sounds like my kinda guy. I made a hash of my teenage years and barely squeaked in with a GED myself. It has all worked out OK in the end, though.

randomdata
2 replies
1d

> We never used them.

When I was in high school we had a lab full of SGI machines. They also never got used. Hundreds of thousands of dollars of computing equipment, and probably that much again in software licenses (at the commercial rate), just sitting there doing nothing. It was heartbreaking.

On a happy note, the SGI bus (a semi-trailer full of SGI machines demoing their capabilities) came to school one time. As a teenage nerd, getting to play with a refrigerator-sized Onyx2 was a good day.

mrpippy
1 replies
23h48m

My goodness, at a high school? Like Indys, or O2s? Was this a private school?

randomdata
0 replies
23h33m

They were O2s. Rural public school.

There were all kinds of toys, though. There was a dedicated classroom setup for video-based remote learning some 30 years before COVID - that got used for one semester, from what I gather (was never used while I was there). The school was even host to a dialup ISP at one point.

The administrators were all in on technology. The teachers, not so much...

Eventually, in my last year, the government changed the funding model and the party ended.

theideaofcoffee
3 replies
1d1h

Oh, how I lusted over the Challenges, the Octanes, the Indigo2s of the time. It was a revelation when I finally was able to sit down at the console of an Octane (with two, count 'em, TWO R14000s and a whopping 2.6G of RAM). Tooling around in IRIX via 4Dwm was so much more satisfying than today's UIs. It was snappy and low-latency, unlike anything I've used since.

Later on, I was able to do some computational work on an Altix 3700 with 256 sockets and 512G of RAM spread over four full-height cabinets (with the nest of NUMAlink cables at the back), at the time running SuSE Linux, and it was wild seeing all 256 sockets printed out by a cat /proc/cpuinfo. Now the same capabilities are available in a 4U machine.
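
If you want to recreate that little thrill on whatever Linux box is at hand, counting the processor entries the way that scrolling cat /proc/cpuinfo implies is a one-liner. A rough Python equivalent, assuming a standard Linux /proc layout (it counts logical CPUs, which, if I recall right, matched the socket count on those single-core Itanium 2 Altixes):

    # Count logical processors, the number that scrolls past in /proc/cpuinfo.
    # Assumes a standard Linux /proc layout.
    with open('/proc/cpuinfo') as f:
        n = sum(1 for line in f if line.startswith('processor'))
    print(n)   # 256 on a machine like that Altix; far fewer on most boxes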

The corporate lineage story is just as interesting as the hardware they made. Acquisition, spinoff, acquisition, rename, acquisition, shutter; now perhaps just a few books and binders and memories in the few remaining personnel at HPE are all that's left (via Cray, via Tera, via SGI, via Cray Research).

RIP SGI

bitbckt
1 replies
1d1h

I still keep a maxed-out Octane2 in running order for posterity. Occasionally logging in to it reminds me of just how a desktop environment should feel. We truly have lost something since then.

hypercube33
0 replies
1d

I really wish they'd do movies like the one they made about RIM for Cray, DEC, Compaq, SGI, Pixar. Sounds like these places were either wild, or straight-up IBM culture, or some clash of both, inside or outside. Raven and id Software would be neat too. Westwood Studios as well.

dekhn
3 replies
1d

I was in love with SGI when I was an undergraduate just over the hill at UC Santa Cruz in the early to mid 90s. Everything about the machines appealed: the industrially designed yet wonderfully colorful cases, the sexy desktop OS ("This is UNIX. I know this!"), and the way IrisGL rendered molecular graphics.

Driving to a Phish show at Shoreline, we passed the low-slung office buildings of SGI which seemed like the sexiest place to work. When I graduated, I thought I was "too dumb in CS" to get a job in Mountain View and went to grad school in biophysics instead.

By the time I was a few years into grad school, I worked in a computer graphics lab outfitted with Reality Monsters and Octanes and other high-end SGIs (when you maxed out an SGI's graphics and RAM, they were really fast). I was porting molecular graphics code to Linux using Mesa (much to the derision of the SGI fans in the lab). When we got a FireGL2 card, it had a Linux driver and could do reasonable molecular graphics in real time, and the SGI folks looked real scared (especially because the SGI Visual Workstation had just come out and was a very expensive turkey).

Less than a decade after that I was working in those very buildings for Google. Google took over SGI's old HQ (Jeff Dean told me there was a period where Google and SGI overlapped in the GooglePlex, and the SGI folks looked very sad as they paid for their lunches while the Googlers got free food). There was still plenty of SGI signage strewn about. And now Google has gone dumb and also built their own HQ next door (note the correlation between large SV companies building overly fancy HQs and then going out of business).

Such is the cycle of sexy tech.

dalke
2 replies
1d

We've talked before about our respective molecular graphics background.

I started with Unix on a Personal IRIS as an undergrad working in a physics lab which used it for imaging capture and analysis. I was the nominal sys admin, with one semester of Minix under my belt and just enough to be dangerous. (I once removed /bin/cc because I thought it was possible to undelete, like on DOS. I had to ask around the meteorology department for a restore tape.)

The summer before grad school I got a job at the local supercomputing center to work on a parallelization of CHARMm, using PVM. I developed it on that PI, and on a NeXT. That's also when I learned about people at my future grad school working on VR for molecular visualization, in a 1992 CACM article. So when I started looking for an advisor, that's the lab I chose, and I became the junior co-author and eventual lead developer of VMD.

With a Crimson as my desktop machine, a lab full of SGIs and NeXTs, and the CAVE VR setup elsewhere in the building. Heady times.

I visited SGI in 1995 or so, on holiday, thinking that would be a great place to work. They even had an Inventor plugin for molecular visualization, so I thought it would be a good lead. I emailed and got an invite to visit, where the host kindly told me that they were not going to do more in molecular visualization because they wanted to provide the hardware everyone uses, and not compete in that software space.

In the early 1990s SGIs dominated molecular modeling (replacing Evans & Sutherland), so naturally the related tools, like molecular dynamics codes, also ran on them. But we started migrating to distributed computing, where it didn't make sense to have 16 expensive SGIs, leaving them more as the head... which, as you pointed out, soon ran just fine on a Linux machine.

DaiPlusPlus
1 replies
4h38m

Pardon my ignorance, but what is so unique or special about molecular visualisation compared to, say, Quake, or CAD? If you'll permit me to reduce it down to "just" drawing organo-chem hexagons, lines, and red/grey/black spheres connected by those lines (and a 360-degree spin animation for the investor-relations video), where's the room for the rest of CG? E.g., texture mapping, fragment shaders, and displacement mapping?

justsomehnguy
0 replies
4h14m

3D games use a lot of smoke and mirrors to make you believe you're seeing a lot.

I'm not looking it up right now, but the original Q1 had a very low poly count.

jefflinwood
2 replies
1d1h

I worked on the SGI campus as a consultant/vendor to them in 1999/2000 during the dot-com boom. I really wanted one of those 1600SW flat screens (everything was CRT back then), but they weren't really in use at the time.

One of the neatest things is that they let us (Trilogy/pcOrder) put together a sand volleyball team to compete in their company intramurals.

Their cafeteria was also top notch.

dekhn
1 replies
1d

That cafeteria went on to be known as "Charlie's" at Google and was the main HQ cafeteria (serving great food, and then later, extremely meh food). TGIF was also held there. If there ever was a place that was "Google central", that was it.

cf100clunk
2 replies
23h13m

I'd never met Rick Belluzzo, but having known SGI insiders of his era, I gathered that he'd driven the company strongly towards NT on x86 at the cost/risk of losing their corporate and personal UNIX competencies, causing some internal outrage. When his time was up at SGI he quickly popped up at Microsoft running MSN, in what I saw as a sinecure meant as a pat on the back for a job well done. Am I right on this?

shrubble
1 replies
22h23m

That is a widely held view among Unix nerds like me, at least.

sneed_chucker
1 replies
1d

I remember to thank SGI every time I format an XFS filesystem.

betaby
0 replies
23h32m

Yes, XFS has been in the Linux kernel since 2001! Time flies.

neduma
1 replies
1d1h

IRIX was my second-best OS exposure, after Ubuntu, during my college days.

HeckFeck
0 replies
1d1h

What did you do with it?

I’ve always wanted to play with IRIX. The UI looks very intuitive.

The boxes are hard to find, so I went for emulation, but I couldn’t get it any further than a boot screen in MAME.

matthewmcg
1 replies
1d

Much of this history is also described in Michael Lewis’s book The New New Thing (2000), which profiles Clark and his various ventures. It’s really a snapshot of pre-dot-com-crash Silicon Valley.

CalChris
0 replies
1d

My takeaway from that book was that Clark invented the dot com.

krylon
1 replies
6h9m

One thing that set SGI apart from the rest was that their cases were so incredibly pretty. Even today, the most beautiful PC cases barely play in the same league (IMHO). When Apple started building those colorful cases around ~2000, they reached that level of aesthetic appeal, but then they went for brushed metal (which is also very pretty, but not that pretty, IMHO).

Had one of the major PC vendors hired their designers and built just run-of-the-mill PCs, housing them in those amazing cases, I wonder how that would have worked out.

formerly_proven
0 replies
5h40m

Interesting - some of them look neat, sure, but they also look fairly cheap. The O2 especially. The construction certainly isn't high quality; they're just stamped coil steel boxes with plastic wrapped around them. Some OEMs still do this, like this Alienware, which is just a squared-off miditower dressed up in a plastic shell: https://www.youtube.com/watch?v=DY1dlVPzUVo&t=8m10s

Of course in the 90s this would've been quite modern I imagine, considering that the competition was gray painted steel boxes and steel cabinets with a gray powder coat.

icedchai
0 replies
20h54m

Meanwhile, today, we have chat applications taking up almost a gig of memory...

vondur
0 replies
1d

It's pretty simple why these Unix vendors all died: Linux and Intel chips. Sure, you could get a really nice Sun system at the time with all of the redundancy tech built in, which cost around $50k. Or you could go and get 4 or 5 Linux servers from Dell running Red Hat, which by the early 2000s were faster too.

timthorn
0 replies
1d

I miss SGI for many reasons, but their industrial design (and that of pre-HPE Cray) is one of the big ones. The 19" rack form factor is the shipping container of the computing world - practical, standardised, sensible... and dull.

The variety in enclosures matched the novelty in architectures of the period. Exciting times to be part of.

temporarely
0 replies
20h33m

We had a few SGI boxes in arch school (for CAD). I learned C++ on those boxes (and the GL API, which I thought was beautiful, btw). But the main attraction was the two demos (this is '91-'92): the flight simulator was awesome, and there was this deconstructing cube thing that was pretty wild as well. Compared to what was on PCs in those days, those SGI machines were truly dazzling.

sgt
0 replies
23h53m

The article went very quickly from rowdy teen throwing smoke bombs to... boom, he has a PhD in Computer Science.

Really interesting article that goes into depth regarding the SGI products. I didn't know Clark basically invented the GPU.

Speaking of exotic hardware, I'm actually sitting next to an SGI O2 (currently powered off). A beautiful machine!

rdtsc
0 replies
19h6m

announcing SGI’s intent to migrate to Itanium (and collaborating on projects Monterey and Trillian) while simultaneously launching an IA-32 series of machines running NT known as the Visual Workstation.

That's pretty wild, I had completely forgotten about that. It was too late of course by then.

I was working at a CAD/CAE company which leased SGI workstations in the early 2000s. I was just a fresh grad wondering why they were leasing them; then I realized they were more expensive than some cars.

Some of the developers were starting to use NT workstations with 3D graphics; they were not cheap, but they could run circles around Octanes and Indigos at a fraction of the cost. They were rushing to port everything to NT, as the writing had been on the wall for a while by then.

pjmlp
0 replies
1d

A bit of trivia: SGI used to host the C++ STL documentation, based on HP's libraries, pre-adoption into the standard.

pixelpoet
0 replies
4h19m

Random trivia which will likely get buried, but whatever: the rendering software Octane Render got its name from my refusing to let its author use my rendering software's name. He had some historical beef with us, the two devs of Indigo Renderer, so he named his renderer after the SGI machine that came after the Indigo: the Octane.

He also lent his SGI Indigo2 to the LGR guy on YouTube, who did a great video on it: https://youtu.be/ZDxLa6P6exc

nonrandomstring
0 replies
1d

I agree that these machines and their OS were too proprietary and over-engineered to weather the PC revolution. But when I think back to my days using Suns and SGIs (an Indigo), the memory feels like driving a Rolls-Royce or Daimler with leather seats and walnut panels.

mvkel
0 replies
16h28m

"it is difficult to get a man to understand something when his salary depends on his not understanding it."

If you're the incumbent, a paradigm shift usually forces you to intentionally cannibalize your existing revenue base in service of the not-yet-proven new thing. That's if you want to survive.

It's incredibly difficult, rare, nearly impossible, to pull such a thing off within a company system.

Imagine being Blockbuster, knowing that to survive, you'd need to transition to a content streaming company.

That's a ton to unwind before you can even buy servers, let alone hire the people to -lead the industry- in such a shift. All to simply remain alive.

lowbloodsugar
0 replies
16h13m

I remember the nVidia CEO saying that nVidia is a software company. Whereas SGI thought of itself as a hardware company. That's why SGI is gone, and nVidia is still making the best graphics hardware.

lowbloodsugar
0 replies
16h51m

Nintendo 64 was an SGI workstation with ... 4MB of Rambus DRAM at 250 MHz (actually 4.5MB but 512K is visible only to the GPU)

It was actually 9-bit RAM, but the CPU could only see 8 of the bits!
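
The 4.5MB figure is just the ninth bit counted as extra capacity. A quick sanity check on the arithmetic (treating the extra bit purely as storage, a simplification of however the hardware actually used it):

    # 4 MiB of RDRAM with 9-bit "bytes": the ninth bit adds 1/8 more capacity.
    base  = 4 * 1024 * 1024        # bytes visible through the CPU's 8-bit view
    total = base * 9 // 8          # bytes if the ninth bit is counted as storage
    print((total - base) // 1024)  # -> 512, the extra 512K the CPU never saw

So 4 MiB plus one eighth is exactly 4.5 MiB, with the extra 512 KiB being the part only the GPU side could touch.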

latchkey
0 replies
1d

Back in 1993, I was in college and working for the extended education department, running all their computer infrastructure.

One day, someone wheeled this approx. 3x3 foot sized box to my door and asked me if I wanted it. It was a SGI Onyx with a giant monitor sitting on top, with a keyboard and mouse.

I plugged it in and it sounded like an airplane taking off. It immediately heated up my entire tiny office. It was the 4th Unix I had ever played with (Ultrix, NeXT, and A/UX were the previous ones). It had some cool games on it, but beyond that, at the time, I had no use for it, because A/UX on my Quadra 950 was so much more fun to play with.

I don't even think I ever opened it up to look at it. I don't know what I was thinking. lol.

After realizing it did not have much going for it, I ended up just turning it on when the office was cold and using it as a foot rest.

Oh yea, found a video... https://www.youtube.com/watch?v=Bo3lUw9GUJA

kelsey98765431
0 replies
1d1h

A close friend of mine growing up had a parent that worked at SGI. I'm not trying to start a holy war, but I just have to let it be known that emacs was the recommended editor at SGI. Just saying. Maybe other things contributed to the fall, but in my heart I will always remember emacs.

junar
0 replies
18h28m

I wish the author had shown their work regarding the mother's after-tax income. They apparently assume a 22% effective tax rate, but that seems unrealistically high.

The SSA says that Social Security tax was a mere 1% before 1950, and would remain below 4% until the 1970s. [1]

A 1949 Form 1040 [2] suggests that a single filer with 3 dependent children, earning $2700 that year, would have a federal income tax liability of only $7. Not 7 percent, 7 dollars.

[1] https://www.ssa.gov/oact/progdata/taxRates.html

[2] https://www.irs.gov/pub/irs-prior/f1040--1949.pdf

johndhi
0 replies
1d1h

When I was a kid I remember my brother and I asking my dad a ton of questions about SG. We viewed them as this amazing awesome company that made cool looking towers and the fastest computers in the world.

jibbit
0 replies
23h23m

I'll never forget my first time coming into a high-end Flame suite. It was so exciting... I don't think I could have been more excited if you'd told me I was stepping into the world's first time machine.

fuzztester
0 replies
20h54m

This reminds me of the book The Soul of a New Machine.

fnordpiglet
0 replies
23h40m

I was at SGI and left with Clark for Netscape. That was a fun time in my career. That was also the time when it became clear SGI was about to explode, as commodity GPUs were being developed by former SGI engineers and the core SGI graphics teams were attritting hard. Fun to read this story again and learn some of the things that happened later.

fernly
0 replies
16h0m

I appreciate the article. As a lowly member of technical staff 92-97 I didn't see or understand a fraction of this. I *loved* my Indy workstation and the IRIX desktop UI, and I admired the brilliance of some of the engineers I worked with.

Remember the Lavarand[1]? Random number generator based on an array of lava lamps?

[1] https://en.wikipedia.org/wiki/Lavarand

davidw
0 replies
1d1h

I remember having access to an Irix box at my first job. It was seen as a real, professional, serious OS, not like the Linux box I set up. Pretty incredible how that all changed in a matter of years.

danans
0 replies
22h56m

On the 10th of July in 2003, SGI vacated and leased their headquarters to Google.

I was around to witness the tail end of that office space transition (on the incoming side at Google). It was surreal to be sitting in the physical carcass of a company I had long fantasized about (in part due to their marketing via Hollywood).

In retrospect it was ironic because a company that was based on selling very expensive high performance compute (SGI) was being physically replaced by a company selling (albeit indirectly) very cheap high performance compute.

cellularmitosis
0 replies
23h51m

There's a kid on YouTube (username 'dodoid') who put together a series of videos on SGI. Neat to see someone young get super enthusiastic about computing history.

browningstreet
0 replies
1d

I had an Indigo2 on my desk in college. I moved back and forth between that and a NeXT cube that a colleague had in their lab. The NeXT was nice but SLOW. The Indigo2 wasn't especially fast, but it was nice and could do visual things that just weren't available as readily on our alternatives. We had SunOS and Solaris systems that were mostly used for network and engineering projects, and I was engaged in some visualization work. When the O2 was announced, I was quite sure it would be the solution to our speed issues... Around the same time, another colleague was the first to install a beta of Win95, and it did seem awfully pretty.

ben7799
0 replies
22h6m

I was just about the right age to see all this happen while I was in college.

Fall 95, entering freshman year, we had Indys and IBM RS/6000s as the main workstations on campus. It was a really great setup where you could sit at any workstation and all your stuff just worked; your whole environment seamlessly migrated. The only catch was that if you were compiling your own stuff, you'd have to recompile it for the machine you sat down at.

SGI brought a demo truck to campus in the spring of my freshman year (Spring 96) and blew us all away. They were there for interviews; obviously I was only a freshman, but we all went to check it out.

Summer 96 I got an internship, and for kicks they gave me an Indy with a 21" CRT (huge at the time) and the silly video camera that was like 10+ years ahead of its time.

Fall 96 we got labs full of O2s.

Fall 1997 I bought a 3DFX card. MS/Intel somehow made a donation to the school and got them to start phasing out the Unix workstations. The Windows NT setup was terrible; they never had the printing and seamless movement of files down until after I graduated. Video games in the fall of 1997 on the 3DFX were basically as impressive as the demos on the $100k refrigerator-sized machine SGI showed in 1995.

Probably fall 1998 I remember my Dad got a computer with an Nvidia Riva 128.

Spring 99 I graduated, and that fall I rebuilt my PC with a GeForce 256.

I'm not sure when I last saw an SGI, but I did briefly use one of their NT machines IIRC.

Last time I had a Sun machine at work was probably 2004. I remember maybe 2007-2008 at work deciding for the first time we were going to support Linux, then by 2010-11 we had dropped support for Sun.

Most of the commercial Unix workstations had tons of Unix annoyances I never found Linux to have. IRIX was maybe the best. HP-UX was super annoying, I remember. I didn't use DEC Unix and Tru64 much. Closed-source PC Unix like SCO I remember being horrible.

assimpleaspossi
0 replies
1d1h

I was a system engineer for SGI in 1992 working mainly with McDonnell-Douglas in St Louis. It was thrilling to be sitting in the cafeteria and have Jim Clark plop down next to me for lunch. Just one of the guys.

As an outsider (because I didn't live and work in California), this was the go-go atmosphere of such companies back then, where they thought they could do no wrong. And the after-work parties were wild (how the heck do you break off half a toilet bowl?).

One of the buildings had plastic over the windows cause that's where they were working on the plugin GL card for the PC. (Ssh! No one's supposed to know that!)

Being the first system engineer in St Louis, my eyes lit up when my manager told me he had ordered a 16-core machine for my office, just for me!

I was hired as a video expert. The company re-org'ed and my new boss decided he needed a Fortran expert so that was the end of my job with SGI.

ThinkBeat
0 replies
1d

I remember when we got the first Indys at the uni. It was like magic. People nearly got into physical fights to use one of them. People came in at night to use them as well.

I wonder if the uni is so locked down now that students can't sit in the lab all night anymore.

Being a bit pragmatic about getting my actual thesis done, I discovered that there were, all of a sudden, a lot more resources available on one of the (older) Sun servers.

It saved me days if not weeks.

TacticalCoder
0 replies
6h46m

The 'G' in SGI is basically the same 'G' as in GPU. Had SGI kept rocking the world, today they'd be at the forefront of AI. All the models would be trained on and run on SGI computers.

The attack on SGI didn't only happen on the lame Windows side, with its crappy software and ultra-lame 3D tooling (compared to what SGI had), which people would love because its lameness was matched by its ultra-cheap price point.

The attack on SGI also came from open source and Linux: the very same cheap commodity hardware that'd run the mediocre but ubiquitous commercial software would also run Linux.

On which OS are most (all?) AI models trained today? What OS powers 500 of the world's top 500 supercomputers?

Linux.

That's the tragedy of SGI: as if cheap commodity x86 hardware wasn't enough, they got then attacked by both Windows and Linux on that cheap hardware.

P.S.: as a side note, my best friend's (still my best friend to this day) stepfather was the head of SGI Benelux (Belgium/The Netherlands/Luxembourg), so my friend had an SGI Indy (the "pizza box" one) at his home. Guess what we'd do every day after school?

KineticLensman
0 replies
1d

They had me at ‘Skywriter Reality Engine’. I’d used vaxen at Uni in the 80s, then got into industry using Symbolics Lisp machines, then had several dark years programming on DOS boxes. The return to a truly innovative workstation blew my mind

Keyframe
0 replies
23h16m

It still has that cachet with, well, a somewhat older generation. I have an Indy and an Indigo2 (the purple, Maximum IMPACT one), and when someone visits and sees the machines, it's like you've rolled out a vintage Ferrari, in (our) eyes. I don't think there's anything comparable today, when we have it all.

I demonstrated IRIX to younger colleagues and the reaction was: OK, so it's alright I guess, like anything else we have today? Yep... but the contemporary world was NOT like that.

I had an Octane as well, a heavy and loud beast. It's all in storage now, waiting for a move. In the late 90s I worked a lot on SGI machines (vfx).

Arathorn
0 replies
23h3m

There's a surprising amount of good info here about the very first IRISes. I ended up with 3 of the very first IRIS 1400 workstations mentioned in the post, which NASA Ames bought; at least one of them is still in working order. They were my first Unix workstations (running a Unisoft-based SysV/BSD hybrid pre-IRIX variant), and they're where I first learnt UNIX, C, IRIS GL, vi, and lots of other good stuff. Irritatingly, they shipped with an XNS rather than TCP/IP network stack (although I did get hold of a beta IP stack, I never got it to work). I did get XNS working on a LAN using their EXOS 101 ethernet cards though.

In case anyone's interested, their graphics card (GE1, the world's first ever hardware 3D graphics card?) looks like:

https://matrix-client.matrix.org/_matrix/media/v3/download/m...

...and the PM2 68k processor card mentioned in the post looks like:

https://matrix-client.matrix.org/_matrix/media/v3/download/m...

...and one of the machines itself looks like:

https://matrix-client.matrix.org/_matrix/media/v3/download/m...

Suffice it to say that I have a very soft spot for these machines :)