Whenever this topic comes up there are always comments saying that SGI was taken by surprise by cheap hardware and if only they had seen it coming they could have prepared for it and managed it.
I was there around '97 (?) and remember everyone in the company being asked to read the book "The Innovator's Dilemma", which described exactly this situation - a high-end company being overtaken by worse but cheaper competitors that improve year by year until they take the entire market. The point being that the company was extremely aware of what was happening. It was not taken by surprise. But in spite of that, it was still unable to respond.
You highlight one of the most interesting (and perhaps least understood) things about the key Innovator's Dilemma insight. Even if senior management has read the Innovator's Dilemma books, knows the company is being catastrophically disrupted and is desperately trying to respond - it's still incredibly difficult to actually do.
Not only are virtually all organizational processes and incentives fundamentally aligned against effectively responding, but the best practices, patterns and skill sets of most managers at virtually every level are also counter to what they must do to respond effectively. Having been a serial tech startup founder for a couple of decades, I then sold one of my startups to a valley tech giant and ended up on the senior leadership team there for a decade. I'd read Innovator's Dilemma in the 90s and I've now seen it play out from both sides, so I've thought about it a lot. My key takeaway is that an incumbent's lack of effective response to disruption isn't necessarily due to a lack of awareness, conviction or errors in execution. Sure, there are many examples where that's the case, but the perverse thing about I.D. is that it can be nearly impossible for the incumbent to effectively respond - even if they recognize the challenge early, commit fully to responding and then do everything within their power perfectly.
I've even spent time sort of "theory crafting" how a big incumbent could try to "harden" themselves in advance against potential disruption. The fundamental challenge is that you end up having to devote resources and create structures which actually make the big incumbent less good at being a big incumbent far in advance of the disruptive threat appearing. It's hard enough to start hardcore, destructive chemo treatment when you actually have cancer. Starting chemo while you're still perfectly healthy and there's literally no evidence of the threat seems crazy. It looks like management incompetence and could arguably be illegal in a publicly traded company ("best efforts to maximize/preserve shareholder value" etc).
I think SGI failed to understand that there was a point where desktop PCs would be good enough to replace dedicated workstations. Continuing to make hardware that's much better than the best PCs wasn't going to save them after PCs crossed the good-enough line - whatever they had would be relegated to increasingly rarefied niches, the same way IBM now only makes POWER and mainframes. There is no point in IBM making PCs, or even POWER workstations, anymore, as the margins would be too narrow.
SGI could double down on their servers and supercomputers, which they did for a while, but without entry-level options, their product lines become the domain of legacy clients who are too afraid (or too smart) to port to cheaper platforms. And being legacy in a highly dynamic segment like HPC is a recipe for disaster. IBM survived because their IBMi (the descendant of the AS/400) and mainframe lines are very well defended by systems that are too risky to move, tied to hardware that's not that much more expensive than a similarly capable cluster of generic, less capable machines. As the market was being disrupted from under them, they retreated up and still defend their hill very effectively.
The other move they could have made was to shift downwards, towards the PC, and pull the rug from under their own workstation line. By the time Microsoft acquired Softimage and had it ported to NT, it was already too late for SGI to even try that move, as NT had solidified into a viable competitor in the visual computing segment, running on good-enough machines much, much cheaper than anything SGI had.
I think your analysis of the shifting technology landscape is largely on target. However, I'm not convinced that the true root of SGI's failure was the technology. Clearly their tech did need to evolve significantly for them to remain competitive, but that's a transition which many companies successfully make. Even though SGI chose not to evolve the tech soon enough, fast enough, or far enough, I suspect they still would have failed to survive that time period due to an even more fundamental root cause: their entire corporate structure wasn't suited to the new competitive environment. While the "desktop transition" was most obviously apparent in the technology, I think the worst part for SGI was that the desktop shift changed the fundamental economics to higher volumes at lower costs.
SGI had invested in building significant strengths and competency in its sales and distribution structure. This was one of their key competitive moats. Unfortunately, not only did the shift in economics make this strength irrelevant, it turned it into a fundamental weakness. All that workstation-centric sales, distribution, service and support infrastructure dramatically weighed down their payroll and opex. This was fine as long as they could count on the higher margins of their existing business. While it's easy to say they should "just lay off all those people and relaunch as a desktop company", that can't be done in one quarter or even one year. It requires fundamentally different structures, processes, systems and skill sets. Hiring, training and integrating all that while paying for massive layoffs and shutting down offices, warehouses etc takes time and costs a lot of money.
Worse, once their existing workstation customers saw the SGI they had bought workstations and service contracts from shutting down to become a different kind of company entirely, sales revenue would have taken an overnight nosedive. SGI's stock would also have tanked far more immediately than it did, as fickle stock market investors sold stock they'd bought because SGI offered a specific risk/return expectation which just became much more "risk" and much less "return" (at least in the near term). In making such a dramatic move SGI would have effectively dumped much of their current quarterly revenue and the value of one of their core strengths - all at the same moment. That would have turned them into one of their emerging startup competitors with all of a startup's disadvantages (no big ongoing revenue streams, no big cash pile (or high stock valuation to leverage for cash)) yet none of a startup's strengths (nimble, lower-paid staff and more patient venture investors).
The point of my earlier post was mainly that a true disruptive market shift is nearly impossible for a large, established incumbent to successfully survive because they basically have to rapidly turn into someone else almost overnight. How can a champion sumo wrestler survive a shift so dramatic that their sport quickly turns into a track meet? Even seeing it coming doesn't help. How does one even prepare for such a shift, since losing mass turns you into a bad sumo wrestler long before you even start being a viable sprinter? As Christensen observed, such disruptions are often enabled by technology, but the actual cause of incumbent death is often the shift turning an incumbent's own strengths into weaknesses almost overnight.
I don't think you need to shut down that distribution. Fundamentally, existing customers often continue to buy the existing product at existing prices, as long as they get a faster product. This was something that Gordon Bell pointed out. Existing customers are set up to operate at that level of expense. Expensive graphics workstations and rendering servers and so on still exist. Sure, it would go down eventually, as all business does in the long run.
The real failure is not picking up new business along the way. With the N64 they showed they could design a consumer product. But outside of that, they were in no future consoles. 3dfx and ArtX were both founded by former SGI people. You don't need to stop selling workstations just because you make chips for consoles and other such devices. Nvidia barely survived and might not have if not for consoles. There are other markets where their expertise could have been applied.
Surviving changes like that often requires finding other markets. And then when it comes to making hard choices you need to cut part of the thing that is unprofitable. But this is really hard to do and in some ways it goes against the 90s US corporate philosophy of focusing only on the 'Core' business. DEC for example sold profitable business units that might have been helpful to have. DEC had a printer business and had the potential for a Database business. Oracle calls RDB one of the best acquisitions.
Commodore tried for a play where their - still much lower end - newest-generation chipset would have scaled (with the same chips) from being a full low-end computer, console, or set-top box (it had a PA-RISC core on chip, so it could run standalone), to a high-end graphics card for PCs, to the chipset for a higher-end Amiga, all at the same time.
They ran out of money - short maybe a few million - before being able to test if that would have worked as a way to widen their market.
I wish we'd gotten to see how that would have unfolded, though Commodore's deep management dysfunction probably still would have made it fail.
Legend says Commodore was at some point approached by Sun, which was willing to OEM their Amiga 3000/UX machines as entry-level Unix workstations and sell them through its distribution channels. The same accounts say Commodore management refused (maybe because the deal had a non-compete clause).
They also missed an (earlier) boat with the Commodore 900 workstation, which would run Mark Williams' Coherent OS (a Unix-like system).
Many Amiga engineers were UNIX heads, and even AmigaDOS/Workbench started as what might have been another take on UNIX-like systems with multimedia hardware, once they pivoted away from being a games machine.
There are some vintage computer club talks where they dive into this.
Commodore could have used Coherent as the Amiga OS.
On the other hand, thankfully they didn't take that approach, as otherwise the Amiga wouldn't have been the multimedia machine it turned out to be, or have had OS design decisions like its use of Libraries, Datatypes and everything being scriptable, which have yet to become mainstream.
With the exception of a few UNIX systems like Irix, Solaris with NeWS, NeXTSTEP and Apollo, everything else tends to be the same deck of cards reshuffled.
Commodore is one long history of management failures... As a big fan at the time, without insight into how dysfunctional it was, it was thoroughly depressing both to see from the outside, and then reading up on the internal chaos years later. It's a miracle they survived as long as they did, and at the same time there were so many fascinating lost opportunities.
ChuckMcM was at Sun at the time, and mentioned a while back he tried to get Sun to buy Commodore outright:
https://news.ycombinator.com/item?id=39585430 (his later replies in that sub-thread are also worth reading)
I think the real problem here is PC workstations, with Windows NT (later Linux) and an Intel chip, could do 90% of the SGI workstations for a fraction of the price. By workstation I mean an honest workstation with a high end graphics card, 10k RPM SCSI drives, and hundreds of megs of RAM.
The price of SGI workstations was "call us", which translates to tens of thousands of dollars per workstation. PC workstations didn't and couldn't replace all uses of SGI workstations. What SGI was not able to handle was the fact that their customers suddenly had a viable option besides paying them tens of thousands of dollars for their workstations. Even their Visual Workstations weren't a real improvement cost-wise, as those were still overpriced compared to competitors' PC workstations.
This is my recollection of the era as well. A quality PC like a Compaq was already a good alternative during the Motorola era. In a lot of ways the whole RISC thing was a dead end, as all that happened was SGI, IBM, HP and Sun cannibalised each other's sales.
ARM is the only one left standing from that era, which with hindsight seems so unlikely.
Keep in mind several of the others survived long past their public visibility. There were MIPS smart phones for a while. Both PPC and MIPS long outsold x86 in number of CPUs - just at low margin licenses with the bulk going into embedded uses, like ARM.
ARM had the advantage in that space of starting from the very low end, and being able to squeeze margins instead of being squeezed.
Don't forget IBM is still selling (and improving) their POWER family, running AIX (which had its low end eaten away by Linux) and IBMi (which is the only minicomputer family still standing). IBMi is extremely well defended as it's not even fully documented (unlike mainframes) and, therefore, doesn't even have a way to be emulated outside IBM. And I would not be surprised if that "secret sauce" is kept in an underground facility under a mountain in the middle of a desert, in a completely offline system behind multiple biometric locks.
ARM survived this long because it had a niche others couldn't break into (and still can't) as the highest performance per watt anywhere.
I think it would be more accurate to say the RISC workstation built entirely of custom low volume components was a dead end. Even if SGI decided to cut their margins to compete on cost their lowest prices would still be way above a PC workstation. Compaq benefitted from economies of scale that SGI could not.
RISC architectures live on today. Your house likely has dozens of MIPS chips in various components. You've got more ARM chips for sure but no shortage of other RISC chips in various components.
“Custom low volume components” is a marketing strategy. They could sell them to competitors and make money off the components. Late in its history SGI owned MIPS and sold their hardware to others. I have a WindowsCE laptop running on a MIPS R4000 processor. Runs NetBSD with effort, but with dignity.
IBM's market was also just far larger. SGI's sales were around 3 billion, Sun's were around 3x that, and IBM's were even more. SGI and Sun were based on Unix, so Linux could disrupt them far more easily than IBM's systems.
IBM also came into the game more vertically integrated. Having your own fab is expensive, but if you read up on what Sun and SGI had to do in order to get chips, that route also wasn't great.
In the beginning there was a chance that Sun and SGI could have merged; the reason it didn't happen was mostly that leadership couldn't agree on who would lead the combined company. Between them they duplicated a lot of technology while sitting right next to each other: both doing their own RISC chips, Sun at times doing their own graphics cards, competing in 'low'-priced and mid-priced workstations, incompatible Unix developments, and competing against each other in the large SMP market. If they had been together, things could have been different - a larger install base and more investment into the architecture might have given them a better chance.
Arguably, the shareholders don't want the management to try to harden the company. If the shareholders wanted to invest in a new approach, they could do that on their own. Rather, the shareholders expect the company to squeeze what they can out of what they are, even if that means winding down the firm.
Right. It's not actually in anyone's interest most of the time to deliberately destroy the company in the short term to have a glimmer of hope of maybe successfully reinventing it in the longer term. Shareholders always have the option of moving on and employees/management can probably collect paychecks for more time while they possibly look elsewhere. Corporate legal continuity is probably overrated.
Companies and management would rather a slow controlled descent into the ground than unexpected success. They balance their risk at the portfolio level not at the company level.
During good times focus on your core and then peter out. Just follow a nice parabolic arc.
A lot of success is just timing, and attempting to fight all threats looks foolish. And remember, for every up-and-comer that "we all saw coming" there were lots that didn't make it. If you waste time fighting those you wasted money.
I wonder if any major companies have experimented with an internal “red team” startup? It would be on level playing field with your upstart competitors, free from existing hierarchy and structure to develop disruptive tech like any other startup, only you are bankrolling them which gives them a head and shoulders boost relative to any other startup. Eventually you can let this startup grow to be more dominant than the old company and repeat the process with the next disruptive technology.
I've seen this in several mid-size companies but the trick to success is that you make a start-up in another vertical so your existing customers aren't dropping their contracts for the cheaper thing you disrupt yourself with. You're building a new business for the same market that already know you, so you can capitalize on your brand, using revenue from a dying but profitable model.
As a thought experiment, let's say Facebook did this.
The red team designs a new streamlined look with less click baity posts and fewer ads. Users flock to it, abandoning the existing platform. The new platform isn't monetized as effectively as the old one so revenue drops by billions per year - maybe an order of magnitude more than the new product brings in. Shareholders demand an answer as to what management are doing about it.
There might be some golden path, but that typically relies on finding a new market that doesn't interfere with the existing business (e.g. AWS and Amazon). However, the far easier options are a) shutdown the new service or b) increase its monetization, thereby starting the enshitification cycle.
"Maximally efficient is minimally robust."
A company that optimizes for efficiency will get stomped flat when the environment changes.
The problem is that there are no incentives in business to optimize for robustness.
Well, that’s partially because of the converse: a company that optimizes for robustness will get stomped flat before the environment changes to require robustness. Short term games are bad in the long term, but often good enough in the short term to win before the long term arrives.
Everyone has been reading that book since the late 90s.
I remember a talk by Clayton Christensen talking specifically about Intel and how they set up the Celeron division to compete with themselves (based on his advice).
A key property of tech, in economics lingo, is that it tends toward "natural monopolies" - all fixed cost and no variable cost.
This creates these winner takes all games. In this case both Intel, SGI plus others knew the rules and it just ended up with Intel taking the prize and it all becoming Wintel for a decade or so - basically until the smart phone allowed enough capital to be accrued to challenge the old monopoly.
Even just changing simple behaviors across a large company can be impossible. I worked at a company with several thousand employees, all required to attend mandatory training on "effective meetings" - several hours long, spread across thousands of employees. Rule 1 was to have an agenda for the meeting. This was something nobody did at this company, and you would regularly attend meetings unprepared.
After all that training, still nobody did it (ok I did and one other guy). That company couldn't change anything. It was amazing.
They had a project to make a department more proactive than reactive. The solution was to create a lot of bureaucracy around being proactive. As you can imagine, bureaucracy about being proactive was really just institutionalizing ... not being proactive.
I eventually left and work at a smaller company now. It's been refreshing for years now when we can decide "this process doesn't work, let's not do it anymore" and it just happens. Even just new coworker: "I won't be in tomorrow, who do I tell?", me: "you just did, have a great time off" seems revolutionary after being at a big company for so long.
I'm convinced that as the sheer number of humans increases, the friction to making real change in a company increases, and there's not much you can do. Fundamental change to respond to real disruption? Nigh impossible.
Does this apply to the US response to China's lead in battery and solar production? The Inflation Reduction Act is essentially a protectionist policy for manufacturing in response to Asian and Chinese tech manufacturing. I feel like we are trying to fight our way out of one of these situations. I'm concerned because every time I see something happening that excites me (the massive policy shift from enacting the IRA and the surge in American manufacturing) it usually means something really bad is about to happen.
There's no way the shareholders win this lawsuit. Violating fiduciary duty involves doing things you know won't provide value. Long term planning would be a defense for this.
The shareholders could absolutely oust those executives, though. And they may very well do so.
Having worked for a long time at a minicomputer company--which actually survived longer than most, mostly because of some storage innovations along with some high-end Unix initiatives--I can say it's really hard. You can't really kick a huge existing business to the curb, or otherwise say we're going to largely start over.
Kodak was not actually in a position to be big in digital. And, of course, the digital camera manufacturers mostly got eclipsed by smartphones anyway a decade or so later.
On the contrary, Kodak was well placed to do well by anticipating 'Moore's Law' as pertinent to sensor pixel density and sensitivity versus film. Film resolution was towards the end of intense development in pixel terms - not much further to go. They had pioneering patents and ongoing R&D would have enabled a long period of dominance during the transition and to this day!! The board and scientists were asleep on a mountain of cash, and they sold their future for a few crumbs left for shareholders after bankruptcy. Blackberry did much the same with fewer excuses. I met with some board members of Kodak in the 80's and they were like old English gentlemen - long on pomp and procedure, but they wore blinders and a vision bypass - TRIH.
Kodak was essentially a chemical company at one point. They even spun off an actual chemical company. Kodak could probably have played a better hand, even though they did try ahead-of-their-time things like PhotoCD. But could they have been Apple or maybe Instagram? That's a stretch.
I'm not a particular Kodak apologist but suggesting that a company should have been able to anticipate and correct for their business collapsing by 90% in a decade or so seems to need a lot of particulars.
They could have been a Sony. The iPhone camera sensor is made by Sony.
And Sony has certainly had rough patches too. And that's for a company coming from an electronics manufacturer angle.
Kodak could have spun off a consumer electronics or semiconductor manufacturing company. But it's not clear why that is actually a better model than someone else just spinning up similar entities.
I don't need all the chemical engineers and a lot of other people connected with the old business anyway. And I'm sure not turning them into semiconductor experts.
So you're one of the 10% of employees in HR who snuck through to the other side. Is that really a big deal?
That's right. The chief executives and the HR lady basically get transferred over to a new startup funded with Kodak's money and everyone else is fired.
Kodak did fine in the transition to digital. They made some popular compact cameras and tried to make DSLRs. They were wiped out by compact cameras being killed by smartphones. The survivors are the old camera makers like Canon and Nikon that have ecosystems. The other big survivor is Sony, which bought a camera company and makes most camera sensors.
Fuji is interesting: they weren't that successful with their first digital cameras, but now they have some interesting mirrorless ones. They still make film.
Fujifilm is a much smaller company than Kodak was. They also applied a lot of their expertise in emulsions to medical applications.
And, yes, they have some interesting if somewhat niche cameras.
Kodak was well aware of what was going to happen. Company culture killed digital photography.
I was at Apple when we worked with engineers from Kodak who were working to change various format standards to allow digital photos. This was in the late 1980s or early 1990s.
But, from the perspective of today, Kodak would have had to basically eclipse Apple.
Even displacing the big Japanese camera manufacturers, who by then had dominated high-end photography, would have required reversing decades of a shift away from high-end cameras like the Retina line.
I don't doubt there was company DNA against digital photography but it's not like non-smartphone photography, especially beyond relatively niche pro/prosumer level, has had such a good run recently either.
There is still a lot of business opportunity in supplying image sensors and lenses to smartphones.
But it is nowhere near as profitable as the 35mm film system was.
The 35mm system was a huge consumables business down through the food chain. That basically doesn't exist with digital. (Aside from ink jet supplies and I'm not sure how true even that is any longer.)
Data General?
Yes. CLARiiON eventually enabled a sale to EMC (which arguably saved EMC for a time) and the Unix business (especially NUMA servers) were sufficient revenue producers for a while to keep the lights on. ThinLiiNe (or whatever the capitalization was) never went anywhere but neither did a lot of things in the dot.com era.
I knew it :) thanks for confirming! And for sharing.
I was the PM for a bunch of the minicomputers from the mid-80s on. Then I was PM for the initial Unix AViiONs and later the NUMA servers including being one of the main liaisons with CLARiiON.
Seems like Sun really shot itself in the foot by not buying Data General. A Unix storage business that fit pretty well with their larger datacenter portfolio. And a start to a real x86 business.
I just finished reading 'LIFE UNDER THE SUN: My 20-Year Journey at Sun Microsystems' that talks about Sun and the storage businesses a bit. Sun was never happy with how well they did in storage. Sun was part of defining Fibre Channel.
For some reason that still doesn't make sense to me, they bought StorageTek for an incredible $4 billion at a time when Sun wasn't exactly flying high. The explanation from the CEO given in the book mentioned above is baffling.
Edit:
Seems they bought:
1999: MAXSTRAT Corporation, a company in Milpitas, California selling Fibre Channel storage servers.
Never heard of them. Not sure how comparable it was.
I remember Clayton Christensen mentioned that Andy Grove invited him to Intel to talk about how to deal with the dilemma, and interrupted Christensen while he was talking and said something like "I know the problem, and I need you to tell me the solution". Similarly, Peter Drucker repeatedly mentioned that one of the biggest challenges in business is "killing the cash cow". Along that line, Netflix's Reed Hastings is really amazing. He somehow managed to kill the DVD business, milking it to fund the streaming business, when almost everyone in the industry and some of his lieutenants at Netflix didn't believe him.
Yes and:
These years later, while the innovator's dilemma thesis describes what, there's still little treatment of why and how.
I keep wanting someone to account for the roles of investment and finance.
Amazon's innovation was lower cost of capital. They convinced investors to wait for returns. And they got a massive tax holiday. (How could they not succeed?)
Ditto Tesla, with its savvy moves like govt loans, prepurchases, tax incentives, and selling direct.
That cheap capital was necessary, but not sufficient. Both still had to create products customers wanted.
I keep coming back to Apple. How'd Apple avoid the trap, despite their terrible position? My guess is better financial strategy (if that's the right phrase). Apple focused on margins (and monopsony) instead of market share. And then leveraged their war chest to implement monopsony.
Apple also did not hesitate to kill their cash cow, supplanting the iPod with the iPhone (an observation made, I think, by Horace Dediu).
The iPod was killed quite late into the iPhone era, when its sales had plummeted to unsustainable levels.
By “killed” GP means “cannibalised” not “cancelled”. They out-competed the iPod while it was still hot.
Apple also recognized that one of its big problems was marketing to the end user, in particular in big box stores where not many people knew and appreciated what Macs could do. Creating their own stores delivered innovation and an experience, rather than less-than-knowledgeable staff and price-tag comparisons against the latest Dell PC. That meant letting go of some of the traditional stores and the chain of legacy Mac resellers. Of course, now, you can get Macs at Costco.
Investors only 'wait' if you can show revenue growth and large market.
Tesla was very capital efficient compared to how hard the task was, raising only just enough to get the next product out. By the time they were making big losses, you could see the Model 3 was going to turn things around once they reached scale. There was always a clear story of: if we do X next, that means Y.
I think by changing CEO radically at the right time. Whatever was going to work for Apple, it wasn't any of the typical things you would think of. I'm not at all a Jobs fan, but I don't think many people could have turned it around.
Jobs had the ability to come back and radically shift the company. They also had the benefit of 20 years of loyal customers; maybe the biggest asset Apple had was their loyal customer base (or captured market). Personally I can say my dad was a complete Apple fan; he would just buy Apple. He simply didn't care about higher performance or any of the other things.
Apple’s limited SKUs let them maximize their economies of scale. When Jobs took over, despite teeny market share, products like iMac and iBook were #1 in shipments, compared to Dell and their 300 models.
For a while you could view Netflix online and rent DVDs from them.
Oh dang, I thought you still could. Looks like they shut down the DVD rentals about 6 months ago.
There are now a couple of smaller disc-by-mail startups who aim to serve that market. I've signed up with DVDInBox out of Florida and they've done a good job thus far.
From what I've seen their biggest current challenge is their mailer. Netflix spent a lot of time designing their signature red envelopes, working with the USPS on ensuring they would survive a trip through the postal sorting machines. DVDInBox has yet to reproduce it - their envelopes sometimes arrive fairly mangled, or have not arrived at all (lost to the bowels of the machines).
I think it's a near impossible situation - the status quo is literally something that should not exist given the new market realities. Pivoting is pretty much asking a company to commit seppuku - asking the layers of leadership to basically replace themselves and quit in many cases. Which is pretty much what happens anyway.
And, just like every Unix workstation vendor of the 1990s, they got hit with a perfect storm. They had their hardware being disrupted by x86 really coming into its own as a viable option for higher-end computing at the exact same time that Linux was becoming a serious option for the operating system.
"Literally something that should not exist" is the perfect way of putting it. In 1990, lots of people needed boutique workstation vendors. In 2000, nobody did.
Even Apple, which became the unlikely last of the Unix workstation vendors, was disrupted by Intel (and moved from PowerPC to x86 for a while). Ironically, Apple is now the very last Unix workstation vendor in existence.
Yes, although they have clearly not always been committed to the Mac Pro over the years, it still exists and the pricing is certainly workstation like.
Every Mac is a Unix workstation these days. I work on a Macbook Pro and spend an inordinate amount of time running Unixy stuff in the terminal.
It's worse than that. Instead of "nobody" it was conservative slow moving vendor locked clients that could convince you to keep selling at those prices ("look at the margins!") instead of focusing on alternatives. I remember $5,000+ PCs being considered "cheap" workstations when those clients finally migrated.
Probably the lack of vision is not just failing to turn in the direction of new "products" but failing to acquire, digest and eliminate those businesses that start to grow, before they get too big. See Microchip, for example: how many relatively small semiconductor and technology manufacturers it has already eaten.
And, at some point, what does it matter if the leadership and most of the employees turn over, typically involuntarily?
Is there any significance really to Foot Locker basically being a reorganized Woolworth's as opposed to being a brand-new company?
If you're big enough and have some product lines that still bring in a lot of money and didn't totally collapse like IBM you can sometimes pull it off. But it's hard.
This is conventional wisdom (and thus, usually correct).
However, it's always interesting to look at counterexamples: Beretta, for example (in business for 500 years).
https://www.albertcory.io/lets-do-have-hindsight
or the IBM PC, which cannibalized IBM's business, at least in IBM's mind. Thus, they screwed it up and let Wintel make the real billions. So it worked, until they took your advice and decided that had to stop.
They sort of tried. Around then they had a Windows NT machine that cost around US$12,000. But it was too late. The first serious graphics cards for PCs were appearing, from Matrox and others, with prices of a few thousand dollars.
(I tried some early NT graphics cards on a Pentium Pro machine. This was before gamer GPUs; these were pro cards from tiny operations. Fujitsu tried unsuccessfully to get into that business, with a small business unit in Silicon Valley. At one point they loaned me a Fujitsu Sapphire graphics card prototype. When I went back to their office to return it, the office had closed.)
Also, there was a bad real estate deal. SGI owned a lot of land where Google HQ is now. They sold it to Goldman Sachs in a sale and lease-back transaction, selling at the bottom of the market. That land, the area north of US 101 in Mountain View had, and has, a special property tax break. It's the "Shoreline Regional Park Community", set up in 1969. The area used to be a dump. Those hills near Google HQ are piles of trash. So there was a tax deal to get companies to locate there. That made the land especially valuable.
SGI tried its hand at the PC video card business as early as 1990. I was at Autodesk at the time and got one of these to beta test on a DOS 486 running AutoCAD. It was an impressive product. But huge; it took up two full-length ISA slots. And the display drivers were a bit buggy.
I wish they ported IRIX to x86. You can make more money by making stuff for Windows, but it won't protect you from market erosion.
I doubt that would have helped. Customers didn't love Irix as an OS. They tolerated it in order to use SGI's superior hardware and specialized applications.
Competitors such as Sun did port their proprietary Unix flavors to the x86 PC platform but never achieved traction. It was impossible to compete with free Linux, and lack of device drivers was always an obstacle.
Kind of. Oxide uses what's left of OpenSolaris as their OS of choice and Oracle still sells Solaris SPARC servers. It's the retreat up - they can defend their hill for a long time that way and, in Oracle's case, they don't even innovate the same way IBM does with POWER and Z.
Sounds just like Nvidia in 2024
Except Nvidia's modern cards are even bigger.
Here's a brochure for the IrisVision boards - uses 2 ISA or Microchannel slots.
Prices start at $3,495
https://www.1000bit.it/js/web/viewer.html?file=%2Fad%2Fbro%2...
In the late 90s I was in the last year of high school. Silicon Graphics came to do a demo of their hardware for students that were interested in taking a computer science course at university in the following year.
The graphics demos looked like trash, basically just untextured and badly shaded plain colored objects rotating on the screen. For reference I was playing Quake III around the time which had detailed textures and dynamic lighting.
I asked the SGI presenter what one of his Indigo workstations cost. He said $40,000, not including the graphics card! That’s extra.
I laughed in his face and walked out.
In the late 90s, SGI demos were much more impressive than what you describe. It was used by technical folks to do real stuff, with stringent criteria.
More importantly, the things that made Quake III so great were state-of-the-art for gaming. But those things couldn't render lines quickly and well (a mainstay of CAD at the time), or render at very high resolution (which IIRC was 1280x1024 in that era).
Here's what Carmack said about the SGIs a few years before: """SGI Infinite reality: ($100000+) Fill rate from hell. Polygons from hell. If you don't trip up on state changes, nothing will come within shouting distance of this system. You would expect that.""" SGI was also key for map builds before PCs were capable.
But yes, 1999-2000 was just around the cusp of when SGI went from "amazing" to "meh".
If I remember correctly, their cards had a high compute rate even at double precision, but had tiny memory buffers and basically couldn’t do texturing at all in practice.
It turned out that double precision was a mistake that was sold as a “professional” feature. By sharing edges correctly and using the correct rounding modes, single precision provides pixel-perfect rendering. Efficiencies like this allowed the consumer GPUs to run circles around SGI hardware.
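To make that concrete, here's a minimal sketch (my own illustration, not SGI or game code) of the shared-edge argument: if two triangles that share an edge evaluate the exact same single-precision edge function, each pixel falls deterministically on one side or the other, so coverage is watertight without any doubles.

    # Hypothetical illustration: watertight rasterization from a shared
    # single-precision edge function (no double precision needed).
    import numpy as np

    def edge(a, b, p):
        # Signed area of (a, b, p) in float32; the sign says which side
        # of the directed edge a->b the point p lies on.
        ax, ay, bx, by, px, py = map(np.float32, (*a, *b, *p))
        return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

    a, b = (0.1, 0.2), (13.7, 9.3)        # edge shared by two triangles
    for p in [(5.05, 3.33), (2.5, 1.9)]:  # sample pixel centers
        e = edge(a, b, p)
        # Both triangles compute the identical value e, so each pixel is
        # assigned to exactly one of them - no cracks, no double hits.
        print(p, "left of edge" if e > 0 else "right of edge (or on it)")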
They did texturing just fine.
Most SGI graphics cards had no texture ram at all, and had zero texturing capability. At the time there was one card that had 4 MB of texture RAM, but in the same year a typical PC GPU had between 16 or 32 MB of shared memory, most of which was used for textures.
A “budget” SGI card cost more than my parents’ car. I bought three different GPUs by that point with my pocket money.
I honestly don't know what you're talking about - I wrote many programs that did high-performance texture mapping on SGIs, and they had both texturing capability and RAM. When you say "SGI card", it sounds like you're talking about something other than an Octane2 or InfiniteReality.
The curve that maps fucking around with finding out is not linear. By the time you start finding out, it's very hard to stop finding out much more than you would like to.
It was clear they were trying to do more consumer grade things, just look at the N64. Couldn't get more mainstream than that. Seeing how the graphics market ended up, it looks obvious from here but in the mid 90's it was still the wild west and everybody was throwing mud at the wall seeing what would stick.
I have never really said that they were "taken by surprise", but part of it felt like (from the outside) that management had been a little blinded by their past success and the profit margins from their workstations, combined with no clear path forward for the whole industry. Nvidia could very easily have been just a curiosity of the past, but they managed to strike it lucky standing on the shoulders of others.
If SGI had always been a company that could provide graphics workstations that worked with x86/Windows PCs early on, for example - maybe they would have fared better. They would have gone with the flow of technology at the time rather than fighting uphill, no matter the potential technical brilliance. But being saddled with their MIPS processors and custom OS meant that once people left, they almost never came back. One can have the best tech and still fail.
Intergraph started making PCs with high-end graphics at one point, when they abandoned CLIX and gave up on their (Fairchild's, really) Clipper processor. It didn't work for them either. SGI did their own "Visual Workstation" that ran Windows and had a Pentium, but that too was a huge disappointment.
The 320 and 540 had a few nice things going for them: You could option them with the same 1600SW monitor that the higher-end O2 workstations used without having to use the fiddly multilink. SGI paid Microsoft for a HAL license and did a better job with multiprocessor than what you got from installing the vanilla version.
They had decent bandwidth internally allowing them to playback uncompressed realtime standard definition video which normal PCs running Windows couldn’t do at the time.
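For a rough sense of scale (my numbers, not from the original comment, assuming CCIR 601 resolution and 8-bit 4:2:2 sampling): uncompressed standard-definition video works out to roughly 20 MB/s sustained in each direction, which was a real strain for a late-90s PC's shared PCI bus and disk subsystem.

    # Back-of-the-envelope estimate (assumed format, for illustration only)
    width, height = 720, 486        # active picture, NTSC (CCIR 601)
    bytes_per_pixel = 2             # 8-bit 4:2:2 (YUYV)
    fps = 30000 / 1001              # ~29.97 frames per second
    mb_per_s = width * height * bytes_per_pixel * fps / 1e6
    print(f"~{mb_per_s:.0f} MB/s sustained")   # ~21 MB/s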
The moment your product runs Windows, it'll compete with thousands of others. Being able to do one single thing better than the others won't help you much unless that one single thing can drive all your sales.
Even for video rendering, if your box is twice as fast, it'll be outcompeted by machines that cost half as much or less. At times when my desktop PC was not fast enough, it was simpler to get another PC and run the time-consuming jobs on it while I did other things on the first.
Yes, but the team that did that also left SGI, then worked directly with Nintendo on the GameCube and was acquired by ATI. I'm not sure how SGI managed not to support that effort internally.
Nintendo wasn't loyal to the company; it was loyal to the team. So when they decided to leave and form ArtX, they took the customer with them... SGI was happy with the Nintendo contract - they earned $1 in additional royalties for every single N64 cartridge sold worldwide. Losing the team was a big blow.
Someone at SGI wrote a paper/web page/blog post titled "Pecked To Death By Ducks", claiming that x86 chips could never compete with SGI, and claiming to refute all the arguments that they could.
Then Intel introduced dual core (or maybe just two chips in one housing sharing a bus), and that generated a lot of buzz. So he wrote a follow-up titled "Pecked To Death By Ducks With Two Bills".
I don't recall the timing, though, how it related to the timing of asking everyone to read The Innovator's Dilemma. But at least some of the time, there was a pretty deep denial (or at least a pretty deep effort to keep the customers in denial).
That’s really funny for some reason.
IIRC, "Pecked to Death by Ducks" is the title of either a short (nonfiction) story or a book by Gerald Durrell, one of my favorite childhood authors.
I don't recall that one, and I thought I knew Gerald Durrell's work pretty thoroughly. There is a book of that title by Tim Cahill, though; maybe that's what you're remembering?
Yesss! Thank you. I'm embarrassed because I used to have that one, too :)
On the upside, I've never known anyone else who had even heard of Durrell.
I would have thought that lots of folks had watched the "The Durrells in Corfu" TV series (PBS Masterpiece Theatre) and then did some research on the family.
I don't know whether this is available online, but I can recommend it as a pleasant programme, with lovely scenery, interesting storylines, and engaging actors.
A bit like Seymour Cray's plowing a field with 1024 chickens instead of two oxen.
I remember reading how Andy Grove, IIRC, spent two years trying to exit the DRAM business. He would flat out order people to stop working on it and they wouldn’t listen, believe or understand. The amount of inertia for large bodies is remarkable.
And yet there are stories of people ignoring the company direction, continuing work, and saving the company.
See: The Blue LED https://youtu.be/AF8d72mA41M?feature=shared
So it's not clear that the company knows better. Feels like educated guesses, but a lot of luck involved.
Very true
Why was exiting the DRAM business such an obvious positive?
Intel is basically at its root PC hardware. Yes, it doesn't dominate the entire industry, but memory is pretty key to a PC and the OS that runs on it.
And fundamentally it's just another fab product, like chipsets.
I guess an interesting question would be whether Nvidia is SGI in the late 90s or Intel of 80s.
I have been comparing them to SGI of the late 90s but the next 18 months will prove me right or wrong as Intel, Apple, Google and AMD try to compete.
SGI was a pretty small 3D workstation company at the bottom of the S&P 500, even at its peak. Microsoft was at least a hundred times as big as SGI. (And Nvidia was even smaller, selling GPUs for both SGI workstations and Windows PCs at the time.) Now Nvidia is a titan at #5 in the S&P. They could certainly be taken down a notch or two as CUDA alternatives mature and people start buying cheaper AMD/Intel/FPGA (Groq, etc.) hardware. But they're the best at what they do, with the best people. They don't really have a doomed business model the way SGI's "take out a second mortgage for the Onyx workstation to make N64 games" business model was doomed. I don't buy Nvidia stock, personally, but I especially wouldn't bet against them long term either.
I get the vague feeling they are more like Intel in the 80s. Nvidia has had a real talent - partly through their own smart moves and partly through pure luck - for stumbling from one win to another. Seeing them fall into the crypto boom and then straight onto the AI boom has been wild to watch.
I was making crazy money in the dot-com boom and bought a SGI 540 in 1999 (with an SGI monitor).
With money to burn, and SGI being a childhood brand, legends in 3D - such wonderful memories. 15k on a desktop setup was loose change, which shows how clueless I was back then. Still, I felt like I'd "arrived".
SGI with Windows NT - lol - I wrote my first OpenGL game in Visual Basic... I've always been somewhat of an outlier ;-) God help me.
The point? My personal experience says something about the strength of the SGI brand - even in the face of what was happening at the time (3DFX and so on - my previous company was one of the few 3DFX api devs - illustrating how clueless I was...)... it all happened so quickly... I'm not surprised SGI couldn't respond - or more importantly understand the strength of Microsoft/OpenGL/DirectX in the boiling pot of 3DFX / Nvidia and the rest... From memory it took three years and SGI was done - shared memory architecture? No longer worth the cost. :-(
Looking back, I was such a kid - a complete fool.
Greybeard advice: bet on the monopoly. Be smart. Brands like SGI are nothing in the face of install base. Think about how crazy it was to spend 15k on a desktop SGI back then... nostalgia is insanity, vanity.
Yeah but it still sounds really cool!
From 1991 when I first saw SunOS I wanted a SPARCstation. I started college in 1993 and the school was full of DEC Ultrix machines, Suns, HP PA-RISC, and a handful of IBM RS/6000 and SGIs.
I just thought DOS/Windows PCs were such garbage. Single user, no preemptive multitasking, no memory protection. Then Linux came out and it changed everything. I bought a PC just to run Linux. My dream of a Unix RISC workstation faded away.
My roommate in 1996 bought a DEC Alpha. Not the cheaper Multia but an Alpha that could run OSF/1 Digital Unix. He actually ran NetBSD on it.
In 1997 I took the computer graphics class and we used SGIs. There was just one lab of them reserved for that class and grad students. I was so excited and it was really cool but I didn't think I could ever afford one. It's still really cool though that you had one.
Linux on RISC-V ... Its happening ... eventually ... probably.
The one thing that I always have to point out to folks who didn't live through it. The pace of change in everything was so rapid, especially in the graphics space.
It is wild to think that in games for instance, we went from Quake in 1996 running software rendering to Quake 3 requiring a GPU only 3 years later and that becoming the standard in a matter of months.
I worked a lot with SGI hardware and IRIX in the 90s and very early 2000s. I ported a lot of code to and from Linux in that time.
I remember trying to explain to SGI reps that while we loved the SGI hardware, Linux on commodity x86 was the increasingly more effective choice for our projects. We wanted to buy SGI x86 boxes but they were pushing NT.
It was very apparent that the SGI salesmen knew which way the wind was blowing, and they were milking the last bits of the commission gravy train.
Even when everyone understands “The Innovator’s Dilemma”, the incentives of salesmen and shareholders can be to optimize for extracting value out of the dying company. If I am a shareholder in a company being out innovated, it might make sense to want them to maximize short term profits while reallocating capital to the innovators instead of trying to improve the dying company.
Seems like the issue is a shift from high-margin, sales-representative-led growth to a low-margin mass market where you don't need salespeople to convince you to buy it. Then there is no way to justify the sales team, so they don't support it. The company is beholden to them, and they sabotage the move to the mass market.
That is certainly one case, maybe the most likely. But in another case I was involved with, sales just didn't understand the innovation. They could have potentially made just as much or more money with the new thing, but were too comfortable selling the old thing.
Could this eventuality happen to Google, Nvidia, or Apple?
Why not? Google's only been a powerhouse for the last decade or so.
The way to survive is to eat your own lunch. Be the low-cost competitor and cannibalize your own market.
Otherwise you don’t build the iPhone because you don’t want to lose iPod sales.
A not-so-obvious but similar example of this would be Nintendo and the Switch. They had always kept their handhelds and consoles as separate things but eventually decided to just combine them into a single thing. So instead of double dipping on the hardware and potentially the software sales, it just became a single entity. And to say that has worked out well for them is an understatement: 139 million units sold, and they are still taking their time getting to a successor. They will probably end up with the #1 highest-selling games machine of all time, and they did it by going against their old business practices.
I believe having a talented, dictatorial leader at the top may be the only solution - like Steve Jobs, Bill Gates or Jeff Bezos. Once they believe in a path, they have methods to get it done. The Internet Tidal Wave memo is a good example of it. Zuckerberg is able to invest hundreds of billions on a future he believes in.
Obviously the observation has a confirmation bias.
Companies, through their usual hierarchical structuring, are authoritarian by nature. However, many leaders are not actual leaders - they are just bureaucrats - and that is the reason why things get stuck.
When statements like "X was better than Y" come up, I always think of "All models are wrong, but some are useful", from statistician George E. P. Box, and rephrase it as "All models are flawed, but some are useful" so that "model" becomes ambiguous - a smartphone, TV, computer, car, programming language, programming design pattern, social media platform, and so on.
At its price point, SGI's technology was a financially flawed model for the growing market, even if its performance was more useful than the flawed performance of the low-cost technology on the market.
Did anyone at SGI try to simply buy the low-tech products, play with them a bit, and see about slowly integrating their tech to make that low-tech product just a little better than the competition and cost-effective for the market?
Thanks for this comment, very much appreciated.
They had the best processor on the market, yet they decided to sell Intel and Windows. I really don't understand what they were smoking.
I was at one of SGI's competitors and we had teams doing cheap HW - ATi and other cards, like IBM's GPUs at the time - and yet the company as a whole was like "ALL GOOD LET'S KEEP BUILDING MASSIVE CUSTOM GRAPHICS SYSTEMS!"
They were as dead as SGI in the same timeframe.
Can you guess why? Was it maybe outmoded or outdated sales models that didn’t take reasonable competition into account.
I’m so sick of dancing around this topic. Managers and business types destroy companies. It never stops. Company after company. How many times do we have to pretend this isn’t the case before we see the truth.
Innovators make things worth selling. Business types drop it on the floor.
Walking the streets of NYC 19 years ago, I saw an SGI Indigo on the street, and sadly I just kept walking. Like wow.
This is even a major point of discussion in the book. The incumbents always see it coming a mile away. They can't respond because doing so breaks their business model. Employees can't go to managers and say, "We need to enter this low-margin market for low-end products." Doing so is a good way to get sidelined or fired.
The "dilemma" is that either traditional option, compete with the upstarts or move further up-market, leads to the same outcome - death.