
Intel's anti-upgrade tricks defeated with Kapton tape

donatj
60 replies
2d3h

What is the financial benefit to Intel in artificially limiting its CPU sockets like this? Logic and reason would have me believe they'd want to sell as many CPUs as possible, and keeping the socket compatible for as long as possible would seem logical.

My thoughts on reasons they might have done this. I honestly have no idea, these are just uninformed guesses.

- The Charitable Answer: There actually is some sort of minor incompatibility like a voltage difference to some pin where it can still boot but it's maybe not good for the CPU?

- The Practical Answer: They make more off the associated sockets/compass direction bridges (i.e. north/south bridge chipsets)/etc than they would off increased numbers of CPU upgrades.

- The Cynical Answer: Motherboard manufacturers paid them off

- The Cynical Practical Answer: They have a schedule for socket changes as some sort of deal with motherboard manufacturers and some engineers decided to do so in the laziest way possible

- The Silly Answer: They're an evil corporation and want you to suffer

fizzynut
27 replies
2d3h

Intel basically made the same CPU for about 6 years straight because of 10nm process issues.

They had to keep pretending the next-gen "Lake" CPU was substantially different from the last, so they just took the last-gen product, made some minor tweaks to break compatibility, and called it a new generation.

_the_inflator
26 replies
2d3h

Same goes for most cars. No real revolution, just tweaks or changes due to regulatory demands, but nothing groundbreaking.

jeffhuys
19 replies
2d3h

Still, when you’re due for a new car and look for the newest of the newest, would you go with manufacturer A, who released their latest car 8 years ago, or manufacturer B, who released it 1 year ago?

Incremental upgrades get so much hate around the internet (mostly about phones) from people who have the version right before it, saying things like “ah they changed almost nothing! Why would I upgrade?!” Meanwhile someone like me, only on my 3rd smartphone EVER, would love all the incremental updates over the years when I finally decide I need a new one, because I always get the latest and greatest. If a company then doesn’t release anything for a few years, I’d go somewhere else.

traverseda
11 replies
2d2h

The one with reliability data for the past 8 years.

It's surprising to me that people would want to make a major financial decision like a car without knowing about its reliability history.

Groxx
5 replies
2d1h

8 years of the same parts, repair knowledge, and continued software support?

Sign me up immediately.

Zambyte
4 replies
2d1h

> continued software support

Unfortunately due to the extremely minimal software rights that exist (see: proprietary software) this is pretty much nonexistent in cars AFAIK.

I would rather get a car that is old enough to not be limited by software constraints. Which is pretty disappointing, because I actually really like electric cars. I think they would work well for my needs. But they are all so intentionally kneecapped, I have no interest in any particular model that's available.

nehal3m
3 replies
1d22h

I would love a super bare bones electric car. One that functions the same as any late 90s/early 2000s era car would, except with an electric power train and maybe cruise control.

sitkack
1 replies
1d17h

2011-2013? Nissan Leaf fits the bill.

londons_explore
0 replies
1d12h

This.

It's so thoroughly reverse engineered that if you decided you wanted to reconfigure the software so it would only drive at prime numbered miles per hour, you could.

constantcrying
0 replies
1d6h

Those cars are essentially illegal to sell in many countries.

randomdata
2 replies
1d21h

> It's surprising to me that people would want to make a major financial decision like a car without knowing about its reliability history.

Some people will always be surprising, but it is pretty clear that the pickup truck is the most purchased type of vehicle (in North America) exactly because they have a much better reliability track record as compared to most cars. This idea doesn't escape the typical buyer.

tyre
1 replies
1d19h

> it is pretty clear that the pickup truck is the most purchased type of vehicle (in North America) exactly because they have a much better reliability track record as compared to most cars

The pickup truck is also deeply engrained in American culture as masculine, even if the owner does nothing that requires it.

randomdata
0 replies
1d19h

Yes, it seems most products that gain a reputation for reliability end up taking on a "masculine persona", of sorts. Which I guess stands to reason as in popular culture reliability is what defines the "manly man".

jandrese
1 replies
1d23h

What if the reliability is like "these bearings are known to fail every 10k miles or so, but we have no product refresh planned for at least 3 years so the problem will remain unresolved"?

This is what incremental improvements are supposed to be. Well that and discovering that the vehicle can last till the end of the warranty period with one less bolt in that spot, so you can eliminate it.

bestham
0 replies
13h3m

You are wrong and right. Wrong because continuous improvement is not about making a vehicle only survive a 3-5 year warranty period. Right because the master of continuous improvement has a 10-year warranty (20,000 km, 12,500 miles) where I live, if you do service at a dealership or authorised service centre, and I think this extended warranty influences the decisions about what minimum level of quality the manufacturer will accept.

yellow_postit
0 replies
2d1h

Buying first gen models is always a crapshoot. Often same for last gen if they try to squeeze new capabilities into a platform it wasn’t intended for.

Tesla is particularly terrible but this has been true for every manufacturer.

You want a couple years for them to work out the kinks.

xattt
0 replies
2d2h

Case in point: A new car released 8 years ago, but with incremental upgrades (i.e. Mitsubishi RVR in NA), still won’t have the same fundamental design considerations around safety or fuel efficiency as a more recent model.

nfriedly
0 replies
2d

Honestly, from a reliability standpoint, the ideal new car is one that had a major refresh ~2 years ago. By then most of the kinks should be worked out.

Or just pay attention to the warranty. If they guarantee it for 10 years, they probably expect it to run for 10+ years.

graemep
0 replies
2d2h

In the case of cars and CPUs it's not that people mind incremental upgrades, it's that they mind incremental upgrades sold as big upgrades.

For phones the mindset of people who upgrade when they have to and/or buy cheaper phones is very different from those who regularly upgrade to the latest flagship phone.

boplicity
0 replies
2d

Car buyers aren't always so dumb. When we bought our car, I was fully aware that major updates to models happen only so often. We bought used (of course), and the "major update" was our main criterion, more so than the specific release year. (We bought a 2014 model in 2018; 2014 was the year they released significant safety improvements over the 2013 model.)

ars
0 replies
1d22h

For sure A. I would never buy a car that is the first model year of a revamp. I would give them at least a year to work out the bugs.

Dylan16807
0 replies
1d23h

New models every year are fine if they're honestly labeled and have technologically reasonable compatibility.

Cars and phones meet those criteria a lot better than Intel CPUs. The problem isn't releasing things, it's the way they pretend every release is a big advance and the way they make the motherboards almost never able to upgrade.

lazide
1 replies
2d2h

Luckily it’s not common to need to replace your garage every time you get a new car.

someguydave
0 replies
2d2h

In this analogy, Intel sells the parts to make garages too

BobaFloutist
1 replies
2d1h

I mean I can't speak to ICE cars, but electric car ranges seem to scale pretty dramatically with how new they are.

4jertdhf
0 replies
1d20h

There haven't been significant combustion engine efficiency changes in a long time. My scrapbox from 2007 still goes 550 miles on a tank of diesel, about the same as my 1997 car did before it.

settsu
0 replies
2d

This is arguably exactly what most people actually need in a vehicle they're spending thousands of dollars on: accumulated refinements seamlessly incorporated over time.

Year over year this typically results in good outcomes on a purely practical basis. However, it just inherently makes for very boring publicity/promotional material.

Edit to add: it can also admittedly result in older solutions getting baked in, which prevents larger beneficial changes. (Toyota's North American 4Runner and Tacoma models might be good real-world examples of this approach producing generally high reliability, but also of larger, "riskier" changes eventually becoming necessary.)

constantcrying
0 replies
1d6h

Cars don't get a new model every year. They are even called "facelifts" to make it clear that it is essentially the same with minor modifications and upgrades.

Also, aside from the EV switch, there isn't much "groundbreaking" you can do in a car; the industry has existed on many small upgrades over time. (Like many other industries.)

grandinj
8 replies
2d3h

It costs money to validate a new processor on an old motherboard, and no corp wants to waste money on a product they have already sold.

vel0city
1 replies
2d2h

It costs money to validate a new processor on an older motherboard design. How much does it cost to make a completely new motherboard design?

wmf
0 replies
2d1h

They're going to make new motherboards every year regardless, so any support for old motherboards is in addition to that.

lupire
1 replies
2d3h

The new processor isn't sold yet.

thfuran
0 replies
2d3h

Sure, but why pay to validate on old boards when you can instead get many of their users to pay you for new ones?

crote
1 replies
2d2h

AMD ran into significant compatibility issues with AM4, with some motherboards not being able to supply the amount of power needed by the newer CPUs and PCI-E Gen 4 support being removed in the final BIOS release due to reliability issues. A lot of motherboards also didn't have enough flash space to contain the firmware needed for the whole CPU range, so the update to support the newer gen had to remove support for the oldest gen.

Turns out it's really hard to guarantee compatibility with several years of third-party products which weren't designed with your new fancy features in mind.

Dylan16807
0 replies
1d23h

Newer doesn't usually mean more power, though. You can just have a power limit, it's fine. And I don't expect PCIe to get faster with a CPU upgrade anyway.

The flash space was pretty foreseeable and they dealt with it, more of an excuse than anything.

gregmac
0 replies
2d3h

This is my thought, too.

I'd bet customer perception is also a factor; there's a risk the old boards (not even made by Intel) die and cause problems, and Intel wouldn't want to deal with press/comments like "This CPU stopped working after 2 months" or "I installed this new CPU, and within 2 months it killed my motherboard that had been working fine for 7 years".

They've released CPU upgrades with the same socket before, I'm sure they have the sales data to know how that performs vs new socket.

Laptops have outsold desktops for well over a decade, and their CPUs are pretty much non-upgradable. I can't easily find a nice chart to reference, but intuition tells me the desktop industry is similarly trending towards complete "system" sales vs individual parts. In other words: Most people don't upgrade their CPU, they upgrade by replacing the entire system. If true, this also means the socket would be almost entirely irrelevant to sales performance.

babypuncher
0 replies
2d

Motherboard manufacturers don't seem to mind doing it for AMD chips

Laforet
7 replies
2d2h

Intel actually intended for LGA1151 to remain unchanged for Coffee Lake but found out late in the testing process that many existing motherboards did not have enough power delivery capability to support the planned 6 and 8 core parts. Hence the decision to lock them out in software only. They are probably aware of the bad optics but decided that it’s better than trying to deal with the RMAs later.

It’s very similar to what had happened in 2006 when the 65nm Core 2 series were released in the same LGA775 package used by 90nm Pentium 4s, however the former mandated a specific VRM standard that not all contemporary motherboards supported. Later 45nm parts pretty much required a new motherboard despite having the same socket, again due to power supply issues.

AMD went the other route when they first introduced their 12 and 16 core parts to the AM4 socket. A lot of older motherboards were clearly struggling to cope with the power draw but AMD got to keep their implicit promise of all-round compatibility. Later on AMD tried to silently drop support for older motherboards when the Ryzen 5000 series were introduced but had to back down after some backlash. Unlike the blue brand they could not afford to offend the fanboys.

P.S. Despite the usual complaints, most previous Intel socket changes actually had valid technical reasons for them:

- LGA1155: Major change to integrated GPU, also fixed the weird pin assignment of LGA1156 which made board layout a major pain.

- LGA1150: Introduction of on-die voltage regulation (FIVR)

- LGA1151: Initial support for DDR4 and separate clock domains

This leaves the LGA1200 as the only example where there really isn’t any justification for its existence.

Dylan16807
2 replies
1d23h

Why does a socket that can support DDR3 or DDR4 need to be different from a socket that only supports DDR3?

And with the current socket being 1700, they're going to change it again for the next generation to 1851, and with a quick look I don't see any feature changes that are motivating the change. (Upgrading 4 of the PCIe lanes to match the speed of the other 16 definitely does not count as something that motivates a socket change.)

So by my reckoning, half their desktop socket changes in the last decade have been unnecessary.

qball
1 replies
1d22h

Because DDR4 is electrically different and memory controllers are all on-die.

Intel could get away with doing that pre-Nehalem because the memory was connected via the northbridge and not directly (which is what AMD was doing at the time; their CPUs outperformed Intel's partially due to that), so the CPU could be memory-agnostic.

AMD would later need to switch to a new socket to run DDR3 RAM, but that socket was physically compatible with AM2 (AM3 CPUs would have both DDR2 and DDR3 memory controllers and switch depending on which memory they were paired with; AM3+ CPUs would do away with that though).

There were some benefits to doing that; the last time Intel realized them was in 2001 when RD-RAM turned out to be a dead-end. Socket 423 processors would ultimately prove compatible with RDRAM, SDRAM, and DDR SDRAM.

Dylan16807
0 replies
1d22h

> Because DDR4 is electrically different and memory controllers are all on-die.

Them being on-die is exactly why you don't need a socket change to take full advantage of DDR4, since they directly showed a socket can support both at once. Unless you're particularly worried about people trying to buy a new motherboard for their existing CPU, but who does that? You can tell them no without blocking CPU upgrades for everyone else.

> pre-Nehalem

> AM3 CPUs would have both DDR2 and DDR3 memory controllers

LGA1151 supported both DDR3 and DDR4.

qball
1 replies
1d23h

> LGA1155: Major change to integrated GPU, also fixed the weird pin assignment of LGA1156 which made board layout a major pain.

Of course, the P67 chipset was trivially electrically compatible with LGA1156 CPUs; Asrock's P67 Transformer motherboard proved that conclusively.

That said, the main problem with 1155 was their locking down the clock dividers, so the BCLK overclocking you could do with 1156 platforms was completely removed (even though every chip in the Sandy Bridge lineup could do 4.4GHz without any problem). This was the beginning of the "we're intentionally limiting our processor performance due to zero competition" days.

> LGA1150: Introduction of on-die voltage regulation (FIVR)

Which they would proceed to remove from the die in later generations, if I recall correctly. (And yes, Haswell was a generation with ~0% IPC uplift so no big loss there, but still.)

Laforet
0 replies
1d20h

> P67 chipset was trivially electrically compatible with LGA1156 CPUs

Well, it's possible for the determined to shoehorn in support, but iGPU support is definitely out of reach and I'm not sure what segment of the market that is targeted at. Seems like an excuse for AsRock to get rid of their excess stock. The socket change was actually very well received by everybody in the industry.

> Haswell was a generation with ~0% IPC uplift so no big loss there

You are right that FIVR did not last long in that particular iteration. However Haswell does have a 10% to 30% IPC advantage over the previous gen depending on the test[1].

Haswell also added AVX2 instructions which means that it will still run the latest games whereas anything older is up to the whims of the developer (and sometimes denuvo, sadly)

https://www.anandtech.com/show/9483/intel-skylake-review-670...

xxs
0 replies
1d8h

> many existing motherboards did not have enough power delivery capability to support the planned 6 and 8 core part

Coffee Lake's 8700K had 6 cores, not 8; the Coffee Lake Refresh parts did feature 8 cores. The release TDP of the 8700K was the same as the 6700K's, 95W; of course it consumed more at peak, and when overclocked.

The support would still depend on the motherboard manufacturers adding the CPU support (along w/ the microcode) to their BIOS. Many high-end boards, e.g. the ASRock Z170 Extreme6/7, would have supported the 8700K.

The situation is no different today: many (most) boards do not properly support top-end processors anyway, due to poor VRMs, even when the socket is the same (or if they do support them, the CPUs are effectively power throttled).

Rinzler89
0 replies
2d2h

Thank you for providing valuable insight. I wish these kinds of comments would end up at the top instead of the usual low-quality "hurr-$COMPANY evil, it's all because greedy obsolescence-durr" from people who have no idea how CPUs and motherboards work together, or of the compatibility challenges that come when spinning new CPU designs with big differences that aren't visible to the layman who just counts the number of cores and thinks there can't possibly be more under-the-hood changes beyond their $DAYJOB comprehension.

Here's a video from Gamers Nexus on AMD's HW testing lab, just to understand the depth and breadth of how much HW and compatibility testing goes into a new CPU, and that's only what they can talk about in public. https://www.youtube.com/watch?v=7H4eg2jOvVw

Qem
2 replies
2d3h

> Logic and reason would have me believe they'd want to sell as many CPUs as possible, and keeping the socket compatible for as long as possible would seem logical.

The x86 market is a near-monopolistic one, with two companies cornering most of it. Monopolies can afford to sustain irrational/inefficient practices as long as it helps them squeeze the consumer. I hope RISC-V succeeds in breaking this duopoly. Perhaps now with the latest round of sanctions against China, if their full industrial might is thrown behind open designs, we may have some hope to crash the duopoly.

hinkley
0 replies
2d2h

If you’re hoping RISC-V will get market share then it’s three companies, not two. Intel, AMD, and Apple.

II2II
0 replies
2d2h

If you're going to bring in RISC-V, why not mention ARM? They're currently more of a threat to Intel than RISC-V and likely will be over the next decade. They have a near monopoly for anything that is not a computer that requires anything more than a low performance microcontroller, and are supported to varying degrees by the three major general purpose operating systems.

RISC-V likely has a promising future, but the foundations are still being laid.

zackmorris
1 replies
2d3h

Coming of age in the 90s and witnessing the sheer audacity of greed that followed, I can tell you that the cynical answer tends to be the right one.

hinkley
0 replies
2d2h

Especially considering Intel came of age in the 90’s as well.

utensil4778
1 replies
2d2h

Intel sells the chipset that goes along with the processor, as well as selling their own motherboards. I think the profit incentive here is obvious.

Why sell just a CPU when you can sell a CPU and a chipset and a motherboard?

kube-system
0 replies
1d22h

Intel stopped selling motherboards in 2013.

pwg
1 replies
2d1h

> What is the financial benefit to Intel in artificially limiting its CPU sockets like this?

They (Intel) also make the chipsets that go on the motherboards. So anyone who disposes of their old motherboard and buys a new motherboard because of this limitation results in:

   1) new CPU sale to Intel
   2) new chipset sale to Intel (indirect sale via the motherboard manufacturer)
Given that a "new CPU" sale plus a "new motherboard chipset" sale is more revenue to Intel than just a "new CPU" sale alone, the financial benefit becomes obvious.
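
A tiny back-of-the-envelope sketch of that revenue argument, using made-up prices (the $350 CPU and $45 chipset figures below are purely hypothetical, not Intel's actual pricing):

    # Hypothetical prices purely for illustration; real prices and margins differ.
    CPU_PRICE = 350.0      # assumed revenue to Intel per CPU sold
    CHIPSET_PRICE = 45.0   # assumed revenue to Intel per chipset sold to a board maker

    revenue_compatible_socket = CPU_PRICE                  # buyer keeps their old motherboard
    revenue_forced_new_socket = CPU_PRICE + CHIPSET_PRICE  # buyer also needs a new board (new chipset)

    extra = revenue_forced_new_socket - revenue_compatible_socket
    print(f"Extra Intel revenue per forced board replacement: ${extra:.2f}")  # $45.00 in this sketch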

godelski
0 replies
1d19h

Not to mention that they know people have to upgrade, and in not that long of a time (5 years-ish?).

The strategy doesn't work for something like a car, where the lifespan is 20+ years, but with high turnover they have people over a barrel. Now AMD competes, but I think we forget that this is a new thing (and we likely forget because if you're on HN, you're deep in this environment). So the big question is: will Intel continue, or will they recognize that they don't have that choice anymore? That is, of course, as long as AMD decides not to play the same game.

pessimizer
1 replies
2d3h

I would say that Intel is a company, not a person, and isn't motivated but directed. If the same people own both Intel and the mobo manufacturers, they win by forcing new purchases of new products primarily distinguished by a higher price.

One computer owner in a thousand upgrades their processor alone, or would even know how to.

Arelius
0 replies
2d2h

> I would say that Intel is a company ... and isn't motivated but directed

You know, it's a matter of perspective, but I'd disagree.

I think we'd like to think companies are directed, but as they get larger and older, especially public companies, they operate less by direction; the systemic forces take over, and they operate and function more by the aggregate sum of all the motivations of the actors involved.

I think it's true that some companies do exist, perhaps by sheer force of their leaders' personalities, that remain primarily "directed", but I feel that's more the exception than the rule.

fennecfoxy
1 replies
2d2h

To be fair I don't upgrade PC all that often (my 1080ti still going strooong!)

So when I do upgrade (every 6 years or so, it's currently been since 2018 and I still feel no need) then not only does the CPU technology need a bump, but the bridges on the mobo also need an update. Going from a 2018->2024 mobo is guaranteed to get you things like more/faster m.2 slots etc as well.

I suppose they could make it compatible with both old and new boards, but I imagine it's much easier for them to design 1:1 (new CPU + new chipset) than to design and test 1:* (new CPU against new chipset, old chipset 1, old chipset 2, etc.).

immibis
0 replies
2d1h

My i7-6700k was going strong until the motherboard died for the second time. I suspect a solder crack caused by vibration from a damaged fan (and my only evidence for that is the fact that the fan is damaged and vibrates and that two different motherboards died). Might try reflowing it to revive it, eventually.

After reviewing my options I would have either bought a third motherboard or upgraded to something like a Threadripper (I did the latter). Upgrading to a current desktop Intel system just didn't seem really worth it when I already had most of a working one and it was still fast enough.

You actually have FEWER expansion slots on the newer desktop motherboards because they've pre-decided what you want: a desktop has a 16x GPU slot and a 4x NVMe slot, and that's it. Gone are the days of generic uncommitted expansion slots, mostly.

ikekkdcjkfke
0 replies
2d2h

I am going to use this listing in my pre-prompt

Cold_Miserable
0 replies
1d12h

Intel makes money selling motherboard chipsets. There's no excuse, just greed.

scrlk
17 replies
2d6h

Reminds me of the "pencil trick" to overclock CPUs in the late 90s/early 00s. You'd rub a pencil over contacts on the CPU to bridge them, which would unlock the clock multiplier.

jsheard
9 replies
2d5h

You rarely see those kinds of unlocks anymore unfortunately, since they started using eFuses to disable parts of the chip permanently. It still happens occasionally though, like the time AMD decided to add a 4GB SKU alongside the originally planned 8GB SKU of one of their graphics cards so late in development that they didn't have time to actually change the hardware. All of the initial 4GB batches had 8GB installed with half of it disabled in the VBIOS, which was easy to reverse by flashing the VBIOS from the 8GB version.

The one time in history when "download more RAM" wasn't just a meme.

terlisimo
4 replies
2d4h

Honorable mention of ATI Radeon 9600 that was soft-upgradeable (via hacked drivers) to 9800 PRO for a 200% perf boost. Good times.

ielillo
2 replies
2d3h

Actually it was the ATI Radeon 9500 non-Pro that could be modded into a Radeon 9700. Regarding the ATI Radeon 9600, there was a variant called the Radeon 9550 that could be overclocked from 250 MHz to 400 MHz.

micv
0 replies
2d

> Actually it was the ATI Radeon 9500 non-Pro that could be modded into a Radeon 9700.

Sometimes. At least some of those 9500s were binned parts that showed their broken bits when you modded them. I had one. The screen turned into a kaleidoscope when I tried to play a game.

Was definitely worth a try if you had one though!

Moto7451
0 replies
2d2h

And my personal favorite, the Radeon 9100 which was actually a Radeon 8500 in PCI format instead of AGP. With a slightly tweaked 8500 Mac Edition bios you would have a very fast GPU for first gen PCI Macs through the Yikes G4. I believe some faster NVidia PCI cards ended up appearing and being made to run on Macs but I had moved from my XPostfacto PCI Macs to an Intel mini by then.

iwontberude
0 replies
2d3h

And NVidia GeForce 6600GS to 6800GT with pencil mod and flash.

lithos
2 replies
2d3h

For IBM servers this was the default as well, where the difference between RAM amounts on AS/400 systems was a phone call to enable more.

jsheard
0 replies
2d2h

Having dormant hardware which is intended to be unlockable later is a separate thing, which is rarely seen on consumer hardware. Intel tried it a while ago but the backlash was so severe that they gave up and reverted to permanently fusing off the silicon.

https://en.wikipedia.org/wiki/Intel_Upgrade_Service

I believe they still have something similar which allows Xeon processors to be upgraded in the field though. Car manufacturers have been testing the waters too, e.g. installing heated seats in every model and making them a paid software unlock for lower end models that didn't have that option enabled from the factory.

hinkley
0 replies
2d2h

IBM’s problem was going to be inventory, labor, scheduling, and transportation costs.

Your mainframe is slow and IBM is going to charge you 6 hours of labor to come out for 90 minutes next Friday, or we can deal with it over the phone.

I can’t recall but did IBM use that extra hardware for physical redundancy in case of hardware failures? I know they researched letting equipment die and coming out and replacing it after multiple failures instead of single ones, but I don’t know if they applied that to shipping mainframes, to regular rackmount hardware, or just in Research.

Astronaut3315
0 replies
2d4h

Wow, that’s even better than my old PNY GTX 465. It was really a 470 that was cut down by the VBIOS. I was able to download 256MB of VRAM, a wider memory bus and some GPU cores on that one.

benreesman
2 replies
2d4h

Likewise the felt marker on DVDs during the DRM Wars.

StimDeck
1 replies
1d13h

I feel like I should know what this is a reference to.

wrigby
0 replies
2d6h

I remember this being the way to unlock Athlon XP’s (though if you bought the mobile version, which used the same socket and ran just fine in a desktop motherboard, they came unlocked from the factory).

whalesalad
0 replies
2d5h

Taking me back to the Athlon XP Barton days

stronglikedan
0 replies
2d4h

They do just that at the end of the embedded video.

laweijfmvo
0 replies
2d2h

and the initial "fix" from AMD to stop it was to laser burn a trough between the contacts, so that you couldn't draw a line between them!

accrual
10 replies
2d4h

A couple of historical CPU mods similar to this:

- AMD K6-2+ can be converted to K6-3+ by moving a 0-ohm resistor under the IHS to unlock the full 256K of L2 cache only present on K6-3+ models. The CPUs were basically all the same and were binned into separate SKUs using the position of this resistor.

- AMD K7 (Athlon XP) can be similarly unlocked by bridging conductive pads on the chip using something as common as a graphite pencil.

- Intel Pentium 3 Coppermine chips could be run on earlier Pentium 2 boards by using a slotket adapter with a modified socket. The socket could sometimes be further modified to support the signaling used by even later P3 Tualatin chips, allowing for Tualatin CPUs to run on 440BX chipsets which were never designed for them. Also needs a BIOS mod and sometimes a VRM replacement.

litenboll
5 replies
2d3h

If anybody else wonders why a 0-ohm resistor is a thing, apparently it is because it makes it possible to install a jumper on printed circuit boards with the same equipment that is used for normal resistors.

https://en.m.wikipedia.org/wiki/Zero-ohm_link

l33tman
1 replies
2d3h

You put 0-ohms where you think you might at some point want something else, like a >0 ohm resistor or an inductor (for EMI blocking, for example); it's much easier to just reprogram the SMD robot than to make a new PCB.

a1369209993
0 replies
1d22h

That's accurate, but given the lack of words to the effect of "You can also put", it deserves the clarification that the most common something else is - as the parent suggested - an ∞-ohm resistor, either as a specific component or more commonly in the form of the absence of a component.

nine_k
0 replies
2d

A "0-ohm resistor" could also be more clearly called a "conductor", but it's not, despite its electrical function.

I suppose it's because the mechanical function is more salient: it is a standard SMD part like other resistors, capacitors, LEDs, etc, not a conductor etched on the PCB.

lazide
0 replies
2d2h

Resistor shaped wires are also a lot less dangerous than fuse shaped ones. In my experience.

4gotunameagain
0 replies
2d

It can serve multiple purposes. You can have 0 ohm resistors to serve as a bridge to run a trace below if you're running out of space.

You can have multiple resistor footprints and depending on where the 0 ohm is placed, different configurations are enabled.

You can use it to be able to isolate parts of the circuit after the fact, although solder jumpers are more common for that purpose.

crote
1 replies
2d2h

I remember upgrading my ATI Radeon HD 6950 to a 6970 - it was just a simple firmware flash! Worked beautifully. The card did eventually die so it might not have been the best idea, but in the meantime it did mine enough Bitcoins to pay for itself.

stordoff
0 replies
2d

I flashed my Radeon 9550 into a Radeon 9600 Pro. Bumping the core/memory clocks from 250/200 to 400/300 was a pretty decent upgrade, and there was still some overhead for overclocking (IIRC, I got it up to about 450MHz on the core).

mrandish
0 replies
2d3h

Not a CPU mod, because it was enabled by the BIOS and motherboard, but still perhaps the most legendary CPU ever for overclocking: the Celeron 300A (1998). A >50% overclock was typical just by changing one setting.
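
For anyone wondering where that >50% figure comes from, a quick sketch of the usual 300A math, assuming the well-known 66 MHz to 100 MHz FSB jump with the multiplier locked at 4.5x:

    # Celeron 300A overclock arithmetic: the multiplier was locked at 4.5x,
    # so raising the FSB (the "one setting") from 66 MHz to 100 MHz did all the work.
    MULTIPLIER = 4.5
    stock_fsb_mhz = 66
    overclocked_fsb_mhz = 100

    stock_clock = MULTIPLIER * stock_fsb_mhz        # ~297 MHz (sold as 300 MHz)
    oc_clock = MULTIPLIER * overclocked_fsb_mhz     # 450 MHz
    gain_pct = (oc_clock / stock_clock - 1) * 100
    print(f"{stock_clock:.0f} MHz -> {oc_clock:.0f} MHz ({gain_pct:.0f}% overclock)")  # ~52%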

candiddevmike
0 replies
2d3h

I remember the AMD unlocks using just an advanced setting in the BIOS. My cheap ass got a quad-core processor for the price of a dual core!

bell-cot
9 replies
2d6h

Sounds like a good* strategy by Intel. Cheap for them & the MB manufacturers to make a few little changes, all the potential issues with Old MB/New CPU systems are now "sorry, not supported", and the Rebel 0.01% can imagine that they've cleverly won a real victory - which "victory" will barely be a rounding error on any of the Big Players' financials.

*"Good" by Capitalist Overlord standards

yourusername
5 replies
2d5h

But if I can buy a new CPU for my motherboard I would be almost guaranteed to stay with Intel. Now every upgrade is a chance to jump ship to the competition. There are probably people on their 3rd AM4 CPU with the same motherboard.

jimbobthrowawy
4 replies
2d5h

Intel only doing one upgrade per compatible mobo is kind of annoying. I had to buy and return a 6000 series CPU just to update my board's firmware so I could use the 7700k I intended to. How's the compatibility on the AMD side? Will a board work enough to upgrade itself with a newer CPU than it launched with?

toast0
0 replies
1d22h

Last time it was a major problem, AMD had a program to send you a low-end old-generation processor and you'd send it back when you were done. CPU-less flashing is pretty common now though, and inventory is low, so there are fewer stale-BIOS motherboards out there.

mrguyorama
0 replies
2d4h

When the Ryzen 3xxx series CPUs came out, some motherboards had the same situation. Some computer stores were anecdotally lending out 2xxx series Ryzen CPUs so you could update your BIOS

daneel_w
0 replies
2d5h

I had no problems upgrading from a quad-core Zen 2 to a hexa-core Zen 3 on my particular motherboard (Gigabyte A520M H).

InvaderFizz
0 replies
2d5h

If the board supports headless USB bios updates, yes. Those can update the bios without a cpu present.

xxs
1 replies
2d4h

> Rebel 0.01%

Lower than that, much lower - as it required a custom-built/assembled BIOS, effectively. The hardware mod was the very easy part.

amarcheschi
0 replies
2d3h

I followed a guide to mod my BIOS (AMD tho) to unlock features, and with a tutorial it's kinda easy... like, just follow the tutorial, open this, change this to that... I don't know, however, whether it would require a handmade mod for each BIOS or if a general tutorial would be okay for different OEMs' mobos.

MrBuddyCasino
0 replies
2d5h

It is a way to guarantee MB vendors a steady recurring revenue, thus making it a good business to build mobos for you.

AMD supports upgrades for longer, which is nice but presumably doesn't do them any favours in the OEM relationship game.

Luckily, as a customer you have a choice.

sokoloff
5 replies
2d6h

This is dated 2024, but is talking about Intel 8th gen chips released in 2017.

causality0
2 replies
2d5h

And? Submissions talking about the architecture of the Game Boy don't need to be tagged 1989.

jsheard
1 replies
2d5h

Mentioning the Game Boy would date it implicitly, they don't make Game Boys anymore. Intel is still around.

a1o
0 replies
2d3h

"they" Nintendo doesn't. But "they" random people from the internet make FPGA motherboard of Gameboy that is compatible with factory buttons, case and accessories.

jimbobthrowawy
1 replies
2d5h

The hackaday post is from this year, and talks about forum threads/guides from 2019. I would guess the author only learned about it or decided to write about it now.

Even if it's not that useful to people nowadays, it's interesting to learn such a thing is possible.

xxs
0 replies
2d4h

> threads/guides from 2019.

April 2018.

qwerty456127
3 replies
2d2h

Anti-upgrade is disgusting. Upgradeability is a major reason to buy a tower PC. I would love to pay any remotely reasonable amount of money for a motherboard which would let me just swap CPUs and cards for 15 years. This is what good motherboards were in the good old pre-PCIe days. It was so lovely to buy the best MB + the cheapest everything else, then upgrade whatever as you need and can afford it. I've read Intel even used to make "overdrive" CPUs to fit completely new generations into old sockets.

immibis
2 replies
2d1h

One reason I chose to buy the lowest-end Threadripper instead of the highest end Ryzen. In the future, even if there's never any compatibility with socket sTR5, the parts in the same generation provide a part-by-part upgrade path up to 96 cores, 1TB of RAM (or was it 2TB? I forget), and 128 PCIe lanes.

qwerty456127
1 replies
1d22h

I would prefer 4 really powerful cores to 96 weaker cores in a CPU though.

Sohcahtoa82
0 replies
1d21h

That's going to certainly depend a lot on the tasks you do.

An embarrassingly parallel task would work better on the 96 weak cores than on the 4 strong cores unless those strong cores were literally 24 times more powerful, or unless pegging 96 cores causes RAM to be a significant bottleneck.
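
A minimal sketch of the break-even arithmetic behind that "24 times" figure, assuming perfect scaling and no memory bottleneck (both idealizations):

    # Idealized throughput model for an embarrassingly parallel workload:
    # total throughput = number of cores * per-core speed.
    def breakeven_factor(many_weak_cores: int, few_strong_cores: int) -> float:
        """How many times faster each strong core must be to match the weak ones in aggregate."""
        return many_weak_cores / few_strong_cores

    print(breakeven_factor(96, 4))  # -> 24.0, i.e. the "literally 24 times more powerful" above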

userbinator
1 replies
2d4h

It's interesting how the pins differ between motherboard manufacturers, which suggests they didn't copy Intel's reference schematics completely or used different revisions of them.

They used to publish the reference schematics on their site up until the P4 era, but I guess it made things like this too easy.

StimDeck
0 replies
1d13h

Makes sense in the slow slide toward anti-repair.

irisgrunn
1 replies
2d5h

Similar to how you can use a Socket 771 Xeon in a Socket 775 mainboard by changing a few pins and modding the BIOS.

helf
0 replies
2d4h

I still have a system running a modded xeon in a 775 board lol

DarkmSparks
1 replies
2d4h

Not saying this contributed significantly to Intel's recent $7 billion loss in chip making, just that they probably shouldn't be pushing buyers away quite so hard given their current situation.

xxs
0 replies
2d4h

Coffee Lake was the first time Intel had to wake up and put more than 4 cores in a (consumer) CPU, so it was a big thing.

xxs
0 replies
2d4h

It's an old topic [0]; the site does reference it. Back then it was widely discussed in the overclocking community. Effectively, Intel's 6th-10th gen series are all Skylake. Shorting the CPU pins (well, LGA pads) was possible even with a pencil's graphite.

[0]: https://community.hwbot.org/topic/175489-asrock-z170-mocf-li...

wannacboatmovie
0 replies
2d4h

If your system isn't unstable enough... I'm sure you could drop a Ford V8 into a Hyundai with a few simple mods; it doesn't make it an intelligent idea either. But boy would it generate clicks.

peepeepoopoo74
0 replies
1d22h

Ah yes, I am reminded of the age-old internet proverb: "A socket change a year keeps the goyim in fear."

nottorp
0 replies
2d3h

> Don’t get too excited though, as projects like Intel BootGuard are bound to hamper mods like this on newer generations by introducing digital signing for BIOS images, flying under the banner of user security yet again. Alas, it appears way more likely that Intel’s financial security is the culprit.

It's okay, soon we'll lock down all software too in the name of security.

josephcsible
0 replies
2d3h

> Don’t get too excited though, as projects like Intel BootGuard are bound to hamper mods like this on newer generations by introducing digital signing for BIOS images, flying under the banner of user security yet again. Alas, it appears way more likely that Intel’s financial security is the culprit.

Indeed. The rule of thumb is that if you don't have the ability to turn off some security feature in something you own, then it's really there to make the thing secure against you.

dghughes
0 replies
2d4h

This reminds me of the olden days (1990s) when people would fill in laser cuts on hobbled CPUs with solder to boost...something. I'm old so I forget.

daneel_w
0 replies
2d5h

I was myself once, before switching to AMD, a user of the related trick where one could run a used $35 quad-core Xeon on e.g. a Core 2 mobo by just switching two pins on the CPU's pin grid with a little sticker. Miles better experience than the $400 Core 2 Quad.

chmod775
0 replies
1d19h

> Contrasting this to AMD’s high degree of CPU support on even old Ryzen motherboards, it’s as if Intel introduced this incompatibility intentionally.

That is because they introduced it intentionally.

They don't give a rat's ass about how this screws people over and creates e-waste. Pencil pushers at Intel just couldn't figure out how to put consumer goodwill on a balance sheet.

aranchelk
0 replies
2d4h

It’s called “Coffee Mod”? I would have gone with “Kapton Lake”.

SomeoneFromCA
0 replies
1d21h

The worst intel did is fusing off AVX512 in Alder Lake. It is the _ONLY_ consumer grade CPU family with hardware AVX512 FP16 support. Fantastic instruction set for machine learning.