Researchers claim first functioning graphene-based chip

ajb
22 replies
2d

Buries the lede a bit? Near the end is "Conventional GFETs [graphene transistors] do not use semiconducting graphene, making them unsuitable for digital electronics requiring a complete transistor shutdown. [...] the SEC [material] developed by his team allows for a complete shutdown, meeting the stringent requirements of digital electronics."

As far as I'd heard before, this was the problem with graphene transistors: they had a nonlinear response but did not shut down the current flow, making them useful only for analog circuits, not digital logic. But it's been a while since I read about this, so maybe someone else already achieved this before.

singularity2001
19 replies
1d20h

A semi-analog transistor would be perfectly fine for AI operations though? (matrix multiplication, sigmoid, tanh, etc.)

marcosdumay
16 replies
1d19h

Analog or digital are properties of the signals you put on the circuits, not of the transistors.

You can do digital manipulation with non-gapped transistors too. You just can't use the designs we use today, which are extremely intense when active but self-limiting.

Besides, unless we get some very creative new insights, analog computers are a dead-end.

lbourdages
15 replies
1d19h

I've been trying to find the source for a while. I recall reading about a fundamental property of analog computers that makes them inherently unstable (as in, noise will always win in the long run), unlike digital computers. No matter how good the design is, little bits of noise add up in complex analog computers and the output is inexact.

I guess my Google-fu isn't good enough, because I've been unable to find where I read about it.

persnickety
5 replies
1d7h

Is it true, though?

Look deep enough, and every modern digital circuit is emulated by an analog circuit, just because the components are analog in their nature, and the 0s and 1s are an interpretation of analog data. That includes digital computers.

Does this make digital computers inherently unstable? Clearly, simple enough computations both of digital and analog kind are good enough to be useful. So there must be a breaking point further away, but on what axis?

ajb
4 replies
1d5h

A transmission line accumulates noise as it increases in length. Eventually the signal-to-noise ratio is too low. In old school analogue comms, the solution is to decode the signal into the digital domain before it reaches that point, which allows us to drop the noise. Then the data is re-encoded back into a clean analogue signal.

In digital logic, this process happens at every gate. Hence the reliability of digital logic.

Analogue logic doesn't do this. So analogue logic is only useful if the noise introduced at each step is lower than the error already present in your source data (at whatever point in the computation you have reached). If there is a way round this, I don't know it.
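
A minimal Python sketch of that difference, with made-up noise numbers purely for illustration: an analog chain adds noise at every stage, while a chain that re-decides the bit at every stage (the digital case) throws the noise away each time.

    import random

    STAGES = 20
    SIGMA = 0.05  # per-stage noise amplitude; illustrative, not a real figure

    def noisy(x):
        return x + random.gauss(0, SIGMA)

    random.seed(0)

    # Analog chain: the raw value is passed through every stage.
    analog = 1.0
    for _ in range(STAGES):
        analog = noisy(analog)

    # Digital chain: each stage re-decides 0/1 against a threshold,
    # discarding the accumulated noise before passing the bit on.
    digital = 1.0
    for _ in range(STAGES):
        digital = 1.0 if noisy(digital) > 0.5 else 0.0

    print(f"analog after {STAGES} stages:  {analog:.3f}")   # drifts away from 1.0
    print(f"digital after {STAGES} stages: {digital:.0f}")  # still exactly 1

The analog value random-walks away from 1.0 as the errors add up, while the digital value only fails if a single stage's noise exceeds the threshold margin.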

persnickety
3 replies
1d4h

There is a way around this: discretize your values. That's how analog calculations can be a basis for digital calculations.

That's why I think the original comment was about a specific, limited meaning of "analog computation" that does not allow for emulating anything digital. But I struggle to come up with one that doesn't throw the baby of being universal out with the bath water of emulating digital computations.

formerly_proven
2 replies
1d3h

There's a term for analog systems using discretized values: digital.

persnickety
1 replies
1d3h

Exactly! And under that lens all fundamentals applying to analog also apply to digital: that they are inherently unstable and that noise will always win in the long run.

Which sounds at the very least imprecise to me, considering that I'm writing it on a digital (and therefore analog) computer.

ajb
0 replies
1d1h

In fact, Von Neumann proved mathematically that it's possible to build a reliable system out of unreliable components. But modern digital logic uses a simpler mechanism: the thing you are missing is that noise does not propagate or accumulate in discretized (digital) systems as long as it remains below a threshold, and circuits are designed so that it stays far below the threshold.
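
A small Monte Carlo sketch of that threshold effect, with illustrative numbers only: sweep the per-gate noise level and count how often a bit survives a long chain of regenerating gates.

    import random

    def survival_rate(stages=100, sigma=0.1, trials=5_000):
        """Fraction of trials where a 1-bit survives a chain of gates,
        each of which re-decides the bit against a 0.5 threshold."""
        ok = 0
        for _ in range(trials):
            bit = 1.0
            for _ in range(stages):
                bit = 1.0 if bit + random.gauss(0, sigma) > 0.5 else 0.0
            ok += (bit == 1.0)
        return ok / trials

    random.seed(0)
    for sigma in (0.05, 0.10, 0.15, 0.20):
        print(f"sigma={sigma:.2f}  survival={survival_rate(sigma=sigma):.4f}")

With the noise well below the 0.5 threshold margin the survival rate is essentially 1.0; it only collapses once the noise approaches the margin, which is exactly the regime circuits are designed to stay out of.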

marcosdumay
4 replies
1d18h

Are you thinking about the property of analog data that it can't be stored, reprocessed, or copied without the noise increasing?

That doesn't make the computers unstable. It just makes the data less fit for long-term storage. And even then, people manage.

lbourdages
3 replies
1d14h

No, it wasn't about storage. It was really about calculations: you can't prevent the noise from affecting the results, at a fundamental level.

marcosdumay
0 replies
1d1h

Oh, ok. You can't.

I'm not sure you can find a citation though, it's like searching for a source that the sky is blue.

Keep in mind that digital calculations have noise too. The digitization noise behaves in a completely different way, but any single computer has finite precision whatever the technology behind it. Infinite precision doesn't exist in the real world.

imtringued
0 replies
1d6h

Floating point calculation is "noisy" and yet there are models that use smaller and smaller floating point numbers without losing much accuracy.
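
A quick numpy sketch of that point (illustrative only): quantize the inputs of a dot product to narrower float types and compare against a float64 reference.

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.random(10_000)
    y = rng.random(10_000)

    exact = float(np.dot(x, y))  # float64 reference
    for dtype in (np.float32, np.float16):
        # Round the inputs to the narrow type, then compute in float64,
        # so we see the representation ("digitization") noise alone.
        xq = x.astype(dtype).astype(np.float64)
        yq = y.astype(dtype).astype(np.float64)
        rel = abs(np.dot(xq, yq) - exact) / abs(exact)
        print(f"{np.dtype(dtype).name}: relative error {rel:.1e}")

Even float16, with only about 3 significant decimal digits per value, typically keeps the result within a fraction of a percent here, because the independent rounding errors partially cancel in the sum.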

fooker
0 replies
1d8h

For any complicated calculation, you need to store and propagate multiple intermediate results.

This is what goes wrong.

error(x + y) > error(x) + error(y)
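
You can watch this happen with a deliberately bad accumulator in Python (illustrative, and exaggerated by using float16 for every intermediate):

    import numpy as np

    rng = np.random.default_rng(1)
    xs = rng.random(100_000)

    exact = xs.sum()  # float64 reference

    # Naive running sum where every intermediate result is stored in
    # float16: once the partial sum grows, the rounding step exceeds
    # the addends and the sum simply stalls.
    acc = np.float16(0.0)
    for v in xs.astype(np.float16):
        acc = np.float16(acc + v)

    print(f"float64 sum: {exact:.1f}")       # ~50000
    print(f"float16 sum: {float(acc):.1f}")  # stalls around ~2048

The per-element error is tiny, but storing each intermediate in low precision makes the errors compound instead of cancel. Wider accumulators or pairwise summation avoid this, which is part of why low-precision inference still works.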

jari_mustonen
2 replies
1d13h

Could this be the reason why humans (and other things with brains) need to sleep from time to time? Sleep would reset the noise accumulating in the brain.

smolder
0 replies
1d11h

It's not really clear what you mean by the brain accumulating noise. An analog computer can suffer from compounding imprecision, because error bars carry forward. We are analog computers in a vague sense, but as humans we aren't imprecisely crunching numbers all day such that our results are out of whack by nightfall... You could say sleep is like a reset for the brain, but that doesn't say much. Sleep is complex and relates to most everything else about an organism in complex ways.

Qwertious
0 replies
1d10h

Nah, humans just blank-out for a couple of seconds mid-task instead.

magicalhippo
0 replies
1d11h

Multiplying two numbers in an analog circuit requires taking the logarithms and adding those, then converting back, or some similar trick.

In practice this is done using non-linear effects of transistors[1], however the exact details of those effects are individual to each transistor and are also temperature dependent.

Since the multiplication circuit relies on different transistors behaving identically, compensation circuitry and trimming is required, which will never be perfect.

[1]: https://www.analog.com/media/en/training-seminars/tutorials/...
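
As a toy model of why matching matters, here's a Python sketch of log/antilog multiplication where each "device" gets a small random gain error (the 1% mismatch figure is made up for illustration):

    import math
    import random

    def analog_multiply(a, b, mismatch=0.01):
        """Multiply via log + antilog, with a per-device gain error
        on each stage standing in for transistor mismatch."""
        gain = lambda: 1.0 + random.gauss(0, mismatch)
        log_a = gain() * math.log(a)               # first log stage
        log_b = gain() * math.log(b)               # second log stage
        return math.exp(gain() * (log_a + log_b))  # antilog stage

    random.seed(42)
    for _ in range(5):
        print(f"7 x 6 = {analog_multiply(7.0, 6.0):.2f}")

Each run lands near 42 but a few percent off, and the error grows with the magnitude of the operands, which is why real circuits need the compensation and trimming mentioned above.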

imtringued
0 replies
1d6h

Leaky ReLU is the best though.

ajb
0 replies
1d5h

Yes, analogue computing. It's a world of pain though: each individual operator in each individual device is going to have a slightly different level of accuracy, which is also going to be affected by temperature, so you need to learn a whole new set of analytical skills and operational practices to ensure that your model works correctly on each one at a customer location. And that's not even thinking about testing, at shipment, that the devices are in spec, and that they stay in spec at all operating temperatures and over the lifetime of the device.

j16sdiz
0 replies
1d21h

That's the "bandgap" thing mentioned in the article. They exist, but they're very unreliable.

happytiger
0 replies
1d18h

I believe what you are referring to is called the Bandgap problem.

Good explanation on the issue is here:

https://www.allaboutcircuits.com/technical-articles/graphene...

Convenience quote of relevant section:

Lack of Bandgap

Despite being a fast and efficient transistor, the GFET does not have a bandgap. The gapless structure means that the valence and conduction bands meet at zero volts, hence making graphene to behave like a metal. In semiconductor materials such as silicon, the two bands are separated by a gap which behaves like an insulator under normal conditions.

Usually, the electrons require some additional energy to jump from the valence band to the conduction band. In FETs, a bias voltage enables a current to flow through the band which acts as an insulator in the absence of the bias.

Unfortunately, the absence of a band gap in GFET makes it hard to turn off the transistor since it cannot behave as an insulator. The inability to completely switch it off results in an on/off current ratio of about 5, which is quite low for logic operations. Consequently, using GFETs in digital circuits is a challenge. However, this is not a problem with analog circuits hence making the GFET suitable for amplifiers, mixed-signal circuits, and other analog applications.

Multiple parties are researching ways to address these bandgap challenges, including techniques such as the negative resistance approach and the bottom-up synthesis technique of fabrication.

throwbadubadu
19 replies
2d

The outcome is transistors capable of operating at terahertz frequencies, offering speeds 10 times as fast as that of the silicon-based transistors used in current chips.

Why? We are at single-digit gigahertz, so terahertz should be ~100 times faster?

mtlmtlmtlmtl
7 replies
2d

Not an EE, but my guess is that current transistors can likely operate at a much higher clock rate than current chips.

The problem is that when you pack them together, they get far too hot to permit Dennard scaling to reach the limits of the individual transistor.

tux3
6 replies
2d

This one's not because of heat, but because the signal has to go from the beginning of a stage, through several transistors, to the end of the stage.

So the transistors switch fast individually, but the speed of the chip is limited by the slowest path in any stage, where you wait for every transistor on the path in series

foota
2 replies
1d22h

Out of curiosity, how long are these paths generally?

tux3
0 replies
1d20h

So, when I talked about the length of the path in number of transistors, I oversimplified a bit. The length of the path would be measured in nanoseconds (or even picoseconds).

That number depends not just on the number of transistors, but also on the routing delay (are the interconnect wires between transistors long or short? How high is the resistance/capacitance?), and a lot of low-level details of the fabrication process that are extremely not public

But say you buy a brand new CPU and it's clocked at 5 GHz: you can easily get a rough estimate of how long the critical path is, since 1/5 GHz = 0.2 ns.

What you can't easily get is the speed of the transistors or the number of transistors in the critical path, that info is not public, and you could only make a very rough guesstimate.
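
To put rough numbers on it with a made-up per-gate delay (the 10 ps figure is a guess for illustration, not process data):

    CLOCK_HZ = 5e9          # a 5 GHz part
    GATE_DELAY_S = 10e-12   # hypothetical ~10 ps per gate delay

    period = 1 / CLOCK_HZ          # 0.2 ns budget per pipeline stage
    depth = period / GATE_DELAY_S  # rough logic depth that fits in a cycle

    print(f"cycle budget: {period * 1e9:.2f} ns")
    print(f"~{depth:.0f} gate delays per stage (before wire delay, setup time, clock skew)")

So under that assumption a 5 GHz design has on the order of twenty gate delays per stage; the real decomposition into transistor speed and path depth is exactly the non-public part.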

jiggawatts
0 replies
1d20h

Low single digit millimeters for current chip designs.

One reason that clock speeds are going above 5 GHz these days is that chips are getting smaller. That means shorter signal propagation distances.

magicalhippo
1 replies
1d11h

I recall reading some decades ago about dividing chips into separate domains that would communicate asynchronously, precisely to shorten the longest path and increase the max frequency.

It's my understanding modern processors do indeed do this to varying degrees.

tux3
0 replies
1d8h

Yep, the term you might be thinking of is "clock domains". This is good if you want different parts of the chip to run at different frequencies, potentially saving a lot of power & gaining perf where needed

mtlmtlmtlmtl
0 replies
1d22h

Ah, right, thanks for correcting me.

AdamH12113
4 replies
2d

The clock frequency is single-digit gigahertz. But digital signals have to propagate through multiple layers of logic gates, so the transistors have to switch much faster.

At a glance, the article doesn't make it clear whether the "terahertz" speed refers to switching frequency or gain bandwidth (fT). You can definitely get transistors with fT in the hundreds of gigahertz range right now.

itcrowd
3 replies
1d19h

digital signals have to propagate through multiple layers of logic gates, so the transistors have to switch much faster

I don't think this is accurate. Are you saying that in digital computers each individual transistor switches faster than the clock rate of, say, 3 GHz? I thought there was one clock signal distributed to all transistors, and that they turn on/off synchronously at this rate. The GHz number in the processor advertisement is the switching rate of all the transistors, not some hypothetical 'system rate' that would somehow be much lower? Please clarify or correct me if I am mistaken.

xyzzy123
1 replies
1d19h

I think it's easier to see if you jump up 1 level of abstraction from transistors to logic gates.

Imagine an adder made up of logic gates. The gates aren't inherently synchronous - they don't have a clock input - signals appear at their inputs and some time later propagate to their outputs.

To make the adder synchronous you need flip flops at the inputs/outputs and a clock.

If you squint a bit you can view most designs as blobs of async logic sandwiched between sync elements (gated by the clock).

We can see that a signal might have to go through a lot of gates/transistors between flip-flops and so the gates (and their underlying transistors) will necessarily need to be able to switch faster than the clock.
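
A toy example of that sandwich in Python: take a ripple-carry adder, charge each carry stage a couple of gate delays, and see how the critical path between flip-flops limits the clock (both the gates-per-stage and the per-gate delay are illustrative guesses):

    GATES_PER_CARRY = 2   # say, one AND-OR pair per full-adder carry stage
    GATE_DELAY_PS = 10    # hypothetical per-gate delay in picoseconds

    def fmax_ghz(bits):
        """Max clock for a toy ripple-carry adder, set by the serial
        carry chain between the input and output flip-flops."""
        path_ps = GATES_PER_CARRY * bits * GATE_DELAY_PS
        return 1000.0 / path_ps  # convert a path in ps to a clock in GHz

    for bits in (8, 32, 64):
        print(f"{bits:2d}-bit adder: fmax ~ {fmax_ghz(bits):.2f} GHz")

The individual gates respond in 10 ps, yet the 64-bit adder only clocks at well under 1 GHz; real adders use carry-lookahead and similar structures precisely to cut this depth.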

rbanffy
0 replies
1d1h

This is the beauty of asynchronous logic: you don't need the flip flops and the clock; you need a yes-this-is-it signal propagating along with the results.

AdamH12113
0 replies
1d18h

You’re mixing up time and frequency. For a signal to propagate through multiple logic gates in one clock cycle, each gate must switch in a fraction of a cycle. That means a single gate could switch more often (higher frequency) if it were by itself. But it’s not.

(Technically, many gates do switch more than once per cycle since their inputs change at different times. But their outputs are only latched at the end of the cycle, so any extra switching is ignored.)

imtringued
1 replies
1d6h

PCIe 5.0 does 32 Gbit/s per lane, which means you need to run at least at 32GHz and probably more if you want to sample the signal more than once. So no, we are at way higher frequencies already.

pezezin
0 replies
16h10m

I think that the current state of the art is 224 Gbit/s with PAM-4 modulation, which means 112 GHz :O

It is intended only for the highest end network gear, the range is very limited, and requires very special and expensive cables.
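
The conversion is just bits-per-symbol arithmetic; a quick sketch:

    import math

    bit_rate_gbps = 224
    bits_per_symbol = math.log2(4)   # PAM-4: 4 levels = 2 bits per symbol
    symbol_rate = bit_rate_gbps / bits_per_symbol

    print(f"{bit_rate_gbps} Gbit/s PAM-4 -> {symbol_rate:.0f} Gbaud")

224 Gbit/s at 2 bits per symbol gives 112 Gbaud, which is where the 112 figure comes from.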

sheepscreek
0 replies
1d18h

Why? We are single digit gigahertz, so terahertzes should be ~ 100 times faster?

*1000 times

itcrowd
0 replies
1d23h

For digital computers, clock speeds are single digit GHz. For analog circuits, ~100 GHz is achievable in silicon. E.g., automotive radar chips, communication systems etc.

This THz comment relates to analog circuits, where this new tech is supposedly around a factor of 10 faster.

If you want to learn more, read about Fmax and Ft of transistors.

deepnotderp
0 replies
2d

Single transistor level frequency is different than chip level frequency.

Each clock stage has many layers of transistors

daveFNbuck
0 replies
2d

We're at single digit gigahertz for an entire chip, not a single transistor.

bee_rider
15 replies
2d2h

The manufacturing process seems quite friendly compared to the (albeit, local) ecological nightmare that is Si.

It sure is hard to compete with half a century of Si advancement, though. Especially given how many STEM-brains get attracted to software instead, nowadays…

varispeed
13 replies
2d2h

Especially given how many STEM-brains get attracted to software instead, nowadays…

It's just common sense. You won't earn decent money at a semiconductor company, and given the niche, you will be sentenced to the whims of one or two (if you are lucky) companies operating in your region. If you find that your employer doesn't treat you well, you cannot exactly leave and bootstrap your own chip-making business.

Software is more democratised and you have much better chances to grow wealth of your family in that area. Although corrupt regulators are doing their best to pull as many ladders as they can to limit ways workers can go their own way (like changed IR35 legislation in the UK that massively limits how small service based business can operate).

ngneer
5 replies
2d1h

This is right. The flip side is that having many talented software engineers renders each one slightly more dispensable. You want to find the right Goldilocks niche: not so rare that mobility is limited, but not so prevalent that compensation suffers.

tyre
4 replies
2d1h

As a hiring manager, I can confidently say that we are nowhere close to saturated with talented software engineers.

bboygravity
1 replies
1d15h

* talented, very experienced software engineers under 35, who are willing to work on-site full-time (which often means relocating to the company) and are allowed to work in the US?

Or am I being too cynical?

Firmwarrior
0 replies
1d10h

Software is the easiest thing in the world to outsource. There's no reason we can't play Chinese-made games or use Indian-made chat apps

Market forces (more demand than supply) are what push coding salaries into the stratosphere

agumonkey
1 replies
2d1h

so the niche is in training ? :D

throwaway22048
0 replies
1d22h

Yes, I earn 3x more doing training than myself or any of my senior SWE colleagues/friends. Think 1500 EUR daily rate instead of 500 (in Central Europe).

mindentropy
2 replies
2d1h

It is simple. If the Govt wants to support the semiconductor industry, it should provide proper worker protections. It should know that a few companies would have cartel-like power and exploit their employees.

robertlagrant
0 replies
1d20h

It's not worker protections that will help. They will just end up incentivising industry to move elsewhere. The problem is it's too hard to set up competitors. Competitors are what raise employees' standards of living, far higher than vote-winning legislation.

idiotsecant
0 replies
1d23h

It's rare that nuanced problems have simple solutions. The problem is not that evil employers are oppressing helpless employees, at least not entirely. The problem is that making physical things requires physical infrastructure and investment at a massive scale. Things move slowly and the barrier to entry is high. Conditions like that are going to produce a slower, less transferrable skill set, simply because few entities are willing to take the capital risk to develop that infrastructure, so fewer entities exist.

Contrast that with software, where you and I could go start a business now and potentially unseat a major player somewhere with enough talent and dedication.

It's obvious which one is going to have more mobile employees.

soitgoes511
0 replies
1d20h

Won't earn decent money? Have you ever worked for a big semi company? I have, and that is simply not true. Niche, maybe... but they did not skimp on wages for high performers. Believe it or not, there are plenty of good software positions in semiconductors too.

rbanffy
0 replies
1d1h

you will be sentenced to whims of one or two (if you are lucky) companies operating in your region.

Or you work at a foundry-less IP house. There are many of those.

burnished
0 replies
1d23h

Yep! When I was in school and learning about semiconductors I was so interested that I went to the library for further reading and dropped by the physics department to ask questions, purely for my own edification. But I chose to pivot towards software due to the career prospects.

bee_rider
0 replies
2d2h

I agree that the market has shifted in a way that we’re all individually making the rational decision by going into software, but it still seems like a shame.

crotchfire
0 replies
1d23h

Just wait until they scale it up and commercialize it; the nightmare will return.

The sad fact is that nasty and exotic chemicals provide ways to make many manufacturing processes cheaper and run large batches. When you're pushing the limits of everything and the stakes are high, you can't really take them off the table.

mthcalixto
12 replies
2d1h

Graphene is the future, and Brazil is the largest country that has this resource; it could be the best country in the world.

binarymax
3 replies
2d1h

Graphene (https://en.m.wikipedia.org/wiki/Graphene) is manufactured from carbon. You don’t pull graphene out of the ground, you manufacture it in a lab.

neuronexmachina
2 replies
2d

I've been playing too much Dyson Sphere Program lately, my first thought was "you just need to combine graphite and sulfuric acid in your chemical plant": https://dyson-sphere-program.fandom.com/wiki/Graphene

Looks like that's sort-of a real thing? https://www.americanscientist.org/article/mass-producing-gra...

Researchers at Rutgers University are making sheets of graphene out of ordinary graphite flakes and some sulfuric or nitric acid.

seaal
0 replies
1d22h

Always nice when games introduce you to real-life concepts. Shout out to GregTech: New Horizons, which often leads me down a Wikipedia rabbit hole.

klyrs
0 replies
1d20h

Even if you could just magically make a reel of graphene with this graphite+sulphuric acid recipe, it wouldn't help one bit in chip fabrication. You need to build things layer by layer; there's only one place where a bulk material is used, and that's the substrate.

wigster
0 replies
2d1h

it's made out of sellotape/scotch tape and pencils! ;-)

ur-whale
0 replies
2d1h

I thought Graphene was just a form of Carbon, and mostly produced from regular carbon through some sort of process.

Are you saying Graphene can be found in nature?

r3d0c
0 replies
2d1h

lol.. unless brazil is the only country in the world with carbon, this is absolutely not true

neuronexmachina
0 replies
2d1h

Assuming you mean graphite (which can be used in graphene production), it looks like Turkey might have even more reserves. China, Madagascar, and Mozambique also have pretty significant graphite reserves: https://www.statista.com/statistics/267367/reserves-of-graph...

moffkalast
0 replies
1d22h

Even if that were true, countries where most of the wealth is dug or pumped from the ground are the worst places to live, because it can all be easily extracted by foreign companies who pay a cut to the dictator directly, and the people have nothing to threaten the laughably rich autocrat with.

dudeinjapan
0 replies
2d

Brazil is the world’s third largest producer of graphite, behind China and Mozambique. Our current most reliable graphene production methods start with graphite and exfoliate it into single layers of graphene.

dtx1
0 replies
2d1h

Isn't graphene just carbon?

cassiogo
0 replies
2d

I believe you are thinking about Niobium and not Graphene

programjames
9 replies
2d2h

Where did they claim to make a chip? It seems to be just a single epigraphene layer.

jacquesm
8 replies
1d23h

'just'?

robertlagrant
7 replies
1d20h

A chip is a lot harder than a wafer, and a chip is what's claimed.

jacquesm
6 replies
1d20h

From nothing at all to a wafer of a new material is a giant step compared to the step from a wafer to a chip.

topspin
3 replies
1d18h

Is it? What is the plan for depositing these materials with precision to implement logic? Doesn't seem like conventional lithography is applicable.

h0l0cube
2 replies
1d17h

According to a sibling comment they already made a FET

https://news.ycombinator.com/item?id=39057303

topspin
1 replies
1d16h

That paper describes using e-beam deposition: they're using an electron beam to fabricate individual components in the lab.

That's fine for experiments. It has zero value for making devices with billions of components. Unless there is something akin to lithographic techniques, you can't make products economically.

jacquesm
0 replies
1d15h

It would seem a bit much to require that they jump from 'have a wafer' to 'volume production' overnight. But the fact that they can use electron beam deposition to fabricate indicates one very important thing (to me, at least): the same kinds of technologies that work on silicon wafers appear to be viable for these graphene-based ones. And that would allow recycling a whole pile of tech, possibly speedrunning through the last bunch of feature sizes because of all the experience gained with silicon. To me it says graphene will either work out in the next five to ten years or it will likely never happen. It would need to outperform silicon on at least one metric (density, power consumption, speed) to make it worth it for niche applications, and then bit by bit economies of scale would take over.

programjames
1 replies
1d3h

To be fair, epigraphene wafers already existed, they just weren't nearly as pure.

jacquesm
0 replies
11h52m

That's very much a key bit though. If you want to have any kind of yield.

margorczynski
5 replies
2d1h

The question is whether this can be processed and miniaturized as effectively as silicon. Still, as someone pointed out, they've already produced a transistor based on it, and the claim is that the same processes used for silicon can be used here, so I sure hope it works out.

mcshicks
1 replies
2d

Silicon based chips rely on oxide and metal layers to connect the transistors. Not clear to me from the article how that would work for graphene based devices. This was also an issue for GaAs based chips. From the Wikipedia article on GaAs

"The second major advantage of Si is the existence of a native oxide (silicon dioxide, SiO2), which is used as an insulator"

anticensor
0 replies
21h48m

Carbon conductors to connect between layers, and carbon dielectric as insulators. Carbon all the way.

EasyMark
1 replies
1d23h

Yep, we have much faster switching semiconductors than silicon-based ones, but that is like 10% of the deal; there are also miniaturization techniques, fab equipment, engineer/scientist expertise in the subject, conductor/semiconductor/insulator layers & interfaces, and oh so much more. It's nice to hear about advancements outside the regular silicon ecosystem, but it's a tough row to hoe to push silicon aside for a "better material". It's probably going to take a revolutionary (100x?) improvement over current tech to move to a new tech stack.

dotnet00
0 replies
1d22h

Since density with Si is approaching a wall and a switch in materials is thus likely to be necessary at some point in the next 2-3 decades anyway, I feel like rather than needing to be immediately revolutionary, all that'll end up mattering is where the theoretical limits are. After all, at that point, you need to retool and retrain anyway.

moffkalast
0 replies
1d22h

Even if it can't, ridiculously high-current MOSFETs without heatsinks FTW.

dylan604
5 replies
2d2h

Having a title for anything with "Researchers claim..." is such a red flag that I don't even want to click the link. I've been burned too many times. I know that fire is hot; I don't need to touch it again to find out. I'm a quick study, and after the 10th time, it took hold.

colordrops
3 replies
2d2h

Speaking of graphene, I've been hearing about its miraculous powers for years. Are there any practical applications yet?

theragra
0 replies
2d2h

It is widely used in industry, in composites for greater durability, strength, etc.

theragra
0 replies
2d1h

Also, for fun, search for nano tape. I think it uses carbon nanostructures, close relatives of graphene.

dotnet00
0 replies
1d23h

I'm not sure if they're available on the market already, but I recall that graphene Li-ion batteries are seen as a decent near-term upgrade, with the production issues of graphene reduced because they don't need large sheets of it.

bogtog
0 replies
2d2h

Isn't it notable that the article is at least acknowledging this uncertainty?

ngneer
4 replies
2d1h

The trouble is always with scaling and equipment. Unless one uses the existing infrastructure, there's a low chance of success.

px43
3 replies
2d1h

I also think that a fundamental issue with using carbon for compute is that we don't actually have a lot of it, and what we do have we largely need for all that organic chemistry that keeps us alive.

For reference, carbon makes up about 0.18% of the earth's mass, whereas silicon makes up 27.7%. The moon basically has zero carbon, and the regolith is 20% silicon.

AdamH12113
2 replies
2d

Far more of the silicon (from sand) goes into concrete and glass, though. And it's not like we're short on hydrocarbons. From what I can gather, the total yearly worldwide production of concrete is tens of billions of metric tons, fossil fuels are billions of tons, and silicon is millions of tons -- barely a blip on the radar.

klysm
0 replies
1d13h

It's not like we're anywhere close to short on Si though... not sure it's a relevant comparison

h0l0cube
0 replies
1d17h

And imagine running out of coal!

causal
4 replies
2d2h

If I'm reading this correctly, this would more accurately be a graphene wafer, right? There are no circuits on this thing; it's just a sheet of graphene. Still remarkable if so, just not what I would call a "functioning chip".

idiotsecant
0 replies
1d23h

The difference is just engineering once the material science is there.

gwill
0 replies
2d2h

yea, looks like the author doesn't understand the difference between a semiconductor and a chip.

The researchers said "functioning semiconductor".

deepsquirrelnet
0 replies
2d1h

Likely single devices patterned from an epitaxially grown graphene layer. It would be a functioning transistor, or set of transistors, albeit not terribly useful as single devices. Typically they would pattern the devices with large metal landing pads to drop probes onto, and perform characterization from there.

OnACoffeeBreak
0 replies
2d2h

I only scanned the article and the paper, but the paper linked in the article talks about the researchers characterizing the performance of a MOSFET.

kaptainscarlet
3 replies
2d2h

This could be the breakthrough that makes miniaturized AI more powerful.

programjames
2 replies
2d2h

This is still years away from mass production. The process takes hours at very high temperatures to make a single layer.

givinguflac
1 replies
2d1h

Nope, it takes about an hour for one layer.

FTA: “Then a high-frequency current is run through a copper coil around the quartz tube, which heats the graphite crucible through induction. The process takes about an hour.”

programjames
0 replies
1d3h

My mistake, I read the paper days before I wrote my comment.

foota
1 replies
1d22h

Are there any niches for super high speed chips that it isn't possible to fill with silicon based chips? Would there ever be a niche for a small company making handfuls of them, or would it only be feasible at scale?

Jweb_Guru
0 replies
1d19h

Sure. Network switches, for example, have a pretty insatiable demand for processing speed.

RecycledEle
1 replies
1d20h

This heating step is done with an argon quartz tube in which a stack of two SiC chips are placed in a graphite crucible, according to de Heer. Then a high-frequency current is run through a copper coil around the quartz tube, which heats the graphite crucible through induction.

So they use a tube furnace?

“The chips we use cost about [US] $10, the crucible about $1, and the quartz tube about $10,” said de Heer.

Tube furnaces do not cost $10, except when you build one yourself and don't count the thousands of dollars of equipment you have sitting around.

uSoldering
0 replies
1d19h

They are talking about the price of consumables in the process. You also need a building, electricity, employees, and a whole bunch of other stuff you can infer.

GalaxyNova
1 replies
1d16h

Moore's law is not dead.

klysm
0 replies
1d13h

I don't think this has any relevance (at least in its current state) for digital compute.

photochemsyn
0 replies
2d1h

Full research article:

https://arxiv.org/pdf/2308.12446

They did make a proof-of-concept device: "The electrical properties of the SEG were measured by characterizing a fabricated top-gated SEG FET."

foota
0 replies
1d22h

It's amusing to think about something like a several-thousand (probably more like billions?) transistor graphene-based chip outdoing a trillion-transistor behemoth.

I don't know if that's possible in reality though, since things like cache space are important regardless of your clock speed and there's no point in being able to do fast logic if you can't pipe in enough data.

LAC-Tech
0 replies
2d

Oh, very cool. I remember hearing about graphene in inorganic chemistry 101 and thinking that it was the neatest thing in the world. That and carbon nanotubes.

De Heer says that it will take time to develop this technology. “I compare this work to the Wright brothers’ first 100-meter flight. It will mainly depend on how much work is done to develop it.”

Something to look forward to!