Nvidia is now more valuable than Amazon and Google

behnamoh
152 replies
16h2m

Somehow I wish it weren't true because I hate monopolies in any market. Nvidia has been enjoying their absurd premium on consumer and production GPUs, and they intentionally removed NVLink from the GeForce 4090 series (the 3090 had it) to avoid cannibalizing their A100 and H100 business. With NVLink, people could chain multiple GPUs and increase the overall VRAM without much reduction in bandwidth. Without it, if you chain two 4090s, your effective bandwidth shrinks: you get more VRAM, but much slower LLM inference.
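
For anyone who wants to check this on their own hardware, here's a minimal sketch (assuming a PyTorch install and two visible GPUs; it checks peer-to-peer access, the capability NVLink provides at high bandwidth, and on paired 4090s it reportedly comes back False since driver-level P2P was dropped too):

  import torch

  if torch.cuda.device_count() >= 2:
      # True means GPU 0 can read GPU 1's memory directly; without
      # NVLink this falls back to PCIe at a fraction of the bandwidth.
      print(torch.cuda.can_device_access_peer(0, 1))
  else:
      print("Fewer than two GPUs visible; nothing to check.")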

I'm with Linus Torvalds on this—f*** NVidia.

FredPret
79 replies
15h57m

I understand the sentiment.

But the only reason you're mad at Nvidia is because they made something desirable (a GPU that does NN training) but didn't do it in the way you prefer (NVLink, Linux compatibility, etc.).

It's not their fault AMD and others have been asleep at the wheel and handed them the whole market for free. Any company with a monopoly on the hot new thing would try to capitalize on it.

When the other GPU makers eventually catch a wakeup, the situation will finally improve.

latchkey
20 replies
15h54m

AMD has figured it out and is in the process of catching up. It won't happen overnight though.

throwup238
19 replies
14h59m

Yeah we’ll have power positive fusion any day now.

khazhoux
13 replies
12h28m

Azure, Oracle, Meta, Tesla, Hot Aisle... all buying up MI300x.

You really should disclose that you’re a founder when you mention your company like that

latchkey
10 replies
11h50m

It is in my profile. Pretty well disclosed imho.

abraae
7 replies
11h34m

Insufficient disclosure. You must be aware that of the companies you listed, one (yours) sticks out as being unknown.

Don't die on this hill, just do the decent thing and add a disclosure. This doesn't show you in a good light.

latchkey
6 replies
11h23m

No hill to die on here. I did nothing wrong by mentioning my company. It is factual that I am buying these GPUs.

There is nothing in the guidelines or FAQ on this topic. So it is best that we conclude that you are making up a fake rule that you expect people to abide by, then claiming some weird superiority about how I look bad according to your made-up rules.

I do not need to tolerate that.

Classic HN… getting downvoted by the silent minority. Would love to see how you deal with bullies.

abraae
2 replies
11h0m

I did not intend to be confrontational, and you are correct that talking up your company without declaring your affiliation is not against HN guidelines. I feel it is, however, a convention that is widely followed, and for good reasons. Obviously your call at the end of the day.

sangnoir
0 replies
9h3m

TBF, they weren't "talking up" their company, they simply added the name to a list of companies doing something that's going against the grain.

To quote Bill Burr roasting Apple, there's a bit of "Einstein - Gandhi - Me" going on, but it's way below the threshold for identifying themselves as a founder, IMO.

latchkey
0 replies
1h58m

The guidelines state:

"Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity."

I was not "talking up" my company. I made a reference to it, by name, on a relevant thread. This is well within the guidelines.

sufficer
0 replies
9h15m

Man you are wild.

matt_daemon
0 replies
9h48m

Good way to ensure no one buys anything from your company.

inemesitaffia
0 replies
9h47m

You've f'ed up

khazhoux
1 replies
10h41m

It is a norm here and a common courtesy to let people know whenever one mentions their own company, product, or project. Otherwise, we can't tell when a comment is self-marketing.

And yes, yours was an instance of self-marketing, even if unintentional, by name-dropping your company alongside power players like Meta and Tesla. You put yourself on people’s radar as a GPU-intensive company.

latchkey
0 replies
2h19m

The guidelines state:

"Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity."

What I posted, just the name of my company, was entirely within the bounds of common courtesy.

Of course, I do want to put myself on people's radar... I am building a GPU-intensive company and I'm making a small relevant comment on a thread.

I looked it up. Over the years, there has been endless bike shed discussion on this topic, such as:

https://news.ycombinator.com/item?id=24353959

https://news.ycombinator.com/item?id=36404027

I think that it is smart of the moderators here to simply sit on the sidelines and not try to control the discussion of this too much.

cko
1 replies
12h17m

Aviato

latchkey
0 replies
11h50m

I loved that show!

throwup238
3 replies
14h37m

“Meta and Microsoft say…”

“Oracle is set to use…”

“Elon Musk implies…”

Not one of them has actually bought anything. AMD is just muddying the waters by grouping HPC sales with AI. It's nonsense.

Edit: I see you’ve got a startup on MI300x hardware. Now it makes sense, I guess - you need to believe. Good luck!

mlyle
1 replies
14h1m

> Not one of them has actually bought anything.

AMD isn't shipping in large volume yet, but MI300x instances can be spun up in preview on Azure today with availability ramping up. Microsoft has absolutely bought a lot of MI300x.

latchkey
0 replies
13h7m

> I guess - you need to believe.

Or maybe I know things you don't. =)

jayd16
11 replies
15h7m

"You're only upset because they <used their market advantage to exploit consumers>"

selectodude
8 replies
14h57m

Are you using your market advantage to exploit your employer, or do you feel like you're being paid a fair amount?

jayd16
6 replies
14h48m

I have exploitable market advantage? Explain.

selectodude
5 replies
14h14m

Do you work in tech in the United States? Congratulations, you’re one of the wealthiest people on the planet. That’s one hell of a market advantage.

iraqmtpizza
4 replies
11h23m

Nothing like what it would have been without a non-stop flood of H-1Bs

FirmwareBurner
2 replies
6h27m

Then offshore offices would be more numerous and larger. You can't really stop capital flight by banning migration. Corporations are more mobile than workers.

iraqmtpizza
1 replies
6h7m

Nope. H-1Bs allow bringing in people from a dozen countries and having them together in an American office. Moving the company to a worse tax and legal environment with worse latency and worse food and weather and entertainment and prestige doesn't mean those workers will all leave their home countries and move there instead. Nor would the locals necessarily even allow that.

EDIT: bro, you're the one that said corporations are mobile, not me. Secondly, all of my points still apply to a satellite office. Where the headquarters is or where it's incorporated is irrelevant. Thirdly, if they still do significant business in the US, then the US maintains massive leverage over them anyway. So, try again.

FirmwareBurner
0 replies
5h6m

Who said anything about moving the company? If you open another office offshore you don't move the company. The HQ still stays in the US.

> EDIT: bro, you're the one that said corporations are mobile, not me

I said it and you misunderstood it. Mobile in the sense that they can open an office in any country, not that they move the HQ altogether.

UncleMeat
0 replies
3h1m

Software developers create software development jobs. H-1Bs are a contributor to the fact that the US is the center of the software development industry. If companies weren't deciding to headquarter in the Bay Area to be at the center of things, there wouldn't be nearly as much upward pressure on US developer compensation.

eru
0 replies
14h51m

I'm not sure you can trust the (would-be) exploiter's opinion on whether the amount they are being paid is 'fair'?

FredPret
1 replies
14h45m

They gained market advantage by creating something that delights consumers. I'd call that a net win.

jayd16
0 replies
11h28m

The complaint is about losing those delights now that the market is secured by software lock-in.

The bait and switch.

sjwhevvvvvsj
10 replies
14h5m

If AMD's version of "asleep at the wheel" is them eating Intel's lunch for the last few years, I'd hate to see them awake.

Intel is weak, and it’s much smarter to put resources into squeezing Intel while they are on the ropes than shift too much focus to ML.

FredPret
5 replies
13h55m

I’m talking GPUs, not CPUs.

ptero
4 replies
13h15m

And he is saying AMD focusing on non-GPU markets and going from being a laughable underdog to a powerhouse hardly qualifies as "being asleep at the wheel".

FredPret
3 replies
13h10m

Alright, fair point, but that is also why they missed out on the AI boom.

iraqmtpizza
2 replies
11h25m

As if it was just there for the taking. AMD doesn't have the money to do what Nvidia does.

FredPret
1 replies
2h39m

Yes they do.

They were in the same ballpark until the past couple of quarters. Nvidia laid the groundwork for this ages ago by developing drivers that play nice with NNs.

AMD could easily have done the same at the time, and can do so now.

Even if it costs $1b-$10b to do it, AMD can afford it and it's worth doing. They've got $55b in equity, and a huge $280b market cap to issue more stock into. [https://valustox.com/AMD]

caeril
0 replies
1h47m

I dunno, man.

They're releasing APUs with tensor cores right now.

They're finally getting their dogshit ROCm drivers fixed right now.

Yes, they're late. But they're definitely playing. This whole "asleep at the wheel" thing may have been true a year ago, but it's not true now.

[edit] Apologies, responded to the wrong comment, but I'll leave it here anyway.

namlem
3 replies
13h59m

Yeah exactly. AMD has other markets to compete in. Their desktop, laptop, and handheld offerings are incredible right now. I assume they are also going to replace Nvidia for the Switch 2.

philistine
2 replies
12h30m

Nope. Nintendo is sticking with Nvidia. I'll eat my hat if they don't.

whaleofatw2022
1 replies
4h45m

It would be just a little surprising, since so far they have a pretty bad track record with 3rd parties.

paulmd
0 replies
1h40m

Says who, Charlie Demerjian and circa-2010 AMD fanboys echoing him?

Things like Bumpgate as a major social media mindshare thing largely stem from him and his coverage. And he leaves off the part where AMD GPUs were failing too (because of the same RoHS-compliant solder) and where Apple kinda wanted to part ways anyway, because they saw the "platform" Nvidia was building and didn't want any part of it on macOS… game knows game.

That's the problem: like the EVGA thing, there is a distinctly contemporary take, shaped by people like Charlie, and then there's the take with the benefit of a couple years of hindsight, where maybe it wasn't exactly the way the bunnyman said it was. Maybe Apple didn't want CUDA usurping them, and maybe Apple wanted to leave the way clear for the pivot to Apple silicon and Metal and the platform Apple wanted to build there.

https://blog.greggant.com/posts/2021/10/13/apple-vs-nvidia-w...

The problem is the AMD fanboys are just as noxious today and, just like the EVGA thing, it's gonna take years before people are willing to re-evaluate things fairly.

https://www.reddit.com/r/nvidia/comments/xgn7do/we_need_more...

https://youtu.be/vyQxNN9EF3w?t=5044

There's a little backstory like this behind almost all of the AMD fanboy "lore" around Nvidia. Crysis 2 was never a thing either: the whole point of wireframe mode is seeing the full geometry at maximum LOD, with no culling, for example. Nvidia never "sold directly to miners"; tech media didn't actually read the RBS article they were citing, and it doesn't say what they say it says. NVIDIA "recently said they're completely an AI company now" was 2015. Etc. There's just a group of people who get off on hating, and they don't mind making shit up if that's what it takes.

It's essentially impossible to unwind the popular opinion of NVIDIA from this ridiculous amount of past-hate... yeah, after 30 years of blood libel from ATI fanatics they probably do have an overall negative public opinion! People know the stories are true, because NVIDIA is bad, because there's all these stories, so they must be true.

llm_trw
10 replies
15h48m

There is _one_ other GPU maker.

ergocoder
2 replies
14h50m

What the hell is intel doing? I have been holding their stock for a few years now. Where is the moon?

newsclues
0 replies
14h45m

Hoping to build chips for others as they open their foundry business up

nealabq
0 replies
13h51m

ASML recently delivered their first high-NA lithography system to Intel, while TSMC is being more cautious, hesitating to commit to this very expensive machine. Intel is making a big bet, and it may (or may not) pay off. We'll probably know by the end of 2025.

Intel may end up back on top.

https://www.asml.com/en/news/press-releases/2022/intel-and-a...

https://www.datacenterdynamics.com/en/news/intel-receives-it...

https://www.tomshardware.com/tech-industry/manufacturing/tsm...

__float
2 replies
15h39m

Intel makes desktop cards too

PostOnce
1 replies
15h20m

and they have more memory bandwidth (per dollar) and more VRAM per dollar

the trajectory looks good if they can just resist giving up early without seeing instant success

giantg2
0 replies
14h31m

"just resist giving up early without seeing instant success"

Sounds like I've been doing life wrong.

philistine
1 replies
12h29m

Intel, AMD, Apple?

oblio
0 replies
11h52m

Apple's not super relevant for a mass market OEM type of discussion.

akmittal
1 replies
15h36m

They don't even take themselves seriously

throwoutway
0 replies
15h26m

I'd say they took themselves too seriously, and rested on their high margin laurels for too long

benreesman
9 replies
13h40m

I don’t have hard evidence of this, which makes it a thought experiment and not an assertion.

Big-cap tech is utterly notorious for some of the worst antitrust violations and other cartel-style behavior of any sector. This goes back at least as far as "Wintel" in the 90s, and probably further back than I watched up close. Suffice it to say that the Justice Department is extremely disincentivized to go after domestic economic Cinderella stories in a globally competitive world, and has had to bring lawsuit after lawsuit on everything from bundling to threatening OEMs to flagrant wage fixing in print (I do have it second-hand that the mass layoffs are coordinated, aka Don't Poach 2.0).

Crippling “gaming” (high margin but not famine-price gouging margin) cards, controlling supply tightly enough to prevent the market clearing at MSRP routinely, stepping at least up to and likely over the line on the GPLv2: these things or things like them have been ruled illegal before (though it’s hard to imagine that happening again).

It’s possible that tech is just a natural monopoly and we had it right with Ma Bell: innovation was high, customer prices were stable and within the means of most everyone, and investors got a solid low-beta return.

It’s easy to view the past with rose-colored glasses and the Bell Era wasn’t perfect, but IMHO this status quo is worse.

MichaelZuo
7 replies
12h42m

To be honest this just sounds like a bunch of talking points stuck together. Can you make a credible argument based on known case law/precedent/etc., if you have a legal angle to critique?

mtsr
5 replies
12h7m

Society isn't shaped just by current laws, but also by where we want things to go. If the current situation is undesirable, changing laws and regulations is the way to change it. (Of course, actually enforcing current laws and regulations can also go a long way in many cases…)

fauigerzigerk
2 replies
11h31m

So what changes to the law do you propose? None of the little annoyances that Nvidia is occasionally creating amount to anything that could have resulted in their current dominance in AI processing.

They made a good product. The competition is responding too slowly. And they got lucky with this crazy (in a good way) AI boom. I don't see how this could possibly be outlawed or why you would even want to. It will resolve itself on its own given enough time.

panick21_
1 replies
8h12m

How about right-to-repair? How about forcing better access and documented, attributable firmware? Open drivers for Linux?

The government, as a big customer, could demand these things, even without passing a law. The military should demand these things be open for various reasons.

I think Nvidia can still make plenty of money in such a world.

In cases where an interface has absurdly high value for society if it's a standard, the government could also 'buy' it and open it up. Just like they do with other infrastructure.

One could make the argument that the x86 interface should be public domain, as it amounts to infrastructure that most of society builds on. How exactly such a thing would work is of course up for some debate. But the concept of the government 'liberating' common interfaces makes sense from a societal perspective.

fauigerzigerk
0 replies
7h22m

> How about right-to-repair? How about forcing better access and documented, attributable firmware? Open drivers for Linux?

How would that have stopped Nvidia from dominating the AI market?

> The government, as a big customer, could demand these things, even without passing a law. The military should demand these things be open for various reasons.

I agree that governments should use APIs with multiple competing implementations (or one truly open source implementation) where possible. This could make a difference in some cases, but I doubt it would have had a big impact in this particular case as demand for GPGPU is overwhelmingly coming from the private sector.

> In cases where an interface has absurdly high value for society if it's a standard, the government could also 'buy' it and open it up. Just like they do with other infrastructure.

Agreed, but is there a legal reason why AMD and others are not allowed to create a clean room implementation of CUDA? Haven't they done exactly that with ZLUDA (which they have now defunded)?

I thought not supporting CUDA was more like a failed strategic move by competitors to prevent CUDA from becoming an industry standard.

I think we agree on all the relevant principles. I just don't see how any of these principles make a big difference in this particular case.

Also, I don't see the Nvidia situation as particularly problematic. They are not too entrenched to unseat. Some of their biggest customers (themselves huge oligopolists) are shaping up to be their biggest competitors.

Most of AI processing will be inference, not training. Bringing the costs down is absolutely key for broad AI use. My bet is that the hardware margins will end up being slim.

killingtime74
1 replies
11h43m

Sounds like it's just undesirable to you. (I don't own any Nvidia stock). Their growth to their current position in the market was organic. You don't just break down big companies for no reason other than them being big. If we did, there's a long line of companies ahead of Nvidia to be broken up first.

benreesman
0 replies
2h53m

I don’t see what being organic has to do with market failures (if in fact there were no shenanigans, which seems a stretch to posit at zero).

A market failure is a market failure.

There's a big lobby on HN who want to defend or minimize or justify leaving market failures be, which is weird given that market failures are really bad for most people on HN. But it's a free country.

But let’s call it what it is.

benreesman
0 replies
7h47m

I'm saying that we have "deregulated" big-cap tech to the point of not enforcing laws that we enforced even a decade ago. This is analogous to the last time a sector 100% captured its regulators: high finance in the Greenspan era. That ended poorly.

The likely height of big-cap tech regulation was the regulated monopoly of AT&T/Bell Labs/Western Electric all through the 20th century.

I prefer the outcomes in that era to the present status quo.

You can agree or disagree, and disagreeing is simple: "I prefer these outcomes because…". I welcome such disagreement. And FWIW I'm not one of the many people who downvoted you, but they were justified in doing so, because you moved a comparison of policies and their outcomes into a more abstract space of 1-bit generalizations, e.g. "talking points" vs "law/precedent". I gave examples; it's public record and trivially Googleable.

You didn’t fail to understand my point, you didn’t like it and took a cheap shot.

paulmd
0 replies
45m

Market segmentation generally benefits consumers though. Enterprise is willing to pay much more than an average consumer, and they buy a lot more PCs/GPUs/etc. than the average consumer, so if you banned market segmentation tomorrow the response would be that GPUs get a lot more expensive. You can see that happening when miners or enterprises buy up gaming GPUs (which they are doing right now with 4090s, for example). The clearing price wouldn't fall; it would rise.

The de-facto mechanism that's occurring with market segmentation is that consumers pay less than the clearing price and enterprises pay more. And consumers benefit from this subsidy.

If you banned it, you also wouldn't change the cost of developing new generations of products, so product development just would go slower, and the market would get even lumpier and less efficient (vendors squatting newer nodes would have an advantage, but it's inefficient to launch something immediately to compete with them, etc).

refurb
3 replies
13h44m

> But the only reason you're mad at Nvidia is because they made something desirable

This is true. It's like blaming the company for being successful. "Stop making products that are so much better than any other product".

I'm not sure we want that?

matwood
2 replies
11h0m

Not dissimilar from the conversations around Apple. They have made products so much better than any of the competitors' that people want to force them to change.

dns_snek
1 replies
5h37m

People want Apple to change because they can't handle Apple making such amazing products? That's certainly a brand new take. Elaborate?

matwood
0 replies
3h15m

It's pretty straightforward. Android provides all the openness people claim to want: different device makers, different ROMs, different stores, sideloading, etc. But people don't want to use Android or its hardware. People want to use iPhones and iOS and force Apple to add sideloading, stores, etc.

If Android were superior, no one would care what Apple did. Everyone would simply use Android.

qwertox
1 replies
13h11m

> have been asleep

I still don't see the required effort from Intel together with AMD to create an attractive alternative to CUDA.

Right now they're only getting looks because their devices are cheaper for the hardware you get, and, for big projects, because they're available. So you're required to have in-house experts who know how these platforms work in order to get stuff done that you know would definitely work on Nvidia.

The games Nvidia is playing with the consumer market are really annoying, but we have Intel and AMD to thank for Nvidia being in a position to do this. Microsoft is probably also at fault.

We're at a point where Apple only has to say "Oh, one more thing, you can now put Nvidia GPUs in your Mac Pro" for Intel and AMD to notice what position they've put themselves in.

lumost
0 replies
13h0m

Big companies are weird. There are probably 10 teams each at Intel, ARM, and AMD claiming that they can beat the H100 or build a better compute runtime.

Half those teams are full of crap, and a quarter will take the funds and try to do something like CUDA (but better), cough, ROCm. Then another eighth to a quarter will simply not have the political clout to get the whole thing done.

Add to this that we just funded 10 teams to get 1-2 functional teams… and you see why chasing an incumbent is hard, even when you have near-infinite money to do so.

jakogut
1 replies
14h9m

Unfortunately, I think developers handed them the market too; it wasn't just the competition not being up to snuff.

Developers valued their own development experience, features, and performance over value, efficiency, portability, openness, or freedom. The result is that anyone depending on CUDA became vendor-locked to Nvidia. Similar to the story with Direct3D, though thankfully there are now workable solutions for D3D implemented on top of Vulkan. Nvidia's CUDA moat may be less fordable.

bmicraft
0 replies
13h2m

Just today something very interesting came out: ZLUDA

It's a reimplementation of CUDA on top of ROCm, and it's a drop-in replacement you can use with already-compiled binaries. It's even faster than native ROCm/HIP in Blender.
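
If I read the README right, the drop-in part on Linux is just a loader path trick; a hedged sketch (the install directory and binary name are assumptions):

  import os
  import subprocess

  # Point the dynamic linker at ZLUDA's replacement CUDA libraries so an
  # unmodified, already-compiled CUDA binary loads them instead of
  # NVIDIA's and runs on ROCm underneath.
  env = os.environ.copy()
  env["LD_LIBRARY_PATH"] = "/opt/zluda:" + env.get("LD_LIBRARY_PATH", "")

  subprocess.run(["./my_cuda_app"], env=env, check=True)  # hypothetical app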

SmokeyHamster
1 replies
13h11m

Is that even logistically possible for rivals at this point?

You think AMD doesn't want to carve out a chunk of that pie?

What Nvidia is doing is hard, and the engineers with the skill to design those chips are few and far between. I'm sure Nvidia's already hired most of the best in the business.

Anyone who wants to unseat Nvidia will need very talented people, who aren't cheap, and that means massive investment capital which largely doesn't exist.

ActorNightly
0 replies
11h5m

If you look at graphics performance, AMD is already pretty much on par with Nvidia. All of the math in graphics and ML is largely the same.

What's missing is AMD focusing the software engineers who develop their drivers, and making them put work into ROCm to make it usable across all cards and all driver versions, like with NVIDIA.

thfuran
0 replies
14h28m

> Any company with a monopoly on the hot new thing would try to capitalize on it.

So what? "Any company with a monopoly" would also pay children in company scrip to work 20 hours a day in unsafe factories unless compelled to do otherwise.

quickslowdown
0 replies
14h33m

AMD being asleep at the wheel, or choked out by the Intel + Nvidia cartel?

iraqmtpizza
0 replies
11h32m

Any evidence that the much less well-funded AMD has been "asleep at the wheel" w.r.t. the GPU market?

Without going practically all-in on Zen, they'd be bankrupt.

ij09j901023123
0 replies
14h30m

The only people who buy AMD GPUs are gamers; AMD has long since stopped targeting the enterprise side of their GPU market. NVIDIA makes the majority of their profits by selling A100-and-up cards to businesses that need the power for computation (video rendering, LLM processing, etc.). AMD neglected and disregarded productivity, and now they are paying the consequences.

MuffinFlavored
22 replies
15h41m

> Somehow I wish it weren't true because I hate monopolies in any market.

If you peel the layers back... isn't the real monopoly ASML?

wbsun
21 replies
15h30m

I've always wondered why ASML hasn't faced any antitrust lawsuits over their monopoly.

blackoil
9 replies
15h14m

IANAL. Antitrust is not against monopoly, but monopoly "abuse". MS is free to make an OS which everyone wants and free to be a monopoly. When they tried to leverage it to gain market share in the distinct markets of browsers and media players, it became a problem.

So until ASML starts abusing the position, they should be good legally.

eru
8 replies
14h50m

Watch out, different jurisdictions have different 'anti-trust' laws. And, especially in common law jurisdictions, interpretations of a law can also change over time.

rezonant
6 replies
14h32m

Who's this warning for? Do we have any budding monopolists in the HN crowd who need the warning?

renewiltord
2 replies
12h53m

Everyone's learning to talk like a GPT.

eru
1 replies
12h32m

Eh, I was already uncool before GPT came around.

renewiltord
0 replies
12h4m

Then perhaps it's the other way around and you're the reason I'm sometimes warned by LLMs that I shouldn't kill processes.

blackoil
1 replies
14h26m

Maybe for non-lawyers like me who pretend to be experts in everything :|

eru
0 replies
12h36m

Yes, sort-of.

But no worries: in a democracy we expect ordinary people to have some interest in the laws; after all, they are supposed to be electing the people who decide how laws should be changed (or not). So laymen need to talk about laws, too.

I'm just always a bit cautious (or at least I should be). I know that eg in the US insider trading is about stealing secrets from your employer; but in eg France insider trading is about having an unfair advantage over the public.

I can imagine that there are jurisdictions that treat monopolies by themselves as a problem. (Perhaps France, again?)

Btw, for the US itself, have a look at https://fee.org/articles/the-myth-that-standard-oil-was-a-pr... to see how the prototypical case against Standard Oil wasn't really about monopoly abuse, either. At least no one really bothered proving that the monopoly was abused; they mostly just assumed it.

eru
0 replies
12h41m

The warning is for armchair speculators like you and me. Specifically:

> Antitrust is not against monopoly, but monopoly "abuse".

So this might be true of antitrust in the US right now. But I'm not sure whether it's true of antitrust law in, eg, France.

Also, in practice this was not true about anti-trust law in the US historically: Standard Oil was smashed into pieces without anyone proving in court that consumers had been harmed, or that the monopoly had been 'abused'.

See eg https://fee.org/articles/the-myth-that-standard-oil-was-a-pr... and https://www.econlib.org/library/Enc/Antitrust.html

pests
0 replies
14h45m

Alcoa :(

dralley
1 replies
14h42m

Because they're not a monopoly based on anticompetitive practices, they're a monopoly because what they do is so difficult and massively expensive that the monopoly is the only thing that makes it economically viable. The worldwide market for these machines is only perhaps a few dozen a year at most, it just can't support paying the R&D bills of multiple competitors.

ASML is also part-owned by their customers. Intel, Samsung and TSMC all invested in ASML to get EUV tech over the line - see aforementioned capital requirements.

pbmonster
0 replies
9h47m

Also, it's only really a monopoly at the top end, and only right now. It's not like ASML is the only company building litho machines. Canon and Nikon still sell a lot of machines; they just decided not to invest too much into competing at the top end over the last 10 years.

This could, theoretically, change at any moment, and Canon is trying something with their new generation of nanoimprint tech.

docandrew
1 replies
15h20m

Have they actually engaged in anticompetitive behavior or are they just the only player?

eru
0 replies
14h49m

Well, that would be for a judge (or jury?) to decide, if someone sued.

sjwhevvvvvsj
0 replies
13h57m

The answer is it's a state-sanctioned monopoly which concentrates control of advanced technological processes in the hands of NATO.

ASML is a strategic asset.

renewiltord
0 replies
12h53m

They're not the only lithography machines, just the only EUV lithography machines. It's not a crime to have a better product.

mikewarot
0 replies
15h16m

The relevant EU Law[1] seems to be "Any abuse by one or more undertakings of a dominant position within the common market or in a substantial part of it shall be prohibited as incompatible with the common market insofar as it may affect trade between Member States."

So, as long as it treats fabs in Europe fairly, it can do whatever it wants in the rest of the world.

[1] https://en.wikipedia.org/wiki/European_Union_competition_law

linksnapzz
0 replies
14h49m

Because the patents that they have for their EUV tooling are...licensed property of the US DoE. Someone doesn't want that disturbed.

ksec
0 replies
12h55m

I hope you are not being serious and it is missing /s somewhere.

dehrmann
0 replies
13h10m

I'm not sure if this is the motivation, but any sort of action against ASML would give Chinese fabs time to catch up.

Valakas_
0 replies
5h29m

It's not ASML's fault that 20 years ago, when Martin van den Brink (CTO) went full steam ahead to develop EUV, everyone laughed at him, saying EUV would be impossible to achieve. Their monopoly is well deserved. Other companies now have a few years of catching up to do, entirely through their own fault.

lukeschlather
17 replies
15h45m

I don't think Nvidia is worried about cannibalizing their A100 and H100 business; I think they're trying to keep the cards something resembling affordable for gamers.

AnthonyMouse
16 replies
15h22m

Then why wouldn't they just make more of them? The excuse is supposed to be fab capacity, but the 3070 has better performance than the 4060 etc. and is built on the older process which should no longer be in short supply.

aurareturn
6 replies
14h29m

  Then why wouldn't they just make more of them? The excuse is supposed to be fab capacity, but the 3070 has better performance than the 4060 etc. and is built on the older process which should no longer be in short supply.
There is actually very little margin in the midrange consumer discrete GPU market. The market for discrete GPUs has been shrinking since the mid 2000s.[0] Most GPUs are sold integrated, in SoCs and consoles, nowadays.

In a shrinking market, the midrange and low-end products will cease to be profitable. Hence, Nvidia's 60/70 offerings are lackluster because they don't make much money from them. They want you to buy the 80s and 90s cards.

Furthermore, node advancements have stopped scaling $/transistor. So the transistors aren't getting cheaper, just smaller.

Lastly, Nvidia wants to allocate every last wafer they pre-purchased from TSMC to their server GPUs.

[0]https://d15shllkswkct0.cloudfront.net/wp-content/blogs.dir/1...

AnthonyMouse
5 replies
14h10m

> Most GPUs are sold integrated, in SoCs and consoles, nowadays.

But those are mostly AMD, and that doesn't really have anything to do with what features someone puts on their discrete gaming cards, except insofar as it implies gamers don't need the cards some amateur ML hobbyist might buy.

> In a shrinking market, the midrange and low-end products will cease to be profitable.

That's assuming the products have high independent development costs, but that isn't really the case. The low end products are essentially the high end products with fewer cores which use correspondingly less silicon -- which have higher yields because you don't need such a large area of perfect silicon or can sell a defective die as a slower part by disabling the defective section, making them profitable with a smaller margin per unit die area.
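
To make the yield part concrete, here's a toy sketch under a simple Poisson defect model (the defect density and die areas are assumed, illustrative numbers, not real process figures):

  import math

  DEFECTS_PER_CM2 = 0.1  # assumed illustrative defect density

  def zero_defect_yield(area_cm2: float) -> float:
      """Fraction of dies with no defects under a Poisson model."""
      return math.exp(-DEFECTS_PER_CM2 * area_cm2)

  # A hypothetical flagship die vs. a half-size part cut from it:
  for area in (6.0, 3.0):
      print(f"{area:.0f} cm^2 die: {zero_defect_yield(area):.0%} defect-free")
  # ~55% vs. ~74%, and a defective large die can often still be sold as
  # a cut-down part with the bad section disabled.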

> Furthermore, node advancements have stopped scaling $/transistor. So the transistors aren't getting cheaper, just smaller.

Which implies that they can profitably continue producing almost-as-good GPUs on the older process node.

> Lastly, Nvidia wants to allocate every last wafer they pre-purchased from TSMC to their server GPUs.

Which is why the proposal is for them to make as many GPUs as the gamers could want at Samsung.

aurareturn
4 replies
13h59m

Customers who want midrange or low end GPUs are price sensitive. Therefore, midrange and low end GPUs are a low margin business. Hence, there's little to no reason to provide a very compelling product in those categories.

While midrange GPUs are cut down from high-end GPUs, they're still significantly more expensive to manufacture than, say, CPUs, at a transistor-to-transistor level. Look at an AMD 7950X transistor count, and then an RTX 4060 transistor count. The GPU has ~50% more transistors but sells at half the price. In addition, the GPU requires RAM, a board, circuitry, and a heatsink fan. The margins simply aren't there for low-end GPUs anymore.

Previously, Nvidia and AMD could make it up through volume. But again, the market has gotten much smaller, going from 60 million discrete GPUs sold per year to 30 million. That's half!

Based on your logic, AMD should feast on the midrange and low-end discrete GPU market because Nvidia does not have value products there. But AMD isn't feasting. You know why? Because there's no profit there for AMD either.

Once you stop thinking like an angry gamer, these decisions start to make a lot of sense.

AnthonyMouse
3 replies
11h45m

> Customers who want midrange or low end GPUs are price sensitive. Therefore, midrange and low end GPUs are a low margin business.

Customers who want petroleum are price sensitive. Therefore, petroleum exporting is a low margin business. This is why the Saudis make the profit margins they do. Wait, something's not right here.

> Look at an AMD 7950X transistor count, and then an RTX 4060 transistor count. The GPU has ~50% more transistors but sells at half the price.

You're comparing the high end CPU to the mid-range GPU. The AMD 8500G has more transistors than the RTX 4060 and costs less.

> In addition, the GPU requires RAM, a board, circuitry, and a heatsink fan.

The 8500G comes with a heatsink and fan. The 8GB of GDDR6 on the 4060 costs $27 but the 4060 costs $120 more. A printed circuit board doesn't cost $93.

> Previously, Nvidia and AMD could make it up through volume. But again, the market has gotten much smaller, going from 60 million discrete GPUs sold per year to 30 million. That's half!

That's not because people stopped buying them, it's because they shifted production capacity to servers.

> Based on your logic, AMD should feast on the midrange and low-end discrete GPU market because Nvidia does not have value products there. But AMD isn't feasting. You know why? Because there's no profit there for AMD either.

But they do though. You can find lower end AMD GPUs from the last two years for $125 (e.g. RX 6400) whereas the cheapest RTX 3000 or 4000 series is around twice that.

And anyway who is talking about the bottom end? The question is why they don't produce more of e.g. the RTX 3070, which is on the old Samsung 8LPP process, has fewer transistors than the RTX 4060, is faster, and is still selling for a higher price.

aurareturn
2 replies
11h30m

  And anyway who is talking about the bottom end? The question is why they don't produce more of e.g. the RTX 3070, which is on the old Samsung 8LPP process, has fewer transistors than the RTX 4060, is faster, and is still selling for a higher price.
What do you think? I gave you my reasons. Why don't you take a crack at your own question? There has to be a logical business reason, right?

  Customers who want petroleum are price sensitive. Therefore, petroleum exporting is a low margin business. This is why the Saudis make the profit margins they do. Wait, something's not right here.
One is a commodity. The other is about as high-tech as it gets. Completely different economic rules govern these products.

  You're comparing the high end CPU to the mid-range GPU. The AMD 8500G has more transistors than the RTX 4060 and costs less. The 8500G comes with a heatsink and fan. The 8GB of GDDR6 on the 4060 costs $27 but the 4060 costs $120 more. A printed circuit board doesn't cost $93.
One has an entire board that needs soldering, assembled on a manufacturing line, tested with many parts, and a team of dedicated engineers optimizing drivers constantly. The other is a CPU that is machine-tested and shipped with an unattached heatsink fan. Come on now.

  That's not because people stopped buying them, it's because they shifted production capacity to servers.
That's not true. The discrete GPU market has been shrinking for 14 years straight, with some crypto boom years here and there. See the chart I posted previously. Fewer and fewer people are buying discrete GPUs.

  But they do though. You can find lower end AMD GPUs from the last two years for $125 (e.g. RX 6400) whereas the cheapest RTX 3000 or 4000 series is around twice that.
AMD cards do not "feast" on the low end and midrange. According to the Steam charts, Nvidia still dominates midrange cards.[0] Furthermore, when I said "feast", I meant making profits. AMD does not make much profit from midrange or low-end cards.

The bottom line is, you keep wondering why no one is offering compelling value in the midrange, but there's a very obvious reason why: the profit is not there.

[0]https://store.steampowered.com/hwsurvey/videocard/

AnthonyMouse
1 replies
11h3m

> Why don't you take a crack at your own question? There has to be a logical business reason, right?

Selling 30M GPUs with a huge margin is more profitable than selling 60M GPUs with a modest margin, and they can point to Bitcoin or AI as an excuse.

But also, we're talking about them crippling the cards "for gamers" so there will be cards "for gamers". The premise of this has to be that they're supply constrained (artificially or otherwise), because otherwise they would just make more at the evidently profitable price gamers are already paying. It can't be a lack of demand, because the purpose of removing the feature is to suppress demand (and shift it to more expensive cards).

> One is a commodity. The other is about as high-tech as it gets. Completely different economic rules govern these products.

So you're saying that if a high tech product only has a limited number of suppliers then they could charge high margins even if customers are price sensitive.

> One has an entire board that needs soldering, assembled on a manufacturing line, tested with many parts, and a team of dedicated engineers optimizing drivers constantly. The other is a CPU that is machine-tested and shipped with an unattached heatsink fan. Come on now.

GPU manufacturing is automated. The CPU heatsink isn't attached because it mounts to the system board, not because attaching it would meaningfully affect the unit price.

Driver development isn't part of the unit cost; its contribution per unit goes down when you ship more units.

You can buy an entire GPU for the price difference between the 8500G and the RTX 4060.

> That's not true. The discrete GPU market has been shrinking for 14 years straight, with some crypto boom years here and there. See the chart I posted previously.

That's only because you're limiting things to discrete GPUs, while customers have increasingly been purchasing GPUs in other form factors (consoles, laptops, iGPUs) which have different attachment methods but are based on the same technology.

> According to the Steam charts, Nvidia still dominates midrange cards.

Steam is measuring installed base. That changes slowly, especially when prices are high.

> Furthermore, when I said "feast", I meant making profits. AMD does not make much profit from midrange or low-end cards.

They make a non-zero amount of profit, which is why they do it.

aurareturn
0 replies
10h45m

I think you answered your own question, really.

Although I would modify your statement slightly:

Original: Selling 30M GPUs with a huge margin is more profitable than selling 60M GPUs with a modest margin.

Modified: Nvidia and AMD must sell at a higher ASP because the market for discrete GPUs has shrunk from 60m to 30m/year.

That's your answer! It's what I've been arguing since my very first post. It isn't Nvidia and AMD's choice to have the market shrink in raw volume; it's because many midrange gamers have largely moved on to laptops, phones, and consoles for gaming since 2010. The remaining PC gamers are willing to pay more for discrete GPUs. Hence, neither Nvidia nor AMD bothers making compelling midrange GPUs.

I remember midrange GPUs that have great value such as the AMD HD 4850. I don't think those days are ever coming back.

autoexec
4 replies
14h52m

Is it an issue of supply? I figure they think non-gamers will be willing and able to pay far more for their hardware than most gamers could ever afford. If they sold comparable products to gamers at a low price, the non-gamers would just end up buying them up, and even if there were enough supply to satisfy everyone's needs, Nvidia would still be out a fortune.

It'd be better for them to sell low priced gaming cards that would perform poorly for non-gaming purposes and sell extremely high priced specialty cards to the people who want to use them for AI or crypto or whatever other non-gaming uses they come up with.

That'd at least keep the price of video cards low for gamers, avoid supply issues, and allow Nvidia to extract massive amounts of profit from companies with no interest in video games. The only downside would be that it makes it harder for anyone who doesn't have deep pockets to get into the AI game.

AnthonyMouse
2 replies
14h40m

> If they sold comparable products to gamers at a low price, the non-gamers would just end up buying them up, and even if there were enough supply to satisfy everyone's needs, Nvidia would still be out a fortune.

That's the point. They're not trying to do gamers a favor; if they were, they'd just make more cards. What they're trying to do is market segmentation, which customers despise and resent.

creato
1 replies
12h17m

I don't know; in the world you want, gamers don't have GPUs, and NVIDIA has less money. The only winners are people buying hardware for data centers.

AnthonyMouse
0 replies
11h42m

They could leave the connector on the GPUs and then make more of them. Then everybody has more GPUs and everybody wins except Nvidia. Or to put it the other way, in the status quo everybody loses except Nvidia.

solardev
0 replies
14h34m

They used to do this with the Quadro workstation cards. What happened to those?

ksec
3 replies
12h57m

> and is built on the older process which should no longer be in short supply.

I am not sure where that idea came from. The 3070 is built at Samsung's 8nm fab, which never had spare capacity to play with in the first place, especially with Samsung Foundry upgrading to chase the leading node.

AnthonyMouse
2 replies
11h34m

In general you want to upgrade your oldest fab to the newest node, or build a new one, instead of shutting down a recent one and taking it out of production even though there is still demand for that node.

There are still plenty of things being produced in fabs with older technology than that. GlobalFoundries is the third largest in the world and they're offering 12nm or worse. People buy it because not everything needs a node that's less than six months old, and the price is right.

ksec
1 replies
11h15m

Yes. But that is specific to GF and TSMC, where you have recurring customers on older nodes. Which is not true for Samsung Foundry.

AnthonyMouse
0 replies
10h52m

Why is it different? Wouldn't Samsung prefer to build new fabs rather than retrofitting old ones given the demand, since they would then have more customers and make more money?

system2
10 replies
15h51m

AMD is not too far behind. They can go wild in a few years.

behnamoh
8 replies
15h49m

In the LLM world, each month is a year and each week is a month.

AMD hasn't done anything substantive in the two years since GPT-3/GPT-3.5. Almost every research paper implements its algorithm in CUDA. AMD can't beat software with good hardware.

HWR_14
3 replies
15h14m

Didn't AMD just commit to a CUDA-compatible API? Did I hallucinate that?

zozbot234
0 replies
8h22m

Different kinds of compatibility. HIP is source compatible and officially supported. Zluda is the newly released project for running CUDA-compiled binaries.

tomoyoirl
0 replies
15h7m

If you mean the one posted here earlier today, I believe that article was more like “we paid a contractor to implement this, and then decided not to use it, so per our terms it’s open source now.”

lhl
0 replies
14h59m

Well, you probably read an inaccurate headline about it. The project is called ZLUDA https://github.com/vosen/ZLUDA and it had a recent public update because of the opposite: AMD decided not to continue sponsoring work on it:

> Shortly thereafter I got in contact with AMD and in early 2022 I have left Intel and signed a ZLUDA development contract with AMD. Once again I was asked for a far-reaching discretion: not to advertise the fact that AMD is evaluating ZLUDA and definitely not to make any commits to the public ZLUDA repo. After two years of development and some deliberation, AMD decided that there is no business case for running CUDA applications on AMD GPUs.
>
> One of the terms of my contract with AMD was that if AMD did not find it fit for further development, I could release it. Which brings us to today.

It's worth noting that while ZLUDA is a very cool project, it's probably not so relevant for ML. Also from the README:

> PyTorch received very little testing. ZLUDA's coverage of cuDNN APIs is very minimal (just enough to run ResNet-50) and realistically you won't get much running.
>
> However if you are interested in trying it out you need to build it from sources with the settings below. Default PyTorch does not ship PTX and uses bundled NCCL which also builds without PTX:

PyTorch has OOTB ROCm support, btw, and while there are some CUDA-only libraries I'd like ported (FA2 for RDNA, bitsandbytes, ctranslate2, FlashInfer, among others), I think sponsoring direct porting/upstreaming of library compatibility probably makes more sense. Also from the ZLUDA README:

> ZLUDA offers limited support for performance libraries (cuDNN, cuBLAS, cuSPARSE, cuFFT, OptiX, NCCL).
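
On the OOTB point, a small sketch of what that looks like in practice (assuming a ROCm build of PyTorch is installed): the AMD backend surfaces through the same torch.cuda API, and torch.version.hip tells the builds apart.

  import torch

  # On a ROCm build, a supported AMD GPU shows up through the
  # CUDA-flavored API; torch.version.hip is None on CUDA builds.
  print(torch.cuda.is_available())
  print(torch.version.hip or "CUDA build")
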
dartos
2 replies
15h17m

AMD has rocm and added a CUDA compat layer to it.

Nvidia is in the limelight, but their product (GPU compute) is a commodity.

Once someone else has it cheaper, then it’s a race to the bottom.

pests
1 replies
14h42m

They did not add a compat layer; where did you get this?

The recent news was about AMD giving up on that path.

dartos
0 replies
4h14m

https://github.com/vosen/ZLUDA

They still funded it and it was created.

brucethemoose2
0 replies
15h29m

There's more ROCm compatibility than you'd think.

I'm at a startup, and we'd love to be using MI300Xs. Our stack works with them, they are amazing, our wallet is open... but we can't! We simply can't find any. It seems they are unobtainium for anyone but megacaps.

dingi
0 replies
14h8m

Oh please, AMD's offerings are worse than Nvidia's in every imaginable way except the open source nature of their Linux drivers. ROCm is shit. Nvidia actually supports CUDA on almost any Nvidia card. ROCm not so much.

acchow
8 replies
13h35m

Big tech and AI startups would buy all the 4090s in existence if they had NVLink. You would not be able to find one for gaming without a multiple-x markup.

eastbound
7 replies
12h12m

So we're deeply hampering our AI startups, limiting their growth and possibilities by jacking up prices 10x, to reserve for gamers silicon that could perfectly well be used for AI?

Maybe gamers can wait and the next technological leap should have priority?

wastewastewaste
0 replies
11h50m

let the startups do some work on their own for once lol

oblio
0 replies
11h48m

This is a ridiculous argument.

People need entertainment, too, we're not machines.

If the current ML craze is really that impactful, it will happen regardless. If it can't break through it's because most of it is just a bunch of hot air.

imhoguy
0 replies
10h52m

> jacking up prices 10x

AI is a gold rush and Nvidia is selling the golden shovels.

They're trying to play it long-term, though. The gold rush ends at some point, and if they upset the "gardeners", the gamers, those may already be jumping onto AMD or even Intel shovels. That was already the case during the Bitcoin gold rush.

But I also wouldn't be surprised if this were a plan agreed at closed-door meetings between big corporations. They may want to kill any independent AI advantage and force everyone into their cloud walled gardens. The future will tell.

guappa
0 replies
11h21m

> the next technological leap should have priority?

lol

dns_snek
0 replies
5h26m

I'd say it would be mutually beneficial if AI startups just skipped the part where they produce nothing of value despite having access to every resource, and skip ahead to the part where they go out of business.

Your AI startup has a 0.1% chance of becoming successful (going by SV definition of success, as yet another rent-seeking, privacy-abusing SaaS) while millions of gamers have a 99.9% chance of enriching their lives with entertainment. Statistically that's just the reality of this situation if you insist on being utilitarian.

We keep the GPUs, Nvidia keeps a diverse customer base, and you don't have to waste years of your life doing self-important busywork. Deal?

blitzar
0 replies
8h41m

Won't somebody think of the crypto miners.

FloorEgg
0 replies
10h11m

Yeah I guess to hell with all those people who are building the next generations of games and gaming platforms, and the millions of people who are looking forward to them. Let's sacrifice all their purpose and enjoyment so your AI startup can get cheaper hardware and push out some AI product no one wants, just so it can go out of business anyway when the VC money runs out.

mr_toad
2 replies
12h51m

Nvidia purportedly enjoys a 1000% profit margin on H100s.

https://www.tomshardware.com/news/nvidia-makes-1000-profit-o...

epivosism
0 replies
10h16m

I think that is actually the markup number. Profit margin is capped at 100% since you can't profit by more than you sell something for.
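
Worked out with the article's rough numbers (both figures are estimates, assumed here as ~$3,000 to build and ~$30,000 sale price):

  # Sketch of markup vs. margin with assumed round numbers.
  cost, price = 3_000, 30_000
  profit = price - cost

  markup = profit / cost   # relative to cost: can exceed 100%
  margin = profit / price  # relative to revenue: always below 100%

  print(f"markup: {markup:.0%}")  # 900%
  print(f"margin: {margin:.0%}")  # 90%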

SideQuark
0 replies
10h12m

Only by ignoring all upfront costs, as also mentioned in the article.

Marginal profits need to be high when upfront costs are huge.

up2isomorphism
0 replies
13h7m

But all of Linux's arch-enemies are becoming more and more lucrative: Microsoft, Apple, Nvidia. Meanwhile, less evil ones like Sun died. In fact, these companies are so dominant now largely thanks to Linux and open source software.

pdntspa
0 replies
11h52m

NN research and crypto assholes ruined the market for gaming graphics. I'm glad NVlink was taken out.

pavelstoev
0 replies
14h43m

You can do this in software, effectively.

leetcrew
0 replies
12h3m

I don't love nvidia, but this is not the example I would choose to show why.

Cutting-edge hardware exists at prices that are affordable for hobbyists largely because a few key features for commercial use cases can be "artificially" turned off.

It sucks that you don't get to pay consumer prices for your highly lucrative application anymore, but the alternative looks a lot more like "tensor prices for GeForce SKUs" than "GeForce prices for tensor SKUs". We are already seeing this play out with crypto mining to an extent. I'd hate to see what would happen to the consumer market if AWS could just buy a bunch of RTX parts to rent out.

ksec
0 replies
12h49m

> Nvidia has been enjoying their absurd premium on consumer

This has been running wild on the Internet. If anything, Nvidia arguably earns less of a premium in the consumer market, with most of the initial R&D cost being amortized across their AI/datacenter chips.

kimixa
0 replies
12h28m

Design- and development-limited tech loves economies of scale. Stuff like software is functionally free per unit past development; hardware still scales very well, as per-unit costs are a smaller proportion of the end unit cost (and much of their advantage is also in software, like CUDA and DLSS, etc.).

The Steam survey suggests that Nvidia has over 90% of the GPU market, so for every design they sell somewhere near 10x the number of units; if they sell at the same margins, they get roughly 10x the development budget per design.
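
As back-of-envelope arithmetic (all numbers assumed purely for illustration):

  # Same assumed gross margin per unit, 10x the units sold.
  margin_per_unit = 100  # assumed dollars per unit available for R&D
  print(10_000_000 * margin_per_unit)  # $1.0B to amortize one design
  print(1_000_000 * margin_per_unit)   # $0.1B for a rival's design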

That's a lot of slack to lose to business inefficiencies, or competing with more specialized but smaller market devices. Assuming they don't just purchase such possible competitors when they pop up.

It may be that a monopoly is the "natural" end state of such tech markets. I think it's self-evident this isn't a good end state for consumers.

foobiekr
0 replies
11h24m

Nvidia’s value is completely out of line with their revenue and eventually that will catch up with them.

6510
0 replies
4h31m

The last comment here made me laugh

https://news.ycombinator.com/item?id=38144619

beebmam
63 replies
16h23m

It seems deeply overvalued to me, as it's pretty likely that NVIDIA's competitors will have parallel computation hardware that competes well enough. I'm personally not a fan of NVIDIA's drivers or the reliability of their hardware.

fnordpiglet
22 replies
15h59m

I'm not sure I would bet that an ads/spyware company, with admittedly deep, deep pockets, would beat a specialized, hyper-focused company at its core business, especially as Nvidia now has deeper pockets of its own and 30 years of experience in the subsector. If Nvidia were wandering in the desert like Cisco, I could believe it. But Nvidia isn't, and I don't believe Google, Amazon, or others will beat Nvidia at their own game. (Given my time in FAANG, I also speak from inside knowledge of how deeply f'ed these companies are; making your own ARM chip or network adapter isn't the same thing as taking on high-end designers at their own game.)

tuyguntn
7 replies
15h48m

Don't underestimate market forces.

The same could have been said about Intel, but Apple beat them anyway in some ways and took away a big chunk of market share.

AnthonyMouse
4 replies
15h0m

AMD has taken more CPU market share from Intel than Apple.

But the weird thing about the "Nvidia will remain undefeated forever" theory is that it seems to assume they have some kind of permanent advantage.

Nvidia was well positioned to make an early investment in this because they had the right combination of existing technology and resources. Other companies would have had to invest more because they were starting from a different place (e.g. Microsoft), or didn't have the resources to invest at the time (AMD).

But now the market is proven and there are more than half a dozen 800 pound gorillas who all want a piece of it. It's like betting that Tesla will retain their 2022 market share in electric car market even as the market grows and everyone else gets in. Maybe some of the others will stumble, but all of them? Apple, AMD, Google, Intel, Microsoft, Amazon and Facebook?

eru
3 replies
14h40m

It's like betting that Tesla will retain their 2022 market share in electric car market even as the market grows and everyone else gets in.

Yes. Especially once you take the Chinese electric car companies into account, that are already outselling Tesla.

AnthonyMouse
2 replies
14h21m

And even a Tesla optimist would presumably admit that maintaining a third of the EV market would be a huge win for Tesla, as the EV market becomes "the car market" going forward. Maybe that won't happen, but it's at least within the realm of possibility -- maybe some of the existing carmakers stick the transition and some of them fail, but Tesla remains the biggest one, that's not impossible.

But maintaining the 80% they had a few years ago, much less 100%? That's not optimism, it's fantasizing.

eru
1 replies
12h56m

Agreed.

Btw, I was talking about global electric car production. I don't know whether Tesla ever did 80% of global electric car sales?

AnthonyMouse
0 replies
12h27m

~80% was the US number from 2020.

fnordpiglet
1 replies
15h14m

Intel isn't at the top of its game and hasn't been for many years.

AnthonyMouse
0 replies
14h50m

Intel knows how to make software and libraries for their hardware, which is the thing people keep lamenting about AMD. Intel's current GPUs are mediocre but priced competitively for what they are, and Intel having more competitive hardware in the future is not implausible.

Which could lead to Intel realizing the opportunity they have. Create decent libraries that work across every vendor's GPUs. In the short term this helps AMD at the expense of Nvidia, which in itself helps Intel by preventing Nvidia from maintaining a moat. In the medium term Intel then has people using Intel's libraries to write code that will work on their future GPUs and then their problem is limited to producing competitive hardware.

rgmerk
4 replies
15h35m

Have you worked at a large (say 500+ employee) company that you would say isn't "deeply f'ed"?

I once read a fascinating corporate history of Xerox, and that company became deeply, deeply f'ed up in ways that were of their time but have strong parallels to the issues I understand FAANG, particularly Google, have.

Horffupolde
1 replies
15h19m

Could you please share any links or pointers for that? Sounds interesting.

rgmerk
0 replies
13h36m

One of them was called Xerox: American Samurai, and dates from 1986 (I read it when I was a teenager in the early 1990s, I think). There was another one from the early 1990s, I believe.

This book lauded Xerox's success at reforming its corporate culture, regaining a strong position in the photocopier market and even spent a chapter detailing their success in getting into electronic typewriters!

With the benefit of hindsight the company didn't survive its core product losing all relevance any better than Kodak did, but that was still some way in the future (if foreseeable by the 1980s).

That said, much of the material about a hugely bloated organisation, with a sclerotic bureaucracy and lots of cushy middle managers assembled through a previous period of explosive growth, turning out poor-quality product, sounds very reminiscent of some of what we now hear about the current big tech companies.

The title reflects another American obsession at the time - the idea that the US was "losing" to Japan.

eru
0 replies
14h41m

Have you worked at a large (say 500+ employee) company that you would say isn't "deeply f'ed"?

Perhaps Google until around 2010? Or Goldman Sachs until the 1990s?

dehrmann
0 replies
13h6m

Have you worked at a large (say 500+ employee) company that you would say isn't "deeply f'ed"?

You get a weird insider bias where all you see are bug reports and problems, so you get the impression the product is shit, even if 99% of customers love it.

pb7
3 replies
15h46m

Given my time in FAANG, I also speak from inside knowledge of how deeply f’ed these companies are

Please tell us more of your expertise and deep insight, FAANG employee #1,908,680.

Google has developed their own chips. Apple has developed their own chips. It’s really not that hard if your pockets are deep enough or the bottom line checks out.

When Apple launches their AI offering this year, it’s not going to need NVIDIA.

fnordpiglet
1 replies
15h14m

They have, but over many years, with considerable help, and not against a company at the top of its game. Apple's move is an ARM chip, not an Apple chip. Google's chips are competitive in spaces like network infrastructure. TPUs are illustrative of their inability to provide a viable alternative.

Apple isn't going to launch with an Nvidia-killing alternative; I'll bet you $1,908,680 it's backed by Nvidia.

They will likely use their neural chips for local models, but their data center stuff will be 100% Nvidia.

AnthonyMouse
0 replies
14h45m

The interesting question isn't whose hardware they use for the launch, it's what the public-facing software API looks like. Apple isn't likely to directly expose CUDA. At which point they're free to swap out the hardware with whatever they want at any time.

Also, Apple has a longstanding dislike for Nvidia and even if they weren't going to design their own chips at launch, they could be using AMD.

KaoruAoiShiho
0 replies
15h31m

Do you have any knowledge of what it runs on? I was pretty sure it was NVIDIA (accessed through cloud providers as a customer).

riku_iki
2 replies
15h49m

TPUs are a thing and have been competitive for a long time already, no?

kccqzy
0 replies
12h44m

TPUs are made primarily to satisfy Google's own needs (especially YouTube), not necessarily what GCP customers need.

fnordpiglet
0 replies
14h45m

Except they come tied to GCP and don't offer training and inference at as high a precision. These are pretty major disadvantages.

microtherion
1 replies
15h30m

It's interesting that you would name drop Cisco. They exploded in market valuation in 1999/2000, having a near-monopoly on infrastructure that was in high demand due to the Web boom. Nvidia similarly profited from the Crypto/AI boom, but I wonder whether that is bound to end similarly.

fnordpiglet
0 replies
11h37m

All things end, and Nvidia will probably decay at some point. But they’ve had a long and strong run.

dave_sullivan
14 replies
16h8m

I'm personally not a fan of NVIDIA's drivers or the reliability of their hardware.

As compared to AMD or Intel? I wish there were real competition to Nvidia, but there isn't. I'm not a fan of their de facto monopoly, but they do have the best product on the market, and their competition has been asleep for 10 years. AMD and Intel barely knew what deep learning was 10 years ago (and certainly did not appreciate the opportunity), while Nvidia was already investing heavily.

pas
5 replies
15h58m

Nvidia was already investing heavily

that sounds very interesting, can you link/share/describe some details on it? how much did they invest? how? into CUDA? what else?

dave_sullivan
2 replies
15h47m

My statements are based on high-level meetings I had at the time with all three companies, when I had started an early neural network PaaS company and was looking for them to invest. Nvidia knew what they were talking about and were already moving in that direction, Intel had heard about deep learning somewhere but didn't believe there was anything there, and AMD didn't know anything about anything.

wslh
1 replies
15h37m

Seems like a tale already told in "Only The Paranoid Survive" by Andrew Grove. Now Jensen should add some chapters. BTW, I just discovered that their web page/brand is fully invested in AI; the title is: "World Leader in Artificial Intelligence Computing".

I am curious about your PaaS because we were analyzing that business for fun first. A small thread here in HN: https://news.ycombinator.com/item?id=39329764

dave_sullivan
0 replies
14h45m

I am curious about your PaaS because we were analyzing that business for fun first.

Here's an old tutorial video of the product: https://youtu.be/V2q9hVdi80w

We were doing cool things and on the cutting edge but ultimately couldn't make a business out of it and weren't talking to the right people.

throw0101b
1 replies
15h21m

how much did they invest? how? into CUDA? what else?

CUDA 1.0 was released in 2007:

* https://insidehpc.com/2007/07/nvidia-releases-cuda-10/

* https://developer.nvidia.com/cuda-toolkit-archive

From SIGGRAPH 2007, "GPU computing with NVIDIA CUDA":

* https://dl.acm.org/doi/10.1145/1281500.1281647

"NVIDIA: The Era of the Personal Supercomputing":

* https://www.nvidia.com/content/events/siggraph_2007/supercom...

That was before AI/ML was hot, and even before the Bitcoin paper was released. NVidia was investigating, experimenting, and investing in the concept before there was any kind of 'killer app' for it.

pjmlp
0 replies
12h15m

And even better, NVidia understood not everyone wants to use plain old C for their GPGPU coding, and early on, starting with CUDA 3.0 in 2010, introduced C++ support and PTX.

Later on they acquired PGI, which, thanks to PTX, had C, C++ and Fortran compilers, thus adding Fortran into the mix.

That was followed by all the IDE and graphical debugger tooling and the library ecosystem.

Meanwhile Intel and AMD were doing who knows what at Khronos, stuck in a "C is good enough" mentality, and barely released useful developer experiences.

HDThoreaun
3 replies
14h23m

As compared to a hypothetical product that is better than what's available today. With literally a trillion dollars on the line, I find it very difficult to believe no one will come and scoop this opportunity up. The real value in GPUs is the datacenter segment, which largely didn't exist in its current state before the LLM take-off. It takes time to develop products, but they'll arrive eventually.

rezonant
2 replies
14h18m

Well there's not really a trillion dollars on the line. This is a valuation, not revenue.

All this says is that AI and LLMs are extremely over hyped, and the market believes Nvidia's tech is the only viable supplier of the platform LLMs run on.

These are things we already knew, so it's not surprising the market is quadrupling what it thinks Nvidia is worth.

HDThoreaun
1 replies
14h6m

Yes, there is a trillion dollars on the line here. Market cap is what matters to investors at the end of the day; everything else just influences it. Yes, the market is saying Nvidia is the only supplier for AI and LLM hype, but it's also saying they will continue to be the only supplier long term, and that seems deeply flawed to me.

rezonant
0 replies
13h0m

While there's certainly a feedback cycle involved, market cap is decided by investors in the aggregate. The stock price just reflects the buys and sells of those investors. People believe AI has a lot of money and hype in it, and Nvidia supplies necessary equipment for AI, so they buy Nvidia. As a result the stock price goes up, and the market capitalization goes up accordingly.

As a thought experiment, imagine buying 50% of a company on the open market, seeing the price go up accordingly, and then saying "it must be even more valuable than I thought!" and buying the other half of the company at the higher price. You caused that "value" by buying.

No one investor has a trillion dollar opportunity, and no competitor does either. Assuming a competitor will come along and zero out Nvidia because they have a better AI chip is not rational. For one, the value of Nvidia isn't solely based on this, and by the time you've made your AI chip the market will have changed.

Considering how insanely inflated the public's expectations of what LLMs can do are, despite the fact that they are only a mirage of intelligence, it would probably be foolhardy to build a new AI chip to chase them.

I suppose AMD could see their stock increase by such a margin if they could just produce a chip that convinced the market, but weren't they already trying to do that?

TillE
2 replies
13h1m

Ten years ago, Intel was the leader in chip fabrication technology. Things can change fast.

Intel has already been making good low-end GPUs (with lousy but rapidly-improving drivers). If they're smart, they'll keep at it.

hatenberg
1 replies
5h30m

Intel wasted the last decade because they elected a shareholder-optimizing CEO who prioritized shareholder payouts and attempts at controlling the ecosystem over the investments that would have pushed it forward.

Now they're completely outclassed by TSMC and have to partner with UMC to compete.

relativ575
0 replies
4h37m

The point is that things can change. Intel was king a decade ago. TSMC is on the summit at the moment. Intel can be back in a few years if their plan works out:

https://www.xda-developers.com/intel-roadmap-2025-explainer/

HenriTEL
0 replies
8h9m

Yep, I remember 8 years ago Intel trying to sell us their CPU solutions for AI. It was hard to make things work, and in the end the performance wasn't there. We were comparing against consumer GPUs like the GTX 1080.

rgmerk
12 replies
16h5m

Not having done any GPU programming...is most of the code out there tied to the Nvidia architecture? I mean, if AMD or somebody else builds a better mousetrap, how much work would it be to switch?

cowsandmilk
10 replies
16h0m

Most of the AI and ML related programming uses frameworks that abstract away what brand of GPU you are using. They couldn't care less whether it is Nvidia or AMD.
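
A minimal sketch of what that abstraction looks like in practice, using PyTorch (the backend details are from memory and worth verifying; AMD's ROCm builds of PyTorch also surface through the "cuda" device name):

    import torch

    def pick_device() -> torch.device:
        if torch.cuda.is_available():          # NVIDIA, or AMD via ROCm builds
            return torch.device("cuda")
        if torch.backends.mps.is_available():  # Apple silicon
            return torch.device("mps")
        return torch.device("cpu")

    device = pick_device()
    x = torch.randn(1024, 1024, device=device)
    y = x @ x  # the framework dispatches vendor-specific kernels underneath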

latchkey
5 replies
15h55m

This is true for making things just work; however, to really squeeze performance out of a GPU, you need to go lower level, and that is tied to the architecture.

virtue3
4 replies
15h39m

This has happened before, and it will probably go the same way. Software and compilers will make up the difference, or hardware will become so cheap and ubiquitous it won't much matter.

In 3-5 years, what will a 10% performance difference matter to you? Then calculate how much that 10% difference costs in real dollars to run on Nvidia hardware, and the fun math starts.

latchkey
2 replies
15h31m

Performance for GPUs isn't just speed, but also power efficiency. The complexity of GPUs doesn't lend itself to just being solved with better tooling. They are also not going to get cheaper... especially the high end ones with tons of HBM3 memory.

Given that data centers only have so much power and AI really needs to be in the same data center as the data, if you can squeeze out a bit more power efficiency so you can fit more cards, you are getting gains there as well.

When I was mining ethereum, the guy who wrote the mining software used an oscilloscope to squeeze an extra 5-10% out of our cards, and that was after having used them for years. That translated to saving about 1 MW of power across all of our data centers.

Let me also remind you that GPUs are silicon snowflakes. No two perform exactly the same. They all require very specific individual tuning to get the best performance out of them. This tuning is not even at the software level, but involves actual changes to voltage, memory timings, and clock speeds.

eru
1 replies
14h37m

You are right to worry about power efficiency. Though do keep in mind that power is also fungible with money, especially in a data centre.

I suspect a lot of AI inference (though probably not the majority) will happen on mobile devices in the future. There, power is also at a premium, and less fungible with money.

latchkey
0 replies
1h53m

Though do keep in mind that power is also fungible with money, especially in a data centre.

Untrue. I have filled 3 very large data centers where there was no more power to be had. Data centers are constrained by power. At some limit, you can't just spend more money to get more power.

It also becomes a cooling issue, the more power your GPUs consume, the more heat they generate, the more cooling that is required, the more power that is required for cooling. Often measured in PUE.
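
For reference, PUE (power usage effectiveness) is total facility power divided by IT power, so every watt saved in the GPUs also saves overhead watts. A quick illustrative sketch:

    it_power = 1.0e6           # 1 MW drawn by the GPUs/servers themselves
    pue = 1.5                  # total facility power / IT power (illustrative)
    overhead = it_power * (pue - 1.0)
    print(overhead / 1e6, "MW spent on cooling and other overhead")  # 0.5 MW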

spenczar5
0 replies
14h30m

We are so, so, so far away from compilers that could automatically help you, say, rewrite an operation to achieve high warp occupancy. These are not trivial performance optimizations - sometimes the algorithm itself fundamentally changes when you target the CUDA runtime, because of complexities in the scheduler and memory subsystems.

I think there is no way that you will see compilers that advanced within 3 years, sadly.
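
To make the gap concrete: even the most basic knob, the kernel launch configuration, is chosen by the programmer rather than the compiler. A minimal sketch, assuming Numba and a CUDA toolkit are installed (real occupancy work - shared memory, register pressure, restructured algorithms - goes far deeper than this):

    import numpy as np
    from numba import cuda

    @cuda.jit
    def saxpy(a, x, y, out):
        i = cuda.grid(1)          # absolute thread index
        if i < out.size:
            out[i] = a * x[i] + y[i]

    n = 1 << 20
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(x)

    threads = 256                          # block size: a hand-tuned occupancy knob
    blocks = (n + threads - 1) // threads
    saxpy[blocks, threads](np.float32(2.0), x, y, out)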

dheera
2 replies
15h16m

Except AMD produces nothing remotely close to the compute capability of an H100. They only compete at the gaming-card level.

edward28
1 replies
14h45m

They released the MI300, which beats an H100 and comes close to an H200.

saberience
0 replies
10h43m

Actually, this hasn't been shown yet. AMD showed some "on paper" specs which could "theoretically" be faster than an H100, but they didn't show any practical tests which could be recreated by third parties. Also, some of the tests AMD ran on H100s deliberately did not use the correctly optimized software, massively slowing down the H100's performance...

As a result there is a lot of scepticism as to whether it's actually faster in real-world scenarios. There are a few articles explaining this situation. Here is one: https://www.forbes.com/sites/karlfreund/2023/12/13/breaking-...

ric2b
0 replies
15h14m

And yet I see tons of stuff requiring CUDA.

gitgud
0 replies
13h5m

A lot of projects have trouble moving off of Nvidia's proprietary CUDA platform. An example is this [1] ML repo; the issue has been open for years...

[1] https://github.com/alicevision/AliceVision/issues/439

teaearlgraycold
8 replies
16h15m

Google is actually working on 1st party datacenter chips. Although I doubt those would be user-facing GPUs.

jeffbee
5 replies
16h5m

The last decade of Google datacenter engineering can be viewed as an elaborate plan to avoid paying either Intel or Nvidia any more money than was really necessary.

fnordpiglet
4 replies
15h59m

And, yet.

what_ever
3 replies
15h49m

Yet, what?

kccqzy
2 replies
12h40m

Yet, Google's customers still demand that Google pay Nvidia exorbitantly.

teaearlgraycold
0 replies
11h26m

Google has plenty of first party servers.

fnordpiglet
0 replies
12h4m

And intel.

ein0p
0 replies
15h7m

Nobody other than Anthropic and a few other large cloud customers like that really cares what Google is working on. They bet the farm on a DL framework (Jax) with 2% market share. That’s a very deep hole to climb out of, particularly considering the popularity of PyTorch in generative AI space.

Laremere
0 replies
16h5m

Google Pixels have a "neural core", and they have edge TPUs in addition to their data center TPUs. However Google hardware seems much less poised to take immediate advantage of the AI gold rush.

nicce
0 replies
11h27m

I'm personally not a fan of NVIDIA's drivers

The funny thing is that they are essentially a software company that focuses on software quality, if you look at the stats on how many of their people work on software. And still it is not good enough?

nalllar
0 replies
15h54m

Their biggest competitor in the GPU space is AMD who have spent years chasing deadlock issues that won't stay fixed, playing whackamole, in between trying to do actual driver development. Following the amdgpu mailing list is not fun.

eru
0 replies
14h42m

Feel free to short sell NVIDIA?

sebmellen
19 replies
16h25m

And with a P/E ratio of 95, versus 25 and 59 for Google and Amazon, respectively!

jgalt212
13 replies
16h12m

agreed, it's very high, but with M2 growing again such outlandish valuations seem sustainable.

https://fred.stlouisfed.org/series/WM2NS

brcmthrowaway
9 replies
16h5m

What is the implication of M2?

pclmulqdq
8 replies
16h3m

It's the amount of money in the economy, and its growth is one measure of inflation. GP may be commenting that the "good times" of super-high valuations are about to come back thanks to inflation.

That isn't the financial datum that really matters, though - what matters for P/E ratios is the risk free rate (which establishes the discount rate for the time value of money), which is still very high.
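
One textbook way to see that link is the Gordon growth model (a sketch, not specific to Nvidia):

    P = \frac{D_1}{r - g} \quad\Rightarrow\quad \frac{P}{E} = \frac{d}{r - g}

where d is the payout ratio, g the growth rate, and r the discount rate (risk-free rate plus an equity risk premium). Hold d and g fixed, and a higher risk-free rate mechanically compresses the justified P/E.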

jgalt212
5 replies
15h51m

what matters for P/E ratios is the risk free rate

Where's the research showing the empirical relationship between P/E ratios and the risk free rate?

Also, we are living in a time of unprecedented monetary aggregate growth (for the US at least). I posit this is why the yield curve has been inverted for so long and yet there is no recession in sight. The predictive power of asset prices seemingly no longer exists.

pclmulqdq
4 replies
12h27m

There's a ton of it: here's one person's regression. You can find the same in academic papers (although economics papers don't exactly deserve that label).

https://www.currentmarketvaluation.com/posts/sp500pe-vs-inte...

It's also just common sense if you understand company valuation.

jgalt212
3 replies
6h18m

It's not 1:1, and it only fits over long periods of time. Furthermore, none of those studies encompass the great money-printing period of COVID and post-COVID, and most of them model the 10Y rate vs stocks. The 10Y rate is not the risk-free rate.

In 2000 the S&P 500 P/E was 40 and fed funds was 6%; it's in the low 20s now, and fed funds is 5.25%.

Studies over more recent times have shown this correlation break down. It's probably the reason Ray Dalio retired.

FredPret
2 replies
3h1m

If a 10 yr T bill isn’t risk free, what is? A shorter term US Gov bond?

I mean realistically the US will just print the money it needs to pay that interest.

It’s not a consequence-free decision for them, but much better than defaulting.

jgalt212
1 replies
2h39m

The 10Y has duration risk. 1-week T bills have no credit risk or duration risk, hence that is the risk-free rate.

pclmulqdq
0 replies
48m

That's not true at all. The "risk free rate" is a theoretical concept that represents the rate of return that you can get on capital without taking any risk. No bond alone is the risk free rate. Neither is the reverse repo rate or the rate of a money market. All of these are proxies.

FredPret
1 replies
15h48m

By risk-free rate, do you mean the yield on US gov bonds?

pclmulqdq
0 replies
12h25m

"Risk-free rate" is a theoretical concept that represents the rate of return you get without taking any risk. T bill rates, reverse repo rates, and money market savings rates are a practical proxy for it, but it's not any one of these.

throw0101b
2 replies
15h13m

agreed, it's very high, but with M2 growing again such outlandish valuations seem sustainable.

You know whose M2 has also been rising for decades? Japan's. And yet for most of that time the Nikkei has been flat (even negative):

* https://fred.stlouisfed.org/graph/?g=17sx4

China's M2 has also been going up steadily:

* https://fred.stlouisfed.org/series/MYAGM2CNM189N

What has the Shanghai Stock Exchange (index) been doing lately?

The UK's M2 has been going up continuously:

* https://fred.stlouisfed.org/series/MSM2UKQ

How's the FTSE 100?

And the US's own M2:

* https://fred.stlouisfed.org/series/WM2NS

Now let's overlay the S&P 500:

* https://fred.stlouisfed.org/graph/?g=1gvKR

A giant spike in M2 in 2020, and yet at the same time the S&P 500 dropped. M2 has been on a downward trend since April 2022, and the S&P 500 bottomed in ~October 2022, but has been rising since then—while at the same time M2 has been dropping.

jgalt212
1 replies
6h25m

you're missing the investor preferences bit. You could print $7T or whatever crazy amount Sam Altman wants, and if everyone puts that money under their mattress instead of stocks or spending, there will be no observable effect on the real or financial economy. In short, the transmission mechanism of money creation to real effects is laggy and and somewhat unpredictable. e.g. why NVDA and not gold?

throw0101b
0 replies
2h30m

you're missing the investor preferences bit. You could print $7T or whatever crazy amount Sam Altman wants, and if everyone puts that money under their mattress instead of stocks or spending, there will be no observable effect on the real or financial economy.

If you go through my posting history you'll see I've more than once mentioned that velocity is much more important than simple quantity.

A good analogy from Cullen Roche that I often use:

But also – why do so many people insist that inflation is an increase in the money supply? This makes zero sense. Here’s why – our economy is mostly a credit based economy. So, if I take out a loan for $100,000 then the money supply has technically increased by $100,000. But what if I don’t actually tap that loan? What if I borrow the money because, for instance, house prices just went up 25% and I want to have some cash around for emergencies? This doesn’t tell us anything about prices, living standards or really anything. But this is what so much of the money supply represents – money that has been issued and is just sitting around unused. Why is this useful? It’s like calculating your weight changes by counting how much food you have in your refrigerator. No. That’s potential calories consumed and potential weight gain. The amount of food in your fridge tells you little about your future weight changes just like the amount of money in the economy tells us little about the actual price changes in the economy.

* https://www.pragcap.com/three-things-i-think-i-think-i-see-d...

what_ever
1 replies
15h48m

Nvidia hasn't released its latest earnings yet.

MuffinFlavored
0 replies
15h40m

NVIDIA Corporation Common Stock is expected* to report earnings on 02/21/2024 after market close.

tomoyoirl
0 replies
15h56m

Trailing or forward earnings?

refurb
0 replies
13h39m

The thing with P/E ratios is that they are backwards looking while stock prices represent future estimations.

Bankers put together detailed models to estimate the earnings per share in the future. Yes they are estimates, but informed as best as possible.

The model spits out an estimated $/share. If the current price is lower than that estimate, they think the future value of the share is higher than the current price, so they buy or convince others to buy.

foolswisdom
0 replies
15h1m

That's a P/E ratio based on the past four reported quarters, over which time earnings at Nvidia increased so much that the earnings for the last reported quarter were over 4 times those of four quarters ago. Assuming that future earnings for the next year are consistent with the last reported quarter, they're valued at 45x earnings. But of course such a valuation still only makes sense assuming growth (and the projected earnings for the next quarterly report are indeed expected to be 12% higher, giving a 40x P/E valuation). Consider that Amazon's P/E (using the last reported quarter as the baseline for the future - Amazon has had consistently increasing earnings over the last year, let's assume it doesn't decrease from here) is 43x.

FredPret
18 replies
15h52m

Nvidia has seen a fantastic, geometric increase in profit [1]. This is exactly the kind of thing growth investors hope for.

However, their PE ratio is currently 93. This is the kind of PE growth investors buy at, hoping for geometric earnings growth to kick in, not the situation after such growth has taken place.

Which means they'll have to keep 2x-ing that profit for a long time to come.

It could be doable but I don't see it. Even if they completely absorb Intel and AMD's market share, it doesn't make sense.

[1] https://valustox.com/NVDA
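
To put rough numbers on the "keep 2x-ing" point, a toy calculation holding the share price constant:

    pe = 93.0
    price = 1.0               # normalize the share price
    earnings = price / pe
    for year in (1, 2, 3):
        earnings *= 2         # profit doubles again this year
        print(year, round(price / earnings, 1))
    # prints 46.5, 23.2, 11.6: it takes several consecutive doublings
    # before the multiple looks like an ordinary mature-company P/E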

system2
11 replies
15h35m

Is it always about earnings? The market is all about speculation and trust. Unless a competitor obliterates them, I don't see their stock tanking more than 10%. Look at Apple, Google, or even Boeing. Boeing's doors are flying off its planes, yet the stock is still higher than it was after the October 2023 crash. Nvidia can just go for a stock split and take care of the numbers quickly.

FredPret
9 replies
15h22m

It's about earnings (freshly made money) and equity (existing money) and growth potential (future money).

One way or another, they need to have money in order to distribute it to me, the investor (through dividends or buybacks).

At a valuation of 90x earnings and only $0.02 in equity per dollar of market cap, I have to pay an insane premium to acquire:

- an anemic stockpile of equity ($50b in assets - $22b in liabilities = $28b in equity. Yours for only $1500b!)

- a tiny cashflow of 1/90 of my investment

- admittedly big growth potential

So the question is - how big is that growth potential? I bet it's not high enough to justify 90x earnings.

system2
8 replies
15h1m

Unrelated question: Do you have any stocks you are following and expecting to go high soon?

__loam
6 replies
14h10m

Don't take financial advice from hacker news. Go to the bogleheads wiki, get a handle on your personal finances, then invest using a standard 3 fund spread in index funds. Picking individual stocks is just gambling.

saberience
2 replies
10h35m

Don't take this advice unless you want massively subpar returns. The idea that picking individual companies is "gambling" is, quite frankly, ridiculous. The information around NVidia being a great company has been around for years and years and was obvious to anyone who took a look at the company's performance.

The same is true for companies like Netflix or Shopify. I invested in all three of the these companies for the first time in around 2013 and 2014, why? Because I did my research and due diligence and could see they were incredibly well run, operating in areas with lots of room for growth.

If I had just invested in index funds, I would have much, much less money than I do now.

If you're reading this comment and you're younger than 60, do not take this advice. Your risk tolerance should be higher when you're younger, and buying good companies at good prices is not gambling; it's what Warren Buffett always did, and it's the best way to grow your net worth.

__loam
0 replies
8h1m

A lot of shit can happen to individual companies that is unanticipated or unlucky. For example, Boeing had been a pillar of American engineering for a long time. We had no reason to doubt that the producer of one of the most prolific and reliable jet liners would ever have any issues, but then it did. The facts are that it is incredibly hard to predict the trajectory of individual companies even with insider information. The hedge against acts of God is diversification, and the best way to do that is index funds.

Frankly, your advice to do your own research and only invest in "good" companies is incredibly irresponsible. That you survived and did well is not proof that your strategy is good, just that you in particular got lucky. The most reliable way to grow and preserve wealth is diversification.

FredPret
0 replies
2h50m

This can work out so great if you do it well.

In the legendary books Security Analysis and The Intelligent Investor, they recommend this approach if you are serious about researching, and they also recommend diversifying into a variety of stocks - certainly more than 3.

Personally I like your approach much more and it’s why I’m building https://ultimatestockpicker.com.

I feel like there’s a ton of money rushing into the SP500 and it’s juicing the valuations far too high while there are great companies out there trading at a PE of 5-10.

FredPret
2 replies
13h56m

This is great advice for most.

But the more people who do this, the more overvalued the indexes will become, and the more capital-starved the non-index companies will become (even though they are still very good businesses!).

__loam
1 replies
7h55m

Don't hate the player, hate the game.

FredPret
0 replies
2h49m

I’m not saying you’re wrong or should do it my way!

Just explaining why I’m taking the road less travelled.

FredPret
0 replies
14h35m

I'm always careful to calibrate my expectations to "one day far in the future this might work" versus expecting highs soon.

I recently exited positions in US Steel and Encore Wire which were selling for single-digit PE ratios and with >$1 equity per $1 market cap at the time. My plan was to hold them forever, but the price just shot up so much and there are other things to buy.

I like really boring businesses like banks, utilities, airlines that can trudge along for decades if they have to before I get my return. When you catch them in a bad news cycle, you can pick up a solid business for less than it's worth.

HDThoreaun
0 replies
14h10m

At the end of the day, GPU compute is a commodity. I just can't see how Nvidia has differentiated itself in a way that will last a decade plus. There's too much money on the line for every competitor to be asleep at the wheel here. Eventually a competitor will have a compelling product, and then Nvidia will have to slash prices.

what_ever
2 replies
15h46m

Wouldn't 2x-ing their profit over the next year get the PE down to half? Why do they need to 2x their profit for a long time to come?

FredPret
1 replies
15h42m

Half is still crazy; 1/4 is what I would consider a little high (I'm a value investor). Especially for a tech company, I don't understand the rationale behind high valuations.

A high valuation makes sense to me if the asset will deliver a reliable cashflow over time, or if there's a high chance it'll start making much more cash in years to come.

More reliable / more growth = higher premium.

But tech companies come and go. Admittedly, Nvidia is more of an infrastructure company, but nobody can guarantee that they will be relevant 20 years from now, so even a PE of 20 seems risky to me.

what_ever
0 replies
55m

Value investors often miss exponential/crazy growth stories.

Ologn
2 replies
15h3m

What is their trailing P/E and what is their forward P/E?

Their trailing P/E includes the quarter ending January 2023, when their earnings were 1.41 billion and revenues were 6.05 billion. Their last quarter's earnings were 9.24 billion, and revenues were 18.12 billion. Their earnings last quarter were 50% more than not just the earnings but the entire revenues of that quarter a year earlier.
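
A rough annualization sketch of those figures (the market cap is the ~$1.83T quoted later in the thread; only two of the four trailing quarters are given here, so treat the numbers as illustrative):

    market_cap = 1.83e12
    q_jan_2023 = 1.41e9   # earnings, quarter ended January 2023
    q_latest   = 9.24e9   # earnings, last reported quarter

    print(market_cap / (4 * q_jan_2023))  # ~324x at the old run-rate
    print(market_cap / (4 * q_latest))    # ~49x annualizing the latest quarter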

martinpw
0 replies
14h51m

This is the right way to look at it. The P/E ratio of 90+ includes two quarters of relatively low profit before things really took off in the most recent two quarters.

Annualizing the forecast earnings for the next quarter leads to a P/E of 38. Still high, but not outrageous: https://www.thestreet.com/memestocks/others/will-nvidia-stoc...

FredPret
0 replies
14h21m

What you're saying is equivalent to expecting the growth curve to continue for some time.

Forward P/E hasn't happened yet and might never happen. Corporate earnings projections are numbers that come out of somebody's Excel and are justified by looking reasonable to an expert. That's a lot better than nothing, but I'm not betting the farm on it.

ralph84
14 replies
16h5m

Nvidia is still run by a founder, while the other two aren’t. I’d always bet on a founder over a McKinsey consultant or a Harvard MBA.

JoshTko
8 replies
15h41m

This is a silly take; the two largest companies are headed by non-founder MBA grads, each of whom grew their respective companies ~10x.

Retric
4 replies
14h27m

Which is underperforming compared to Nvidia.

It’s not that founders are more skilled, it’s that successful ones who stick around have different priorities. The long term performance of their existing stock is worth more than hitting targets. Not a guarantee of success, but something to consider.

magicalist
3 replies
13h59m

It’s not that founders are more skilled, it’s that successful ones who stick around have different priorities. The long term performance of their existing stock is worth more than hitting targets.

We're still talking about the current first derivative of their stock price. Let's not go overboard reading those tea leaves.

Retric
2 replies
13h53m

The market is pricing in those tea leaves. Look at the P/E ratios of Nvidia (95) and Facebook (31) vs Apple (29) and Google (25).

outside415
1 replies
10h53m

P/E is stupid. At least do forward P/E or DCF... c'mon.

Retric
0 replies
1h35m

Forward PE is tea leaves built on tea leaves.

rezonant
2 replies
14h27m

Meanwhile, I'm of the opinion that Google started its slide downhill during Larry Page's time as CEO.

Somewhat ironically, I think Google's brightest days had Eric Schmidt (MBA, "adult in the room") as CEO.

This isn't to say MBAs are always a good thing either, and certainly Google isn't a typical example.

VirusNewbie
1 replies
12h42m

Eric Schmidt started his career as a hacker at Bell Labs. He is an MBA in name only.

gregw134
0 replies
11h4m

Bell Labs, Xerox PARC, and Sun Microsystems. Quite the technical resume.

Early in his career, Schmidt held a series of technical positions with IT companies including Byzromotti Design, Bell Labs (in research and development),[19] Zilog, and Palo Alto Research Center (PARC).

During his summers at Bell Labs, he and Mike Lesk wrote Lex,[23][7] a program used in compiler construction that generates lexical-analyzers from regular-expression descriptions.

In 1983, Schmidt joined Sun Microsystems as its first software manager.[19]

Wohlf
1 replies
14h56m

There are plenty of founders who ran their companies into the ground, and even more who never got off the ground in the first place. There's no guaranteed recipe for success.

Jensson
0 replies
14h50m

We are talking successful founders here, that is a very different group from all founders.

There's no guaranteed recipe for success.

Yes, which is why he said bet on, not know it will win.

kdnvk
0 replies
13h34m

What a minor thing to extrapolate into your world model.

hasmanean
0 replies
15h51m

Well a founder can make decades long bets.

Any other company would have had an annual review of progress or a review any time mid-level leadership changed hands. Assuming a 50/50 chance of survival each time, the odds of the project lasting 10 years would be 1 in 1024.

__loam
0 replies
14h51m

I will never understand the culture of founder worship.

ein0p
13 replies
15h12m

As much as I'd like to say it's overpriced, its only potential competitor, AMD, is even more overpriced, with an eye-watering P/E of 324 as of right now - three times that of NVIDIA. Such numbers are stark raving insanity for hardware companies. For comparison, Apple, fully in possession of the world's most valuable brand, a gigantic money printer powered by a loyal, billion-plus-user customer base, and a deep pipeline of products, has a P/E one third that of NVIDIA and 1/10th of AMD. Make this make sense.

yCombLinks
5 replies
15h7m

The forward PE is 31 for AMD, and Nvidia is at 33. Apple's forward PE is 26. The market prices are based on future expectations. Nvidia is expected to increase earnings by something like 3x, and AMD by 10X

nerdponx
2 replies
14h5m

Apparently, AMD can remain irrational longer than retail investors can remain solvent.

yCombLinks
0 replies
13h51m

Right, I'm not saying they will hit those projections; that's just what they are, though.

gpt5
0 replies
12h6m

To be fair, it's much easier to increase your profits when your profit margins are 5% (AMD) vs 50% (Nvidia)

givemeethekeys
1 replies
10h24m

How does one find information on forward P/E?

Are there historical forward P/E charts to get some context on whether the current prices are expensive / in-line with how tech is valued over the past ten.. twenty years?

Thanks!

airstrike
0 replies
5h34m

Generally, any forward metric is the consensus (mean) of analyst estimates, unless you have your own model for what you think it will be (in which case you probably are still callibrating it around the street's view)

Since you aren't going to have an account with every broker to get their research, instead you'd sign up for one of Capital IQ, FactSet or Bloomberg which aggrgate the info for you and are listed here in increasing order of price and quality

Banks would have the chart you're looking for but they're fairly easy to recreate if you have the data. (Source: I worked in M&A for nearly a decade in such banks)

HDThoreaun
2 replies
14h15m

Intel is just as much a competitor to Nvidia as AMD. I suspect they will work together on creating a CUDA alternative, which is the biggest problem both of them face. If they can figure it out, I don't see how Nvidia isn't fucked, as the money is in the big cloud purchasers who have every incentive to switch as soon as a legitimate alternative presents itself. Even if AMD/Intel don't beat Nvidia, coming close with a product that "just works" for half the price is certainly on the table and will be compelling for the hyperscalers.

ein0p
0 replies
13h40m

Intel can’t even make a competitive CPU anymore, let alone a competitive compute-oriented GPU. It’s way behind on lithography, with no end in sight. Their GPUs also aren’t properly integrated with any popular DL framework. I honestly have no idea what they’re thinking. If I were them, at least PyTorch would work flawlessly with their current Arc offerings. It doesn’t - it’s a decidedly “enthusiast” affair for people with too much time on their hands. You don’t even need a huge team for this. 25-30 competent C++ folks for 9 months to a year is all it takes.

dehrmann
0 replies
13h3m

CUDA alternative

They should name it OpenCL.

linksnapzz
1 replies
14h48m

"Nowhere to go but up!"

ein0p
0 replies
10h58m

Up? Idk, I certainly don't have the nerve to buy into a position at this kind of P/E. But they don't need to run faster than the bear, and their next best competitor has the agility of a stoned sloth when it comes to the software side of things.

eru
1 replies
14h42m

Feel free to short sell both?

Fyrezerk
0 replies
4h41m

Is this the only reply you can come up with in this post? Maybe some people want to have legitimate discussions on valuations.

jumploops
4 replies
15h46m

The crypto bubble (circa 2017/2018 [0]) looks like an ant hill compared to the recent Gen AI-based gains!

$68/share in August 2018 would be $722/share today.

[0] https://www.fool.com/investing/2018/08/22/the-cryptocurrency...

dheera
1 replies
15h14m

Well, $1000 of crypto in early 2017 would be worth $50,000 now if you didn't panic sell in 2018, so...

HWR_14
0 replies
15h0m

They were talking about the effect of crypto mining on Nvidia's stock price.

mmaunder
0 replies
15h40m

It’s the guy collecting currency shells on the beach compared to a new Iron Age and the guy who has all the iron.

mizzack
0 replies
5h15m

There was also a 4:1 split in there. So, $68 -> $2888.

nostrebored
3 replies
15h48m

When you look at the moat for these companies, is it really justified?

I just think about ARM and Intel/AMD. I find it hard to believe that existing GPUs with their memory bandwidth considerations are optimal architecture and that in 20y we will have the same compute model

But the last time I did any work with cuda was a decade ago, so very rough ideas here

tomoyoirl
0 replies
14h57m

I’m not sure of all the details of the architecture, but my understanding is that there’s all sorts of “GPU Direct Storage” and massive on-chip CPU/GPU shared memory spaces on the new GH100s, plus all the Mellanox stuff for interconnects. At least in the datacenter space, anyway.

jillesvangurp
0 replies
14h46m

Exactly. Nvidia's current dominance is based on the notion that leading AI related libraries and tools depend on Cuda and run well on that. But that's not something that will last. Many parties are working on making these libraries work against alternative architectures and untangling them from Nvidia specifics.

Apple's laptops for example are pretty popular with people into LLMs. Why? They are good laptops with decent enough hardware and lots of people have them. The key point with that is that there's no nvidia hardware on Apple's platform and that a lot of work seems to be going into making sure Apple silicon is supported well in all relevant libraries and tools. And they are of course not the only ones. A lot of the bigger AI companies are using their own chips and platforms. Tesla, Google, Microsoft, Amazon, etc. And then there are Intel and AMD that are equally eager to grab more of the AI market.

Add to that the emerging open source ecosystem around LLMs and you have a nice recipe for vendor neutral libraries used by world+dog and users simply cherry picking the market for the best hardware that can run those libraries.

Nvidia has enjoyed a nice software moat for the last years but it won't last. Everyone is trying to engineer around them currently.

alecco
0 replies
9h27m

They are way ahead and gaining speed: https://news.ycombinator.com/item?id=39355625

And also the integration. Nvidia's datacenter clusters are ready as soon as you turn them on, unlike every other vendor's so far, which require months of very difficult fine-tuning.

xnx
2 replies
15h37m

Nvidia is a great company with a great and highly desired product. Unfortunately, NVDA the stock seems to have entered "meme stock" territory.

lxgr
0 replies
15h8m

It's at a P/E of 95, which is high but not bubble levels. Doubling their earnings twice seems achievable with the AI hype and everything.

Meanwhile, Arm is at a P/E of 1650. That does seem completely unrealistic to me, also considering that they aren't even at the forefront of high-performance ARM designs themselves (that would be Apple, and they have a perpetual architectural license, I believe).

The second they squeeze on the licensing fees too hard, maybe due to pressure from shareholders to deliver on that valuation, they'll catalyze the creation of a RISC-V based opponent.

dheera
0 replies
15h11m

At least their GPUs are free.

Buy NVDA -> sell NVDA -> buy GPU

I did the same with TSLA, got a free car

testless
2 replies
10h31m

Lofty valuation with a PE ratio of over 90 - even considering the growth and margins. Also almost no revenue growth in 2022/23 according to quickfs.net. Too expensive for my taste.

jsnell
1 replies
9h56m

You're being misled by the site. Those aren't really their 2023 results, but their FY2023 results. And Nvidia's fiscal years are just nuts, basically FY2023 is 2022.

Their 2023 results will be reported as FY2024, and aren't out yet. But just the first three quarters showed 50% more revenue than the entire FY2023.

testless
0 replies
9h10m

You are right, I was wrong. I compared financial years, not actual years. There seem to be strong improvements in revenue (66% TTM increase) and earnings per share (334% TTM increase).

smsm42
2 replies
10h38m

Could somebody explain to me what moat prevents somebody else from taking on Nvidia? OK, they make GPUs. Why can't somebody else make GPUs? AMD took on Intel; Google took on Microsoft and Apple - is the GPU market somehow harder to enter? I mean, we're talking piles of money from the looks of it - Amazon and Google do a ton of stuff, and Nvidia only does (roughly speaking) one thing - and that thing is not even directly consumed by the customers, so "it's hard to migrate" shouldn't be a huge deal, since it's not the customers that would be doing the migration. So what are the barriers that make Nvidia unique?

alecco
0 replies
10h15m

GPUs are very complex. Nvidia pioneers everything related to GPUs: a rich software stack, the most sophisticated Tensor Cores, and bleeding-edge features like 8-bit floating point (FP8) support. And they are working on FP4 next. This matters because halving the data size almost doubles the flops (see the Hopper specs [1]).

The compute is so powerful it creates bottlenecks in data loading. So they have SXM, Nvlink, and since Hopper smart data async load to Tensor Cores (TMA).

It's so advanced the ML software hasn't caught up yet. And it's not trivial to tile and schedule properly at these levels. (See FlashAttention [2])

I wouldn't be surprised if they stay on Hopper for a while and just crank up the bandwidth and bundle more GPUs together. They already released H100 NVL which is basically 2 H100s. And the H200 with faster High Bandwidth Memory (v3).

AMD and Intel are way behind and have nothing even remotely close in planning.

[1] https://resources.nvidia.com/en-us-tensor-core/nvidia-tensor...

[2] https://crfm.stanford.edu/2023/07/17/flash2.html
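
A back-of-the-envelope on why the precision race matters (illustrative figures, not Nvidia's):

    params = 70e9   # e.g. a 70B-parameter model (hypothetical)
    for bits, name in [(32, "FP32"), (16, "FP16"), (8, "FP8"), (4, "FP4")]:
        print(f"{name}: {params * bits / 8 / 1e9:.0f} GB of weights")
    # halving the precision halves the bytes moved per step, which is
    # roughly why it doubles achievable throughput on the same memory bus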

Moldoteck
0 replies
8h11m

Nvidia is unique because of CUDA and all the ML software written and optimized for it. AMD can add CUDA support too (and they have started working in this direction), but they won't be able to out-compete Nvidia that way; with this strategy they'll stay in 2nd place (by buying AMD cards for inference you are by default less competitive as a business doing ML compared to those who can buy Nvidia) and will just strengthen Nvidia's CUDA moat. IMO it's less about money and more about the mountain of software that was optimized for Nvidia cards, and the software built on top of that optimized software. It can't easily be ported to run as efficiently on AMD; it's tons of code, and it's unlikely devs would just port it all, plus all the new stuff being researched at an exponential pace.

jedberg
2 replies
16h5m

Considering how much money Amazon and Google (and Microsoft and Oracle and....) are giving NVIDIA, this makes sense. Amazon and Microsoft in particular seem to be staking a pretty big part of their business on things that need GPUs.

And even though Amazon makes GPUs, they are still buying NVIDIAs by the truckload.

grogenaut
1 replies
15h37m

Amazon buys whatever they can sell to customers. You want Windows? We got that. Oracle? Got it. AMD? Got it. Intel? Got that. Mac? Got that. Arm? Got that. Nvidia? Got it. Their own ARM chips? Got 'em. Their own training and inference chips? Got it.

They buy things by the cargo ship load.

CSSer
0 replies
13h36m

Alright, alright! We got it already.

Aeolun
2 replies
15h55m

Hmm, AMD is worth 55B, and somehow almost keeping up. I'd consider that a success. I do think it's a bit sad they have basically no representation in the AI market though. Whenever it's about AI it's Nvidia all the way.

tempsy
1 replies
15h51m

AMD's market cap is 278B

Not sure where you got 55B

Aeolun
0 replies
12h34m

Wikipedia?

Huh, I see what you mean. Either I’m reading the numbers wrong, or wiki just doesn’t have that info and I plucked a random one out of the list.

2OEH8eoCRo0
2 replies
16h2m

Insanity. How much of their success/future hinges on TSMC's continued success/future? Kudos to them but the risk makes me uneasy.

modeless
1 replies
15h43m

Not that much I'd say. They could switch to Samsung or Intel. As long as they can use the same fabs their competitors use they'll be fine.

coffeebeqn
0 replies
15h20m

The early 30 series cards during COVID were made in the Samsung fab, back in the days when no GPUs were available anywhere for months.

xyst
1 replies
10h38m

With the recent story of OpenAI pitching VCs for a collective $5-7 trillion, and a GPU manufacturer becoming more "valuable" than diverse companies such as Amazon or Google, I am thinking this market is due for a massive correction. It's not sustainable, and it's built on the "promise" (or con) that it will continue to skyrocket YoY.

Just like the cryptocurrency/digital currency craze, this one (artificial intelligence) will also come crashing down. I give it maybe 1-2 years at this rate. We will look back, and these types of stories will be the red flags people point to when the market is about to sink.

lm28469
0 replies
9h47m

It's literally the same story over and over again: buy the hype, buy the future hypothetical profits, disregard everything else, including logic. Pure greed.

Enron called it HFV: hypothetical future value.

vramana
1 replies
13h44m

For now there doesn’t seem to be anything on the horizon that threatens Nvidia dominance that might turn it into another Cisco.

Mistletoe
0 replies
11h56m

resters
1 replies
15h46m

It seems like a good time to buy put options!

__loam
0 replies
14h7m

The market can stay irrational longer than you can stay solvent.

brcmthrowaway
1 replies
16h5m

Looks like Mom and Pop have caught on to NVIDIA stock...

odysseus
0 replies
15h40m

Congresspeople too ...

zmmmmm
0 replies
10h58m

Our organization tried to buy an A100 last year. We were told there was a 12-month wait. Nvidia can literally just print money at this point for the next 5 years, I think.

seydor
0 replies
11h56m

Finally I can safely invest

rvz
0 replies
14h45m

Not a single mention of the massive geo-political risk that Nvidia and the wider semiconductor industry are in. This is not FUD; even Jensen knows it. [0]

Apple is doing a smart move in de-risking that possibility so far [1]. But Nvidia has only become more dependent on TSMC.

The stock price looks like another great time to hit that sell button, as everyone from your Uber driver to your plumber is chatting about Nvidia's all-time-high valuation, stock price, and revenue, with no competition (yet).

When a single company is carrying the stock market, it just takes one miss and the market gets upset.

[0] https://www.ft.com/content/ffbb39a8-2eb5-4239-a70e-2e73b9d15... or https://archive.is/MAZYk

[1] https://www.bloomberg.com/news/articles/2023-04-05/inside-ap...

refulgentis
0 replies
15h58m

I got a vision x LLM model running locally today, in addition to the voice recognition, voice activity detector, and embeddings already done. It's astonishingly good and can run on phones released in the last 2 years. Told my friend it's finally time to buy NVDA puts; they'll pay (someday).

ptelomere
0 replies
13h43m

When there's a gold rush, sell the best shovels you can make.

Pretty much the business model of Steam, Amazon, Nvidia, and Nintendo, all to some degree; we can go on and on.

ksec
0 replies
4h44m

1. Microsoft: $3.115 trillion

2. Apple: $2.904 trillion

3. Saudi Aramco: $2.034 trillion

4. NVIDIA: $1.831 trillion

5. Alphabet: $1.820 trillion

6. Amazon: $1.803 trillion

7. Meta: $1.217 trillion

jauntywundrkind
0 replies
15h22m

Nvidia's NVLink is the thing I'm actually most worried about.

Their chips are amazing and their software moat is impressive, but I see a clear response and hungry competition from many on these fronts. There's still a sizable lead, but I believe in the competition here.

Nvidia buying Mellanox, though, means the entire rest of the planet is stuck on PCIe PHY (which includes CXL) or Ethernet. There's no other major interconnect presence left: Nvidia owns both of the interesting alternatives at rack scale and beyond, NVLink and InfiniBand. I don't think there's an insurmountable challenge here, but no one seems to be gearing up to compete, and there's no forum for that cooperation to happen in. NVLink is fast and, even more important, very energy efficient per bit. As we go from many-core to many-chip, this is going to be a more and more distinguishing capability.

gmerc
0 replies
12h27m

Makes sense. Take AI out of the stock narrative of most tech companies and their valuations would halve.

fortran77
0 replies
14h44m

It's nice to see a company that actually _makes something_ be worth more than companies who just act as middlemen to extract a few dollars from transactions.

dagmx
0 replies
15h11m

At the time of posting this, NVIDIA has dropped below both companies.

I expect them to bounce back, but the headline as shared was not true at the time the article was submitted

__loam
0 replies
14h52m

Totally not a bubble guys.