I think the point of an open CUDA is to run it on non-NVIDIA GPUs. Once you have to buy NVIDIA GPUs, what's the point? If we had true competition, I think it would be far easier to buy devices with more VRAM, and thus we might be able to run Llama 405B locally someday.
Once you've already bought the NVIDIA cards, what's the point?
Some people believe being able to build on fully-open software stacks has value in and of itself. (I happen to be one of those people.)
Another benefit could be support for platforms that Nvidia doesn't care to release CUDA SDKs for.
Hear, hear. Yes, practically speaking, if you need to run a workload on a closed-source system, or if that's your only option to get the performance, then you have to do what you have to do. But in the long run open source wins, because once an open-source alternative exists it is simply the better option.
As a bonus, with open-source platforms you are much less subject to the whims of company licensing. If tomorrow Nvidia decided to change their licensing strategy and pricing, how many here would be affected by it? OSS doesn't do that. And even if the project goes in a random direction you don't like, someone will likely fork it to keep going in the right direction (see pfSense/OPNsense).
This is just wishful thinking. In anything close to real professional use outside of IT, closed source is king: office work, CAD, video editing, and music production immediately come to mind. In none of those domains can open source seriously challenge commercial, closed-source competitors.
Yes, in any of those domains one can name open-source products, but they are far from "winning" or being "the better option".
I think open source does tend to win; it just does it slowly - often when the big commercial name screws up or changes ownership.
For example, I think Adobe is in the middle of this swing now, Blender is eating market share, and Krita is pretty incredible.
Unity is also struggling (I've seen a LOT of folks moving to Godot, or going back to Unreal [which is not open, but is source-available - because having access matters]).
CAD hasn't quite tipped yet - but FreeCAD is getting better constantly. I used to default to Fusion 360 and SolidWorks, but I haven't had to break those out for personal use in the last 5 years or so (CNC/3D printing needs). It's not ready for professional use yet, but it now feels like how Blender felt in 2010 - usable, if not quite up to par.
Office work... is a tough one - to date, Excel still remains king for the folks who actually need Excel. Everything else has moved to free (although not necessarily open-source) editors. None of my employers have provided Word/PowerPoint for more than a decade now - and I haven't missed them.
I would argue that PDFs have gone the open-source route though, and that used to be a big name in office work (again - Adobe screwed up).
I don't really do any music production or video editing, so I can't really comment, other than to say that ffmpeg is eating the world under the hood of commercial solutions, and it is solidly open. And on the streaming side of "video", OBS Studio is basically the only real player I'm aware of.
So... I don't really think it's wishful thinking. I think open source is genuinely better most of the time; it just plays the long, slow game of getting there.
I'm really sceptical that anything will happen in the CAD space barring massive state investment in open-source infrastructure. Open CASCADE doesn't look to be catching up [1], while SolidWorks continues to release foundational features like G3 continuity constraints, so the capability gap is going to widen over time.
I'd be glad to be proven wrong.
[1]: https://git.dev.opencascade.org/gitweb/?p=occt.git
Naw, just try to find a decent PDF editor. You will have a hard time. PDF display is fairly open, but PDF editing is not. PDFs are the dominant format for exchange of signed documents, still a big name in office work, and Adobe still controls the PDF editing app market.
I would love it if open source were winning in the imaging, audio, or DCC markets, but it's just not even close yet. Blender hasn't touched pro market share; it's just being used by lots and lots of hobbyists because it's free to play with. I just did a survey of the film & VFX studios at SIGGRAPH, and they aren't even looking in Blender's direction yet; they are good with Houdini, Maya, etc. Some of this has to do with fears and lack of understanding of open-source licensing - studios are afraid of the legalities, and Ton has talked about needing to help educate them. Some new & small shops use Blender, but new & small shops come and go all the time; the business is extremely tough.
Office work is moving to Microsoft alternatives like Google's office products. That is not open source, not source-available, and for most medium-to-large companies it's not free either (though being "free" as in beer is irrelevant to your point). The company just pays behind the scenes and most employees don't know it, or it's driven by ad & analytics revenue.
Unix utilities and Linux server software are places where open source has some big “wins”, but unfortunately when it comes to content creation software, it still is wishful thinking. It could change in the future, and I honestly hope it does, but it’s definitely not there yet.
Counterexample: Blender. It may not be winning in video editing, but it has serious market share in 3D rendering. Different players are investing money in it and extending it with their own stuff.
Agreed, Blender is a contender.
Have you ever heard the phrase "the exception that proves the rule"?
Fucking Unreal Engine 5 is open source, dawg!
Unreal Engine is source available. It is definitely not open source as you can't use it without a commercial license from Epic.
It’s commercial open source.
Anything else is moving the goalposts.
What about Linux itself? The plethora of open programming languages? Tools like OpenSSH?
Commercial, closed source products generally benefit from a monopoly within a specific problem domain or some kind of regulatory capture. I don’t think that means an open source alternative isn’t desirable or viable, just that competing in those contexts is much more difficult without some serious investment—be it political, monetary, or through many volunteered hours of work.
Another comment mentioned Blender, which is a great example of a viable competitor in a specific problem domain. There are others if you look at things like PCB circuit design, audio production/editing, and a surprising number of fantastic computer emulators.
I specifically said not IT-related, so that rules out "Linux, programming languages, OpenSSH … fantastic computer emulators".
In general, you confirmed my point by saying that competing in those domains is much more difficult, and that open source isn't a key to winning.
It is completely ok to use commercial software in a commercial environment. It isn't and shouldn't be the goal of open source to provide the best consumer product.
In the grand scheme of things I believe open source at least provides serious competition and that commercial software has its own work to do.
Also, a lot if not all professional work uses open-source components. Research is a field where it shines, and there it matters a lot.
Adobe has to work for its money as well, as its competitors get more powerful by the day. And everyone hates their Creative Cloud.
The Nvidia software stack has the "no use in datacenters" clause. Is this a workaround for that?
Specifically the clause is that you cannot use their consumer cards (e.g. RTX 4090) in datacenters.
That's why we run all of our ML workloads in a distributed GPU cluster located in every employee's house
The bonus is free heating for every employee's household!
Free cooling also. You cannot really run a big GPU without external cooling. I needed rather big 15 cm insulated cooling ducts to get the heat out of the building.
You joke, but I've thought about doing this.
The employees can also store their desktops in specially cooled, centrally located lockers at work if they want. And as a perk, we'll buy and administer these computers for them.
Use the open kernel driver, which is MIT/GPL-licensed and thus cannot impose usage restrictions.
It's worth noting that "NVIDIA software stack" is an imprecise term. The driver is the part that has the datacenter usage term, and the open kernel driver bypasses that. The CUDA stack itself does not have the datacenter clause; the only caveat is that you can't run it on third-party hardware. So ZLUDA/GpuOcelot is still verboten if you are using the CUDA libraries.
https://docs.nvidia.com/cuda/eula/index.html
Some of us are running Llama 405B locally already. All my GPUs are ancient Nvidia GPUs. IMO, the point of an open CUDA is to force Nvidia to stop squeezing us. You get more performance per buck from AMD. If I could run CUDA on AMD, I would have bought new AMD GPUs instead. Have enough people do that, and Nvidia might take note and stop squeezing us for cash.
> the point of an open CUDA is to force Nvidia to stop squeezing us
Nobody is forcing you to buy GPUs.
Your logic is flawed in the sense that enough people could also simply write alternatives to Torch, which, by the way, is already open source.
Nobody is forcing you to buy a computer.
Nobody is forcing you to live under a roof.
Nobody is forcing you to eat.
Sorry for the harsh comment.
I just found it highly unlikely that Nvidia would change its ways due to this, and I don't really see how we're being "squeezed". Nvidia are delivering amazing products (as are AMD), and it is not going to be any cheaper this way.
Building this kind of hardware is not something a hacker can do over the weekend.
The squeeze is mostly in the segmentation of VRAM between products. VRAM is basically a commodity, and this week the spot price for 8GB of GDDR6 has varied from $1.30 to $3.50 [1].
Yet to get a card with 8GB more VRAM than one with comparable compute performance, you'd be looking at hundreds (or thousands, in the case of "machine learning" cards) of dollars; a rough back-of-the-envelope on that markup is sketched below.
[1] https://www.dramexchange.com/
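A minimal sketch of that back-of-the-envelope: the spot range comes from the DRAMeXchange figures above, while the $200 retail premium for the higher-VRAM variant is an illustrative assumption rather than a quote for any particular SKU.

    # Spot range from the DRAMeXchange link above; the $200 card premium is an
    # illustrative assumption, not a price for any particular SKU.
    spot_low, spot_high = 1.30, 3.50   # USD per 8 GB of GDDR6 at spot this week
    card_premium = 200.0               # assumed USD premium for the +8 GB variant

    print(f"implied markup: {card_premium / spot_high:.0f}x to {card_premium / spot_low:.0f}x")
    # -> implied markup: 57x to 154x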
Nvidia is charging what they are entirely because there is little to no competition. There isn't much else to it; if AMD/Intel caught up, Nvidia would suddenly start selling their GPUs for way less...
What are you using, P100s or something?
Step 1: Run on NVIDIA GPUs until it works just as well as real CUDA.
Step 2: Port to other GPUs.
At least I assume that is the plan.
Why not do that first? Because the existing closed-source CUDA already runs well on Nvidia chips. Replicating it with an open stack, while ideologically useful, is going to sap resources away from porting it to other GPUs (where the real value can be had - by breaking the Nvidia monopoly on AI chips).
I'm not involved with the project, but I'd assume it's helpful as a reference implementation. If there's a bug on a non-Nvidia GPU, the same test case can be run on Nvidia GPUs to see whether the bug is in common code or specific to that GPU vendor.
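In practice that kind of differential check can be as simple as the sketch below. It uses PyTorch (mentioned elsewhere in the thread) purely as a stand-in harness; the workload, sizes, and tolerances are arbitrary, and the real project may test at a much lower level.

    # Run the same test case on two backends and compare the results, to tell a
    # bug in shared code apart from one specific to a single vendor's stack.
    import torch

    def run_case(device: str) -> torch.Tensor:
        # Generate the inputs on the CPU so every backend sees identical data,
        # then move them to the device under test.
        torch.manual_seed(0)
        a = torch.randn(512, 512).to(device)
        b = torch.randn(512, 512).to(device)
        return (a @ b).cpu()

    reference = run_case("cpu")            # trusted reference result
    if torch.cuda.is_available():
        candidate = run_case("cuda")       # swap in the backend under test here
        print("backends agree:",
              torch.allclose(reference, candidate, rtol=1e-4, atol=1e-3))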
The point might not necessarily be for consumers
Linus wasn't writing Linux for consumers (arguably the Linux kernel team still isn't); he needed a Unix-like kernel on a platform that didn't have one.
Nvidia is placed with CUDA in a similar way to how Bell was placed with Unix in the late 1980s. I'm not sure whether a legal "CUDA Wars" is possible in the way the Unix Wars were, but something needs to give.
Nvidia has a monopoly, and many organisations and projects will come about to rectify it; I think this is one example.
The most interesting thing to see moving forward is where the most just place is to draw the line for Nvidia. They deserve remuneration for CUDA, but the question is how much? The axe of the Leviathan (the US government) is slowly swinging towards them, and I expect Nvidia to pre-emptively open up CUDA just enough to keep it (and most of us) happy.
After a certain point for a technology so low in the “stack” of the global economy, more powerful actors than Nvidia will have to step in and clear the IP bottleneck
Tech giants are powerful and influence people more than the government, but I think people forget how powerful the government can be when push comes to shove over such an important piece of technology
—————
PS: my comparison of CUDA to Unix isn't perfect, mostly because Nvidia has a hardware monopoly as it stands; but since they don't fab the chips themselves, it's just design information at the end of the day. There's nothing physically preventing other companies from producing CUDA hardware, just obvious legal and business obstacles.
Perhaps a better comparison would be Texas Instruments trying to monopolise integrated circuits (they never tried). But if Fairchild Semiconductor hadn't independently discovered ICs, we might have seen a much slower logistic curve than we have had with Moore's law (assuming competition is proportional to innovation).
Beyond how they've "opened" their drivers by moving all the proprietary code on-GPU, I don't expect this to happen at all. Nvidia has no incentive to give away their IP, and the antitrust cases that people are trying to build against them border on nonsense. Nvidia monopolizes CUDA like Amazon monopolizes AWS; their "abuse" is the specialization they offer to paying customers... which harms the market how?
What really makes me lament the future is the fact that we had a chance to kill CUDA. Khronos wanted OpenCL to be a serious competitor, and if it weren't specifically for the fact that Apple and AMD stopped funding it, we might have a cross-platform GPU compute layer that outperforms CUDA. Today's Nvidia dominance is a result of the rest of the industry neglecting their own GPGPU demand.
Nvidia only "wins" because their adversaries would rather fight each other than work together to beat a common competitor. It's an expensive lesson for the industry: adopt open standards when people ask you to, or suffer the consequences of having nothing competitive.
I think the point of Linux is to run it on non-Intel CPUs. Once you have to buy Intel CPUs, what's the point?
You have it exactly backwards. The original goal of Linux was to create a Unix-like operating system on Linus Torvalds' own Intel 80386 PC. Once the original Linux had been created, it was then ported to other CPUs. The joy of a portable operating system is that you can run it on any CPU, including Intel CPUs.
I guess this framework was made by AMD engineers.
Anyway, I wonder why AMD never challenged Nvidia in that market... It smells a bit like AMD and Nvidia secretly agreed not to compete against each other.
OpenCL exists but is abandoned.
The closed platform is not without its pitfalls.
Good luck getting a multi-user GPU setup going, for example.
It super sucks when the hardware is capable, but licensing doesn't "allow" it.
Yeah like running Linux on a MacBook…
CUDA is ubiquitous in science, and an open-source alternative to the CUDA runtime is useful, even if its use is limited to verifying expected behavior.