
HDMI Forum rejects AMD's HDMI 2.1 open-source driver

broodbucket
84 replies
13h44m

On the one hand, this move ensures the quality and consistency of HDMI experience.

Does it though? Does it really?

I don't understand this move from HDMI Forum. They're handing a win to DisplayPort.

kelnos
60 replies
13h11m

Does it though? Does it really?

Of course not. It's just protectionism and rent-seeking.

I don't understand this move from HDMI Forum. They're handing a win to DisplayPort.

I don't think so, at least at this point. Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

Sure, that situation may change, and the HDMI Forum may walk back these requirements.

At any rate, for some reason DisplayPort has just not caught on all that much. You very rarely see them on TVs, and a good number of mid-/lower-end monitors don't have them either.

It's bizarre, really.

Rinzler89
31 replies
12h50m

>Of course not. It's just protectionism and rent-seeking.

Don't know why you're being downvoted, but it's true. Especially when you see that the HDMI standard was developed by a cartel of TV manufacturers and major movie studios[1] when DVI and DisplayPort already existed, but those didn't generate royalties or have DRM.

Despicable standard. There wasn't even a standards "war" like VHS vs Betamax, SD vs Memory Stick, or USB vs FireWire where you could say that HDMI won over DisplayPort; it was simply shoved down consumers' throats, since every TV, media player and games console shipped with that port alone, as they were manufactured by the same cartel that developed the HDMI standard.

So much for the so-called "free market".

[1] https://en.wikipedia.org/wiki/HDMI#History

trilbyglens
19 replies
12h4m

Works exactly as a free market is designed to. The strong and coercive win. That's what market dynamics are really about. Monopolies form easily and naturally unless regulation stops them.

drchaos
7 replies
11h13m

One could argue that at least this specific tactic would not be possible without the state granting a monopoly on "intellectual property". Without that, nothing would hinder AMD from just shipping their already existing implementation.

Certhas
5 replies
10h50m

The irreducible state role in a free market is to enforce property rights.

Almost all free market fans I have seen think that this should extend to some notion of intellectual property.

Y_Y
4 replies
10h0m

I think the standard answer to your point is that you can recognise "intellectual property" without granting a (limited) monopoly. There are plenty of proposals floating around for copyright and patent reform that curtail or replace the ability of the creator/owner to unilaterally set the price and decide who can license the material and how they can use it.

Certhas
1 replies
7h37m

Mostly I think the person who said that this is free markets working as free markets is largely right. You can't defend free markets by arguing that property rights are enforced by the state and that this somehow changes the free market outcome.

I also think criticizing intellectual property on the grounds of granting a monopoly is muddling the language. If I write a novel I have exclusive rights to the novel. But I am not the only supplier of mediocre novels. I don't have a monopoly in a relevant market.

None of this contradicts the point that IP and patent rights are in desperate need of reform, or that they can play a central role in abusing a monopoly position (e.g. https://en.wikipedia.org/wiki/Orange-Book-Standard).

Edit: Maybe my post was unclear: I would agree that IP should be abolished. But this is not a position I have seen classical market liberals and other free market advocates take. Instead, they tend to favor strengthening all forms of property rights. If I am wrong on this point, I'd be happy to read some examples.

Y_Y
0 replies
7h20m

I don't think we disagree, I would just like to add that this subtlety about "monopoly" depends on the (subjective) existence of substitute goods. Maybe as a consumer I just want any old book to read and so an individual author has no market control. On the other hand you can imagine, say, a technology that's practically or actually unavoidable as an input for a particular business (suppose HDMI had no viable alternative), then the IP holder could extract super-normal profit and make the economy less efficient.

AstralStorm
1 replies
8h18m

Thing is, the HDMI Forum is not a monopoly. It's a literal cartel of a few corporations and other cartels. Other cartels pushing for it include the MPAA.

hulitu
0 replies
27m

One could argue that at least this specific tactic would not be possible without the state granting a monopoly on "intellectual property".

Microsoft? RIAA? MPAA? Google (AI, books)?

obirunda
5 replies
10h50m

Monopolies form easily? That's funny, you should try and start one, seems quite profitable.

Seriously though, this is an oft repeated fallacy, and frankly irrelevant to the discussion.

IP laws are the actual culprit in facilitating the apparatus of the state for the creation of monopolies. Most people seem to embrace this double-think that IP laws are good while monopolies are bad. You simply don't get monopolies without IP laws. IP laws are the ultimate king maker and exclusively exist to perpetuate profits of the IP owner.

If your proposition of regulation is to disband the patent offices and repeal the copyright act, my sincere apologies.

Rodeoclash
2 replies
10h2m

If only the free market was even more free, all our problems would be solved!

atoav
1 replies
9h29m

The invisible hand of the free market will come and fix all the things! /s

If you talk to people who still subscribe to that notion, it quickly becomes clear that they value their minuscule chance to win the capitalist lottery more than the wellbeing of the many; the idea that markets balance everything to the advantage of everybody then seems to be just an excuse to be egoistic and to care nothing for others.

Don't get me wrong, nobody has to care for others and I am not going to be the person to force you, but if you don't care about others please stop pretending you are doing it for the greater good.

obirunda
0 replies
1h2m

You're conflating several schools of thought. Utilitarianism, which appears to be your basis for defining ethical behavior, underlies this reasoning behind compulsory government action.

This line of thinking is often repeated in election cycles and mindless online discussions, with mantras like "We justify doing something heinous because it serves 'American Interests'" or "We'll coercively tax one group and redistribute funds to another because they'll do something dubiously for the 'greater good'".

However, Utilitarianism is not a foundational principle of libertarian ideology. In fact, libertarianism often refutes and rejects it as applied to governments. It doesn't prioritize egalitarianism or rely on public opinion when defining citizens' rights.

The argument for a free market unencumbered by protectionist policies isn't about the greater good; rather, it's an argument for an ethical government grounded in first principles.

The "greater good" argument tends to crumble under close examination; its claims to reason collapse as soon as you scrutinize them.

Notably, Utilitarianism has been the basis for nearly all modern-day dictatorships, which rely on a monopoly of violence to enforce the "greater good".

It's possible to support free markets while still caring for others – this is called altruism. It's similar to utilitarianism but without coercion and fallacies.

mindslight
0 replies
6h43m

Without imaginary property, AMD would have signed a similar contract - they would rather focus on their own products than reverse engineer the HDMI standards to create their own implementation. At which point AMD would be in the same position, unable to reverse engineer HDMI or adopt solutions from other companies that did.

Imaginary property laws most certainly encourage and facilitate monopolies and collusion, but they are not necessary to the dynamic. Such laws are essentially just the norms of business that companies would be insisting on from other businesses anyway, for which it's much more lucrative to assent and go along with rather than attempt to defect and go against them.

Another example of this effect is the DMCA - the tech giants aren't merely following its process verbatim, but rather have used it as the basis for their own takedown processes with electively expanded scope - e.g. why we see takedown notices pertaining to "circumvention" code, or the complete unaccountability of Content ID. Google and Microsoft aren't significantly hurting themselves by extralegally shutting down a tiny contingent of their customers, meanwhile the goodwill they garner from other corporations (and the possible legal expenses they save) is immense. The loser is of course individual freedom.

atoav
0 replies
9h56m

Getting rich is easy. You just need rich parents.

Two things can be true at the same time.

The truth is, if you are in a position to take the step towards becoming a monopolist, especially in a new market, it is not impossible to do so (and by the rules it should be).

Getting to that position isn't easy though.

But from a consumer standpoint the only thing that matters is whether you have monopolists or not; we don't care how hard it was for them to become one, other than that it might change the number of monopolists that force their crap down our throats.

account42
4 replies
9h23m

Actually these monopolies are enforced by the state via IP laws. Without IP laws, any upstart could reverse engineer the protocols and provide an implementation with fewer limitations. But of course free market enthusiasts like to ignore that part and only rant against the government when it protects consumers from companies.

rini17
2 replies
8h59m

Yes, if there were no IP anyone could cheaply make a single-digit-nanometer-node custom ASIC to provide the alternative 4K-capable video hardware implementation. /s

ozfive
0 replies
8h11m

Single-digit-nanometer-node custom ASICs aren't really required to achieve this. Although there is higher latency, this can be and has been done on FPGAs at a company I worked for, which designed and built custom AVOD systems for private jets and helicopters.

miki123211
0 replies
6h44m

Anyone? No, probably not. Some enterprising company in Shenzhen, who would sell the thing for $.25 a piece due to fierce competition driving prices down to cost of materials? Now that's more likely.

roenxi
0 replies
8h40m

There are a huge number of free market types who are against IP laws, and they're a big part of computing culture. Names like the FSF [0] spring to mind. A market can't realize a fraction of its potential if people are banned from competing because someone else got there first. In fact, the only reason the software world did so well is that the FSF managed that inspired hack of the copyright system known as the GPL, which freed up the market.

[0] https://www.fsf.org/

somat
10 replies
11h27m

To be fair (and note that I think of the HDMI Forum as the bad guys):

HDMI was not an alternative to DisplayPort; DisplayPort did not exist yet. It was an alternative to DVI. Really, HDMI is DVI with a sound channel and DRM. And as much as I dislike the HDMI Forum, I can see the benefit here.

As to HDMI vs DisplayPort... I have no idea why you don't see more DisplayPort. VESA has a proven track record as the nicer standards body, and DisplayPort is a better system. Probably just inertia at this point.

godzillabrennus
4 replies
10h33m

I think interface change fatigue is real. DisplayPort has been around, but there wasn't a compelling reason to use it when displays had HDMI ports.

People are also looking to USB-C as the next iteration in display connectivity because it does "all of the things" from a consumer's perspective.

Dalewyn
2 replies
10h20m

Most people will use the path of least resistance.

Many people/organizations still use VGA and ketchup-mustard-onion cables to this day if they still do the job, let alone HDMI.

account42
1 replies
9h34m

Shouldn't it be mayo if you are going with a condiment theme?

Dalewyn
0 replies
5h21m

Very fair point and duly noted!

seba_dos1
0 replies
7h33m

At least video over USB-C is DisplayPort, so there's hope.

atoav
3 replies
10h1m

As a media tech guy (running the media tech department of a university, which includes a DCI-conformant cinema): absolutely everybody hates HDMI. It is unreliable as hell, both physically and as a protocol. It tries to be too much to too many people, and most devices, including expensive "pro" gear, have unchangeable random weirdness like ignoring EDIDs or forcing them onto you, which is documented nowhere; you only find these things out after you buy the device.

Add to that the fact that consumers/users can break the picture/sound in 100 different ways on their devices and you get a veritable support nightmare.

I wish it was just DVI+ but it does so much more.

Rinzler89
1 replies
9h41m

Isn't this why VGA is still widely used everywhere? It always just works no matter what, even when the connector or pins are damaged, since there's no digital handshake or error correction, just a basic analog pipeline.

atoav
0 replies
7h44m

I don't know, at least here (Europe) VGA has pretty much died out in all but legacy applications. The true pro format would be SDI using BNC connectors.

But I guess HDMI is going to be replaced by USB-C in the long run. Especially since the "everything-connector" also doing Video makes more sense than the video-connector also doing everything.

paulmd
0 replies
2h25m

unchangeable random weirdness like ignoring EDIDs or forcing them onto you, which is documented nowhere; you only find these things out after you buy the device.

FWIW: https://www.store.level1techs.com/products/p/5megt2xqmlryafj...

Sadly this is not entirely an HDMI-specific problem either; he has a DisplayPort feeder too. Also, DisplayPort had many problems with disconnects/sleep states for many years, especially surrounding EUP Compliance/EUP Deep Sleep mode. I wouldn't say DisplayPort monitors were relatively bulletproof until the G-Sync Compatible generation finally rolled around in 2019-2020.

cardiffspaceman
0 replies
1h17m

I used a plasma panel, vintage 2004 (retired in 2016 with no noticeable burn-in), that had a DVI connector with HDCP support. If it had not supported HDCP, I could not have connected my cable box to this panel.

arghwhat
9 replies
9h39m

At any rate, for some reason DisplayPort has just not caught on all that much.

DisplayPort won everything except becoming the physical connector for home cinema. Heck, even within those HDMI-exposing devices, DP won.

The vast majority of display drivers speak eDP. Few things actually implement HDMI, and instead rely on DisplayPort-to-HDMI converters - that's true whether you're looking at a Nintendo Switch or your laptop. Heck, there is no support for HDMI over USB-C - every USB-C to HDMI cable/adapter embeds an HDMI converter chip, as HDMI altmode was abandoned early on.

The only devices I know of with "native" HDMI are the specialized TV and AV receiver SoCs. The rest is DP because no one cares about HDMI.

However, seeing that home cinema is pretty much purely an enthusiast thing these days (the casual user won't plug anything into their smart TV), I wonder if there's a chance of salvation here. The only real thing holding DisplayPort back is eARC and some minor CEC features for AV receiver/soundbar use. Introducing some dedicated audio port would not only be a huge upgrade (some successor to TOSLINK with more bandwidth and remote control support), but would also remove the pressure to use HDMI.

With that out of the way, the strongest market force there is - profitability - would automatically drive DisplayPort adoption in home cinema, as manufacturers could save not only converter chips, but HDMI royalties too.

dathinab
3 replies
6h34m

The only real thing holding DisplayPort back is eARC and some minor CEC features for AV receiver/soundbar use. Introducing some dedicated audio port would not only be a huge upgrade (some successor to TOSLINK with more bandwidth and remote control support), but would also remove the pressure to use HDMI.

USB-C

I mean, think about it:

USB-C/DP Alternate Mode is good enough as an upstream for most use cases (including consoles), has some additional future feature potential, and still has some USB bandwidth left over for various things, including CEC.

For eARC-like use cases (i.e. sometimes audio+video upstream, sometimes audio downstream) you have a few choices (one needs to be standardized):

- always create a DP alt mode channel upstream and use audio over USB for downstream; technically that can already work today, but getting audio latency synchronization and similar right might require some more work

- switch the DP alt mode connection direction, or have some audio-only alt mode, which either requires an extension of the DP alt mode standard or a reconnect. But I think the first solution is just fine.

As an added benefit, things like sharing input devices become easier, and things like Roku TV sticks can save on some royalties... which is part of where the issue is: there is a huge overlap between big TV makers and HDMI stakeholders. I mean, have you ever wondered why most TVs don't even have a single DP port, even though that would be trivial to add?

Which is also why I think there is no eARC-like standard for USB-C/DP alt mode: it only matters for TVs, and TVs don't have DP support.

Honestly, I believe the only reason TVs haven't (very slowly) started to migrate to USB-C/DP alt mode is that most of their producers make money with HDMI.

And lastly, there is some trend to PCIe-everything in both consumer and server hardware. In the consumer segment it had been somewhat limited to the "luxury" segment, i.e. Thunderbolt, but with USB4 it is slowly ending up in more and more places. So who knows, PCIe-based video might just replace both of them (and go over USB-C).

arghwhat
2 replies
5h22m

And lastly, there is some trend to PCIe-everything in both consumer and server hardware. In the consumer segment it had been somewhat limited to the "luxury" segment, i.e. Thunderbolt, but with USB4 it is slowly ending up in more and more places. So who knows, PCIe-based video might just replace both of them (and go over USB-C).

Thunderbolt/USB4 is not PCIe. It's a transport layer that can run multiple applications at once, sharing bandwidth based on use. This is opposed to USB-C Alternate Mode, where pins are physically reassigned to a specific application, which uses the pins regardless of whether it needs the bandwidth.

PCIe is then one of the supported applications running on top of the transport.

dathinab
1 replies
4h53m

I know, but that isn't relevant to the argument; if anything it's in favor of some future protocol replacing HDMI/DP/USB-C+DP alt mode while using the USB-C connector.

arghwhat
0 replies
4h2m

I was just pointing out specifically that there is no such thing as PCIe-based video - nor is there any need for that.

Support for USB4/Thunderbolt DP will proliferate, but there is still benefit to a DP altmode as it's free to implement (the host controller just wires its existing DP input lanes directly to the USB-C connector) and allows for super cheap passive adapters.

If USB-C ends up becoming the standard video connector as well, it will most likely be DP altmode as you then only need a cheap USB-C controller to negotiate the mode.

There isn't really any pressure to invent a new protocol. https://xkcd.com/927/

superhuzza
2 replies
7h53m

home cinema is pretty much purely an enthusiast thing these days (the casual user won't plug anything into their smart TV)

Except a gaming console, a laptop, a Roku, an Apple TV...

Every single person I know has some external media source plugged into their TV, even my tech illiterate mother.

arghwhat
1 replies
7h19m

You’d be surprised by the number of users who are satisfied with the built-in media experience.

I’d say it’s most likely a large majority. Google TV is common, but people with an Android-powered TV are not the main target for external boxes until the TV gets old and out of date. Apple users on Samsung TVs might also get far with the built-in AirPlay support.

Heck, even within enthusiasts there is a strong push to use the built-in media features as it often handles content better (avoiding mode changes, better frame pacing). Even I only use an external box after being forced due to issues when relying on eARC.

Very few people plug their laptop into a TV, and laptops don't normally do native HDMI. Some laptops have a dedicated port with a built-in converter, but all modern laptops have USB-C, which only exposes DisplayPort.

treis
0 replies
6h23m

I'm in this crowd. The TV apps work well enough and it's one less remote. The only thing I use the attached Chromecast for is to (rarely) mirror my phone screen.

pjc50
1 replies
9h32m

Introducing some dedicated audio port would not only be a huge upgrade

I'm not sure about that - suddenly there's a cost in board space and BOM, and they're not automatically linked together. Or do you just mean for audio output from TV to soundbar? I feel like USB would suffice for that if anyone could be bothered. Personally I use regular TOSLINK to a stereo amplifier and accept having another remote.

arghwhat
0 replies
7h12m

Heh, good point, USB 2.0 would absolutely suffice. You'd hardly need more than a standard audio profile either. Some TVs even support this already - recent Samsung models at least.

A specialized port could theoretically have a lower BOM cost through simpler silicon or port design, but USB 2.0 is free at this point so why bother.

Personally I use regular TOSLINK to a stereo amplifier and accept having another remote.

The problem with TOSLINK is not only the remote scenario (which I do think is absolutely a necessary feature for any kind of adoption), but also lack of bandwidth for uncompressed surround sound.

Large surround setups at home are uncommon these days, but soundbars with virtual surround is common, and some of us still manage to squeeze in a simple 5.1 setup.

shmerl
5 replies
13h1m

Maybe it can change if USB4 sneaks in and supplants HDMI in those devices, since it can route both HDMI and DP.

ignaloidas
2 replies
6h44m

It cannot route HDMI, partly because HDMI is built upon antiquated principles and doesn't really fit alongside more modern protocol designs. USB4 would need to be entirely redesigned to tunnel native HDMI.

Having a DP to HDMI converter on one end though, that's easy.

Pet_Ant
1 replies
4h21m

HDMI is built upon antiquated principles

I'm interested in learning more; in what way are they antiquated?

adrian_b
0 replies
2h36m

HDMI uses a digitized form of the traditional TV signals. The format of the transmitted data still depends on the parameters that defined traditional TV signals, like video frame frequency, video line frequency, vertical and horizontal retrace intervals and so on. Such parameters are no longer essential for digital television, and there is no longer any need to constrain the transmission of video signals with them.

DisplayPort uses a typical communication protocol that can carry arbitrary data packets, not much different from the protocols used on USB or Ethernet.
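
To make that concrete, here is a minimal sketch (the numbers are just the well-known CEA-861 timing for 1080p60, nothing HDMI-2.1-specific): every HDMI mode is still defined by CRT-era blanking and sync intervals, and the pixel clock falls straight out of them.

    # Standard CEA-861 1080p60 timing, expressed as CRT-style raster parameters.
    active_h, front_porch_h, sync_h, back_porch_h = 1920, 88, 44, 148
    active_v, front_porch_v, sync_v, back_porch_v = 1080, 4, 5, 36
    refresh_hz = 60

    total_h = active_h + front_porch_h + sync_h + back_porch_h  # 2200 pixels per line
    total_v = active_v + front_porch_v + sync_v + back_porch_v  # 1125 lines per frame

    pixel_clock_mhz = total_h * total_v * refresh_hz / 1e6
    print(f"pixel clock: {pixel_clock_mhz:.1f} MHz")  # 148.5 MHz
    print(f"blanking overhead: {total_h * total_v / (active_h * active_v) - 1:.1%}")  # ~19%

That roughly 19% of the link spent on blanking is inherited directly from analog scan timing, which is exactly the kind of legacy constraint described above.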

account42
1 replies
9h19m

Is HDMI over USB even a thing that any real devices support? But yeah, demand for mobile phone support might force TV manufacturers to adopt DP over USB.

2muchcoffeeman
4 replies
12h40m

I have fewer DisplayPort devices now than 8 years ago.

KeplerBoy
3 replies
11h16m

Don't forget about USB-C. Video over USB-C is almost always DisplayPort in disguise.

squarefoot
2 replies
8h46m

What about latency? Is it on par, or at least in the same league, compared to a direct connection? Not an issue for most people, but gamers could disagree if it is too high.

seba_dos1
0 replies
7h31m

It's still a direct connection, so there's nothing to compare there.

duskwuff
0 replies
8h33m

Performance is identical. DisplayPort Alternate Mode (which is what most displays use) isn't transmitting video data over USB; it's agreeing to use some of the high-speed wire pairs in the cable to transmit DisplayPort data instead of USB.

matja
2 replies
11h38m

Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

Arguably true, but I think that is changing all the time: there is a push towards open-source drivers regardless of whether the average user knows or cares what that is, and resolutions and refresh rates keep increasing.

I was affected by the HDMI Forum's decision when I bought an off-the-shelf 4K 120Hz monitor which refused to work at that resolution/refresh rate on an HDMI cable.

I was not expecting an arbitrary decision affecting software to be the cause instead of a hardware problem, which took me a while to figure out.

Now I know that if I want to use my hardware to its full capacity, I need DisplayPort in the future.

Hendrikto
1 replies
8h51m

off-the-shelf 4K 120Hz monitor which refused to work at that resolution/refresh rate on an HDMI cable.

I run a 4K 144Hz monitor over HDMI. Are you sure you don't just need a better cable?

matja
0 replies
8h8m

My HDMI cables work at 4K 120Hz with the same monitor with an Nvidia card using closed-source drivers, but not with AMD's open-source drivers, because of the issue in the article.

mobiuscog
0 replies
9h56m

Linux gamers with modern TVs wanting VRR.

Maybe that's still a tiny amount, but it's likely the most common 'need'.

lexicality
0 replies
8h29m

At any rate, for some reason DisplayPort has just not caught on all that much. You very rarely see them on TVs, and a good number of mid-/lower-end monitors don't have them either.

I suspect all the nice features that make DisplayPort a better standard are harder to implement cheaply, e.g. chaining.

account42
0 replies
9h39m

Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them.

Maybe not most people, but a simple 4K TV that can do >60 FPS fits that criterion. Those aren't that rare anymore.

Cu3PO42
0 replies
12h24m

I don't think so, at least at this point. Most people don't have hardware that requires HDMI 2.1 in order to get full use out of them, and of those who do, not all of them use Linux and/or care about open source drivers.

I do, but this hardware doesn't have DisplayPort. I switched from Nvidia to AMD specifically for the open source Linux drivers, so I'm quite mad at the HDMI forum for this.

On the other hand, my next TV likely won't have DisplayPort either, because almost none of them do, so it is indeed questionable whether this is going to lose them any mindshare.

shiroiushi
12 replies
13h12m

They're handing a win to DisplayPort.

Is this useful, if all the relevant devices only have HDMI ports and not DP?

adrian_b
5 replies
12h57m

What I find very annoying is that a very large number of small computers and laptops have both DisplayPort and HDMI, but they have full-size HDMI connectors and only USB Type C DisplayPort.

Using Type C for DisplayPort instead of the good full-size DisplayPort connectors is less reliable (easy to disconnect accidentally) and it permits only shorter video cables.

More importantly, this blocks the Type C connector, which I need for other purposes, e.g. an external SSD. I do not want to carry a Type C dock, so I end up using HDMI, even if I do not need HDMI and I do not want HDMI, and even if in almost all cases the devices had enough free space for a full-size DisplayPort connector.

Even just replacing the HDMI connector with a DisplayPort connector (so that the devices would have only full-size and Type C DisplayPort) would always be a better solution, because there are a lot of cheap adapters from DisplayPort to HDMI, which do not need a separate power supply and can even be incorporated inside the video cable. The reverse adapters, from HDMI to DisplayPort, are much more expensive and much bulkier, so usually they are not acceptable.

shiroiushi
2 replies
12h51m

It seems the issue with this open-source driver is supporting some of the highest-performance modes of HDMI (like 4K @ 120Hz). Would that even work in a DisplayPort-to-HDMI adapter?

adrian_b
1 replies
12h41m

While there are no fundamental reasons for any video mode to not work, most of the DisplayPort to HDMI adapters that are currently on the market do not support the latest standard versions of DisplayPort and/or HDMI, so when a very high performance mode is desired, it might not work.

However, the main use of adapters is when you travel and you find in your temporary office an HDMI-only monitor, or when you must use a meeting room projector. Such monitors or projectors seldom support high performance video modes.

account42
0 replies
9h4m

A problem is also that many video cards that do support HDMI 2.1 only support DisplayPort 1.4, which has less bandwidth. This makes the situation with the open-source AMD drivers even more annoying, because even with an active adapter that supports all the required features (which most don't) you can't get the full HDMI 2.1 resolutions/refresh rates that way.

eqvinox
1 replies
10h51m

The reverse adapters, from HDMI to DisplayPort, are much more expensive and much bulkier, so usually they are not acceptable.

That's because those are active converters. DisplayPort, by contrast, has "DP++", which means the source port is electrically capable of transmitting either DP or HDMI signals; the graphics card can switch modes. The adapter is just a tiny IC that signals that switchover and wires the data lanes through. HDMI has no such thing; you need an active protocol converter IC to get DisplayPort.

(NB: there are also active DP→HDMI converters, they have a bit longer range than the passive ones. I had to use one of them for my home projector, it's on a 10m HDMI cable which only worked on a blue moon with a passive DP++ adapter. Funnily enough it doesn't work on my native HDMI port either, only the active converter gets it running reliably… might be a poor 10m cable ;D)

DP++ wasn't part of the original DP spec, but I don't believe any DP source hardware that doesn't support DP++ is being manufactured at this point.

adrian_b
0 replies
9h0m

The DisplayPort connector includes a supply voltage. While it is weaker than in USB, it is strong enough to provide power to an active DisplayPort to HDMI converter, which can have the appearance of a video cable that can connect a DisplayPort source to an HDMI sink.

On one of the HDMI pins there is a DC voltage, but it has other purposes and it is too weak to provide power for a video converter.

This is why an HDMI to DisplayPort converter always requires an additional external power supply.

eqvinox
2 replies
11h2m

FYI: USB Type-C ports are DP ports on most modern laptops. You just need the correct cable (or a display with Type-C connector.) I have one of these https://www.club-3d.com/en/quick-view/2470/

(actually works both directions, e.g. if you have a portable display with only Type-C connectors like this https://www.hp.com/us-en/shop/pdp/hp-e14-g4-portable-monitor — BUT it can't power the display, you need to use another connection on the display for that.)

There is no HDMI over Type-C (there was an attempt at it, but it died, probably for the better given the potential for even more Type-C confusion and interoperability issues).

globular-toast
1 replies
9h19m

The "relevant devices" is surely referring to the displays here. I would love to go DP for everything but the best I can seem to find is computer monitors with 1 DP input and usually 2 or more HDMI. For living room type displays you won't find DP at all.

eqvinox
0 replies
7h18m

For PC displays on geizhals.eu, out of 3327 products:

   157 (4.7%) have  2 DP inputs
  2446 (74%)  have  1 DP input
   724 (22%)  have no DP input
Including USB-C ports,

   949 (29%)  have ≥2 DP inputs
  1806 (54%)  have  1 DP input
   572 (17%)  have no DP input
Compare HDMI:

  1350 (41%)  have ≥2 HDMI inputs
  1791 (54%)  have  1 HDMI input
   186 (5.6%) have no HDMI input
I agree it could be better but I don't think it's as bad as you make it out to be. Looking through the devices that have no DP input at all, 488 of the 572 have VGA inputs, which I'd say indicates an older generation of devices.

"Consumer" electronics (i.e. TVs) is a problem though, I'll agree.

agilob
2 replies
12h55m

Well, yes. The only relevant device with HDMI that I have is a raspberry pi.

Fnoord
1 replies
9h1m

You don't own any monitor or TV?

agilob
0 replies
8h36m

Every of them runs on DP

sitkack
8 replies
11h40m

They're handing a win to DisplayPort.

And that would be bad how? DP is an excellent standard and royalty free.

HeckFeck
3 replies
10h23m

It's like what FireWire was to USB, but hopefully it has a better fate.

cassianoleal
2 replies
8h56m

Wasn't FireWire also massively encumbered in patents and very expensive licenses? I may be misremembering...

HeckFeck
1 replies
1h11m

If so it'd be different in that regard. I was thinking of it more as the better-engineered underdog that lost out to the more corporate-friendly option.

IIRC it had full duplex unlike USB 1/2, it launched well before USB with a fast 400Mbps transfer speed and its hardware controller was sophisticated enough that it could work without much intervention from the OS.

But looking into the history, the patent situation was indeed grim. Likely that's what kept it in an Apple and DV niche until USB caught up.

sitkack
0 replies
36m

Jobs wanted too much money for firewire and Intel wanted to get PC dominance by having USB everywhere. The lack of firewire adoption is mostly afaik on Jobs.

merlindru
1 replies
8h16m

Not bad, but it's hard to think of a reason why they're doing this

It's protecting your standard from being used by others when wide adoption is the only thing that differentiates your standard from others

i.e. they're shooting themselves in the foot

paulmd
0 replies
3h58m

"by failing to give away their product for free, this IP-licensing organization is really only losing in the end!"

eqvinox
1 replies
11h5m

I don't think there was an implication that this would be bad

repelsteeltje
0 replies
10h52m

I guess, bad for the HDMI patent pool members.

amelius
0 replies
7h50m

quality and consistency of HDMI experience

For me the experience is not so good, given that HDMI signals always require at least 2 very long seconds to be recognized by a monitor, often even more.

vzaliva
13 replies
11h29m

There was a post here on HN some time ago ( https://news.ycombinator.com/item?id=36681814 ), basically explaining that HDMI is terrible and DisplayPort is a much better technical solution.

So, perhaps people should favour DP instead of HDMI and gradually switch?

ThatPlayer
11 replies
11h18m

I don't think it's quite that simple. DP is still missing HDMI features like ARC and CEC, which are important for TVs. Even on my personal computer setup, I use the HDMI 2.1 ports on my monitor/GPU over the DP 1.4 ports because the DP port just doesn't have the bandwidth for 2560x1440 @ 240Hz with 10-bit colour. That requires ~30 Gbit/s, more than DP 1.4's ~26 Gbit/s.

Neither my monitor nor my GPU supports DP 2.0, which does have enough bandwidth. So until I upgrade both, I'm using HDMI. My computer is not outdated either; there's just nothing to upgrade to. None of Nvidia's consumer GPUs support DP 2.0, and I can only find 2 monitors that do. Anyone getting new hardware now will be in a similar situation, using HDMI 2.1 over DP 1.4 until their next upgrade.
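
If anyone wants to sanity-check the arithmetic, here is a rough back-of-the-envelope sketch; the ~10% blanking overhead and the per-link payload figures are approximations I'm assuming, not exact CVT timings or anything from the specs.

    # Rough bandwidth estimate for an uncompressed video mode.
    # Assumed payload rates: DP 1.4 HBR3 ~25.92 Gbit/s (32.4 Gbit/s raw minus 8b/10b),
    # HDMI 2.1 FRL ~42.7 Gbit/s (48 Gbit/s raw minus 16b/18b).

    def required_gbps(width, height, refresh_hz, bits_per_pixel, blanking=1.10):
        """Approximate link bandwidth needed for an uncompressed mode, in Gbit/s."""
        return width * height * refresh_hz * bits_per_pixel * blanking / 1e9

    mode = required_gbps(2560, 1440, 240, 30)  # 10-bit RGB = 30 bits per pixel
    print(f"2560x1440 @ 240Hz, 10-bit: ~{mode:.1f} Gbit/s")    # ~29 Gbit/s
    print(f"fits DP 1.4 (~25.9 Gbit/s)?  {mode <= 25.92}")     # False
    print(f"fits HDMI 2.1 (~42.7 Gbit/s)? {mode <= 42.67}")    # True

Ignoring DSC, those numbers line up with the ~30 vs ~26 Gbit/s figures above.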

gond
4 replies
8h53m

DP is still missing HDMI features like ARC and CEC, which are important for TVs.

ARC could also be considered a bug, a hindrance, or both.

ARC and its various implementations would not exist if the HDMI Forum were not so fanatical about forcing copy protection onto everything. The whole problem (or feature) that ARC is would disappear without the insistence on protecting every stream. The alternative would be a full data stream, decoded, going back to the device in question. The prerequisite would be to remove the shitshow that is HDCP and allow full-blown HDMI inputs and outputs, which is the exact opposite of what the Forum wants.

HDMI in its current implementation hinders technological progress in the audio segment by forcing everyone to output analogue signals after the decoding stage, or to not allow decoding at all.

gizmo
3 replies
8h26m

Don't you also need ARC because of video post-processing that adds frames of latency? The TV needs to send audio back to the receiver, otherwise video and audio will not be in sync anymore. Receivers/amplifiers can process audio with practically no latency, so it makes sense for them to be at the end of the chain.

AstralStorm
1 replies
8h14m

Ugh, DisplayPort already has an audio channel. As for sync, neither protocol provides for effective reclocking or supplies the audio clocks, and you need VRR to provide a sort of display clock.

alt227
0 replies
6h57m

I admire your exasperation on this issue :)

toast0
0 replies
3h59m

You don't need ARC to address A/V sync. HDMI has (optional) metadata somewhere (EDID?) where the display device indicates its delay, and the audio device can adjust accordingly. This feature works best if the display device has a fixed delay; it's fine if there are different modes with different delays and the current delay is communicated, but some modes have variable delay depending on content, which is terrible in several ways.

IMHO, ARC is primarily useful when the display device is also acquiring the content: it's running the TV tuner or internet streaming or content off a usb drive. It's also useful if you have a 1080p capable receiver and upgrade to a 2160p(4k) display and sources: if you put the receiver in the middle, you lose on video quality, but with eARC the display can route full quality audio from all your sources. Some sources do have two HDMI outs, so you could wire to the display and the receiver, but that's not very common.

globular-toast
3 replies
9h4m

ARC and CEC are only necessary because of this stupid situation where TVs are like displays with shitty media centres built in. ARC is only a tiny bit more convenient anyway; it's not that hard to run an audio cable back from the TV to an audio receiver and you'll be hiding the cable anyway so it matters not the slightest what it looks like.

In 2002 there was XBMC (later renamed to Kodi). Microsoft even had Windows XP Media Centre Edition in 2005. At that time it was perfectly possible to set up a media centre that could do everything. No need for shitty TV remotes and CEC. You would use a much higher quality remote of your choice. Oh how far we've come in 20 years...

xnyanta
0 replies
6h5m

it's not that hard to run an audio cable back from the TV to an audio receiver

Wait until you find out that many consumer sound bars (Sonos comes to mind) only support the latest and greatest digital audio formats over eARC.

toast0
0 replies
3h46m

it's not that hard to run an audio cable back from the TV to an audio receiver and you'll be hiding the cable anyway so it matters not the slightest what it looks like.

That's fine for regular ARC, which is basically the same capability as S/PDIF, ATSC audio and DVD audio. But there's no consumer audio cable with the capacity for lossless surround except HDMI, and then you really want eARC, because otherwise you have one HDMI running from the receiver to the TV for video (and maybe audio) for sources that can go through the receiver, and a second HDMI running from the TV to the receiver for audio only, for sources that can't go through the receiver (built into the TV like the tuner, network streaming, and playback from USB; and also devices that exceed the HDMI bandwidth of the receiver or don't negotiate an appropriate video and audio format unless going direct --- I have a 4K Roku and a 1080p Blu-ray player that need different settings on the TV to work through my receiver, or I can wire one source direct to the TV and use eARC).

icar
0 replies
8h23m

It's still a perfectly valid choice.

eqvinox
1 replies
10h32m

I agree, but I also think your illustration of the problem is a bit off. The 90% of customers in the center of the bell curve don't need the tail end of display connector bandwidth.

However, devices have a lifecycle, and a lot of this hardware will still be in use in 2-3 years, by which point this will have moved into the center of the bell curve. Higher resolutions and HDR (which may push 10-bit) will trip this much more than a 240Hz display [which ain't ever gonna be mainline, really, considering we went down to 60Hz from CRTs with faster refresh rates].

CEC can be done over the DisplayPort AUX channel. I think there were attempts at an ARC equivalent but they floundered.

Another interesting question though is how much A/V connections in general will still be used in the "TV world" down the line… with everything moving towards more integrated networked appliances instead. E.g. streaming service "HDMI sticks" are now apps on the TV instead…

ThatPlayer
0 replies
7h19m

I agree that it's an issue very few customers are going to run into. But also that's where the differences in DisplayPort and HDMI are. For those 90%, they're equally served by HDMI and DisplayPort and will just use whatever they have.

Another 10% feature difference I do like on DisplayPort is Multi-stream transport for multiple monitors over a single cable. I don't think many people are looking to daisy chain big screen TVs.

RegnisGnaw
0 replies
5h57m

How do I switch? My TVs only have HDMI, should I spend another 5K+ to buy DP TVs?

sgjohnson
12 replies
7h12m

I feel like the only thing HDMI has going for it is ARC.

DisplayPort is superior in every other way imaginable. Except for the fact that almost no TV supports it.

Low-end monitors also don't usually have them, but as far as computer monitors go, I'm not interested in the low-end ones.

As for TVs - just give me a dumb screen with ports. I'm going to attach Apple TV to it anyway.

nolok
8 replies
5h56m

HDMI is ruling because of momentum and ubiquity in the TV room.

No devices output DP, no devices accept it, so there's no pressure on devices to accept/output it. I guess the license price is low enough.

On computers, it sort of evolved into where DVI was: you get more ports, you get a better feature set, it's just superior.

But in the non-tech market I think the "real" fight will end up being HDMI vs USB-C; both of them are evolving to the point where they feed everything, Ethernet included. HDMI has ARC and waayyyy simpler cable and port compatibility (one version to check); USB-C has power output and every single pocket device and laptop/tablet/...

sgjohnson
6 replies
5h26m

No devices output DP

Yeah, if we exclude basically every half decent GPU and ~70% of laptop USB-C ports in existence.

rahimnathwani
3 replies
2h36m

I use an HDMI-to-HDMI cable to connect my MBP to my 38" monitor (3840x1600 @ 85Hz). Would I get any benefit from using a USB-C to DP cable instead (e.g. running at the monitor's maximum refresh rate)?

rowanG077
2 replies
2h4m

Unanswerable without specifying what laptop you have.

rahimnathwani
1 replies
1h58m

Sorry, I should have Googled before asking the question. I just did, and see that my monitor (Dell AW3821DW) supports only 85Hz over HDMI, but up to 144Hz over DisplayPort.

The laptop's spec page doesn't say what refresh rates are supported for external displays (except saying at least 60Hz): https://support.apple.com/en-us/111901

I'll buy a new cable right now :)
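
For what it's worth, a rough calculation shows why the limits land roughly where they do; this is a sketch under assumptions (an HDMI 2.0 input, a DP 1.4 HBR3 input, 8-bit color, ~10% blanking overhead), not anything taken from Dell's or Apple's specs.

    # Upper bound on uncompressed refresh rate for a given link payload (rough estimate).
    # Assumed payloads: HDMI 2.0 ~14.4 Gbit/s after 8b/10b, DP 1.4 HBR3 ~25.92 Gbit/s.

    def max_refresh_hz(payload_gbps, width, height, bits_per_pixel=24, blanking=1.10):
        """Approximate maximum refresh rate an uncompressed mode can sustain."""
        return payload_gbps * 1e9 / (width * height * bits_per_pixel * blanking)

    print(f"HDMI 2.0: ~{max_refresh_hz(14.4, 3840, 1600):.0f} Hz")   # ~89 Hz, so an 85Hz mode fits
    print(f"DP 1.4:   ~{max_refresh_hz(25.92, 3840, 1600):.0f} Hz")  # ~160 Hz, so a 144Hz mode fits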

rowanG077
0 replies
58m

Yes, that MacBook Pro supports outputting even 4K @ 144Hz over DisplayPort, so it should work easily.

nolok
0 replies
4h10m

I was talking about the TV room device, as opposed to the computer devices.

ai_
0 replies
4h18m

Those generally aren't in the living room

toast0
0 replies
4h18m

But in the non-tech market I think the "real" fight will end up being HDMI vs USB-C; both of them are evolving to the point where they feed everything, Ethernet included.

HDMI Ethernet and HDMI eARC use the same pins. eARC won; HDMI Ethernet is pretty much dead.

bryanlarsen
1 replies
4h43m

Except for the fact that almost no TV supports it.

So frustrating. I'm using a 42" LG OLED TV as a monitor right now. Very nice monitor at half the price of the same panel in a "real" monitor. I'm driving it with an AMD card at 60 Hz for exceedingly stupid reasons.

macNchz
0 replies
4h21m

FWIW after reading though lots and lots of posts in the original bug report thread for this issue, I bought a Cable Matters “8K DisplayPort 1.4 to HDMI 2.1 Adapter” and it works perfectly to drive my Sony TV at 4K/120 from an AMD 6900XT on Ubuntu 24.04.

xnyanta
0 replies
6h8m

So true, I picked up the Samsung G80SD "Smart Monitor" and the deciding factor was literally just that it supports eARC, allowing me to use my Sonos Beam soundbar with my computer, with support for compressed audio formats like Dolby Atmos.

To make things even worse, this monitor supports sending the ARC audio back over DisplayPort, but only in stereo. If I use HDMI between the monitor and the computer, I get all of the audio channels.

Sakos
11 replies
13h55m

Feels like it's time for governments to get involved. It's not reasonable for a ubiquitous format like HDMI to be restricted like this.

kelnos
3 replies
13h10m

I agree. It should be illegal to restrict people from making open source implementations of industry standards like this.

I don't expect that to ever happen, of course. But I can dream...

jakeogh
2 replies
9h58m

I'm with you in principle, but cleanroom reverse engineering is legal. The issue here is that AMD signed the NDAs to read the secret spec and write the code, hence they can't release it.

The solution is to not buy into proprietary standards[0]; in this case, I'm looking for DisplayPort when I buy... and a big + to AMD for trying.

Hey Intel! Come back!

[0] Pile of comments here pretending it's sooo difficult.

account42
1 replies
8h54m

in this case, I'm looking for DisplayPort when I buy

That's an option if what you are looking for is a normal computer monitor. If you want a big TV then good luck finding one with DP, especially if you have other requirements (emissive pixels with real blacks, HDR, etc) that further limit your options.

jakeogh
0 replies
7h53m

Are you saying it's a lost cause? If so, I'm totally switching back to Windows 2033 so I can play Simnpc in full res and buying one of those smart cars.

wmf
2 replies
13h32m

Between patents, trade secrets, and DMCA, the government is the source of the problem here. Arguably the FTC could step in here but I think they have bigger problems to tackle.

shiroiushi
0 replies
13h15m

Between patents, trade secrets, and DMCA, the government is the source of the problem here.

The government created the playing field. The only entity that can fix the situation is the government: they created a bad playing field, and they need to fix it.

anordal
0 replies
13h10m

But not all governments, thankfully. Remember DVD-Jon? He won the trial for breaking DVD crypto, because consumer rights stood above trade secrets.

anal_reactor
1 replies
11h43m

Seriously, we managed to standardize charging ports by law; maybe we can also agree on using an open standard for displays.

jakeogh
0 replies
10h35m

That is the real issue, because ultimately this is about the memory hole. A re-upload of a camcorder copy of a VHS is no threat; it will be degraded (again) when the digital archivist who saved its re-encoded copy[1] (from whatever video platform deleted it) posts it again. Intercepting an exact bitstream circumvents this deliberate, modern version of the camcorder-copy problem, and would ultimately obliterate the non-arguments for keeping the original file unavailable.

On the other hand, a signed sha3-256 digest along with the original[0] file before YT re-encoded it (and stripped its metadata) is unobtainium for the plebs. It is the _most_ important data for the host. It's the first thing they back up. As far as I know, they (YT/Rumble/Tora3) never talk about it. Some would love to only serve hallucinated (when convenient) upscaled derivatives.

Power is threatened by persistent lossless public memory.

[0]: https://news.ycombinator.com/item?id=20427179

[1]: (Mr. Bean, 2024) https://www.youtube.com/watch?v=xUezfuy8Qpc

jeroenhd
0 replies
10h5m

I wonder if the HDMI forum can be considered a gatekeeper in terms of the EU's DMA. Their influence on the market is rather indirect, but I wouldn't be surprised if 80% of the EU uses HDMI every single day.

elihu
0 replies
10h7m

I think it's actually pretty typical for important electrical interfaces to not be public or royalty-free, as much as I wish that wasn't the case.

That's not to say the government shouldn't get involved. I think the bigger thing here is that if an industry group is specifically setting things up so that Linux is shut out of having high-end video support, then it looks an awful lot like cartel behavior -- industry incumbents are working together to lock out a competitor. Maybe it could be the basis of an anti-trust lawsuit?

Presumably Apple and Microsoft would have the most to gain. Microsoft is a member of the forum. Apple doesn't appear to be, but an Apple guy is on their board of directors.

I'm not a lawyer and I don't know how such a lawsuit would work. Who represents Linux in this case, since it's not owned by any one company? Linus Torvalds? AMD? And would all the companies involved in the HDMI Forum be liable for the behavior of the forum (which would include AMD)? Does intentionality matter, i.e. what if Linux was excluded accidentally rather than deliberately?

https://hdmiforum.org/about/hdmi-forum-board-directors/ https://hdmiforum.org/members/

shiroiushi
5 replies
13h12m

AMD should just leak the code and disclaim responsibility.

preisschild
4 replies
12h51m

Even if they were to do that, it wouldn't really be a long term fix. Who would maintain this "unofficial" GPU driver? AMD themselves can't.

shiroiushi
3 replies
12h44m

If it has that much interest, someone will put it on GitHub and maintain it. Of course, it'd be unofficial, but so was support for MP3s in Linux distros for many years: you had to download software from outside the US to make it work.

taneliv
2 replies
11h36m

Incompatible licenses would mean that no distribution would have kernels that support it, though. It would be a second-class citizen, compiled via DKMS or something, and often broken on a lot of hardware.

The difference from broken MP3 support is that if your music file does not play, you can still browse the Internet, write emails and play games; but if your graphics driver is busted, you can do none of those things from the GRUB menu. In the worst case recovery mode does not work either, and you have just converted your laptop into a headless server.

Troubleshooting it by browsing instructions on your phone is no fun.

shiroiushi
1 replies
11h27m

Why would the licenses be incompatible? AMD owns the code, so they can release it under whatever license they want, including GPL or BSD. They could even put into the public domain if they wanted. AFAICT, the code has not been released at all, so this point about licenses is simply wrong: it doesn't have a license at all right now.

As for a broken driver, that's an easy fix I think. From my reading of the article, there's already an existing driver, but it doesn't support HDMI 2.1 features. So it's simple: provide a fall-back driver, and require users to separately download the new driver (or maybe distros can package it themselves, I'm not sure about the legality). If something goes wrong with the unsupported driver, leave an option in the boot menu to boot in a "safe mode" that uses the old driver. So they won't get 4K @ 120Hz, but I'm sure they can live with that.

taneliv
0 replies
9h15m

Exactly like you are saying, if I'm not misunderstanding the situation: the code is unlicensed for anyone who is not AMD. That is incompatible with GPL2 license of the kernel. Without HDMI Forum's approval it can not be licensed, either, since the point is that AMD does not own the secrets contained within.

(If the driver is leaked, I would imagine it to be illegal to distribute it. Companies might elect to not actively support even the fallback mechanism, if it has no other use cases. Probably not a big hurdle and something an installation package should be able to solve, but a hurdle nevertheless.)

supermatt
1 replies
8h38m

What exactly does this mean? Isn't it just a case that the driver wouldn't be HDMI-certified, or are they actively prevented from distributing the driver?

layer8
0 replies
8h23m

My reading: AMD signed a contract with the HDMI Forum in order to get access to the HDMI specification and be allowed to create HDMI products (and use the HDMI trademark), which includes an NDA regarding the specification. An open-source driver would violate that NDA.

roshankhan28
1 replies
9h7m

With a DP cable I can get 144Hz on my BenQ XL2411P, but with HDMI I can go at most 90Hz. I can't run two 144Hz displays just because there is no way I can use two DP cables on my RTX 2070 Super.

ThatPlayer
0 replies
8h48m

That's an issue with the monitor only supporting HDMI 1.4. A monitor that supports HDMI 2.0 would work at 144Hz fine. I have the opposite issue, where I run my 240Hz monitor on HDMI 2.1 because it has more bandwidth than my RTX 4080's DisplayPort 1.4 ports.

It seems a common enough issue that that model is specifically called out sometimes: https://forums.tomshardware.com/threads/how-to-connect-to-a-...

0points
1 replies
12h5m

Bye HDMI. You will not be missed.

WithinReason
0 replies
11h25m

we can only hope...

xxpor
0 replies
13h47m

Note this is from February

wvh
0 replies
8h50m

If possible, they should add a hook and individual download script like for DeCSS back in the day. Let them come at each of us individually for wanting to use the ports in our own hardware.

sylware
0 replies
7h49m

Everybody knew this could happen with HDMI, given its legal setup.

And it did.

That said, those guys usually play a "back and forth" game in the long run... so stay tuned.

You should keep an eye on MPEG too, because those are the same "type" of people (and the ARM ISA is not far behind...)

Even if I despise big tech on nearly all fronts, sometimes we can agree, and in this case that means AV1 and DisplayPort.

And this type of behaviour, namely not having a DisplayPort port, could be a perfect regulatory (anti-competitive) project for the EU, like they did with Apple...

steelframe
0 replies
3h25m

Cool. Then I suppose I'll buy my dumb DisplayPort screens from rando Chinese knockoff companies that siphon off R&D that suckers like Sony and Samsung fund.

shmerl
0 replies
13h40m

The HDMI Forum is a corrupt cartel whose sole purpose is to make sure they can continue fleecing everyone on patent fees.

langsoul-com
0 replies
9h16m

Isn't AMD in a pretty bad position with display-related tech?

HDMI rejected them, DisplayPort isn't ubiquitous enough, and Thunderbolt (USB-C) is owned by Intel.

kristjank
0 replies
9h19m

I really hope HDMI snags hit some critical mass and it gets relegated to the dying TV/home theatre domain, where it can rot into obscurity. DisplayPort has its own issues, but they're much smaller than the constant industry fuckups HDMI produces. It's Oracle: the interface.

eqvinox
0 replies
11h6m

While we're here, does anyone know why professional displays [e.g. https://www.usa.philips.com/c-p/27B1U7903_27/professional-mo... ] frequently have 2 HDMI ports and only 1½ DisplayPort? (½ for the Thunderbolt port) This feels like some kind of standard port combination… do the display driver ICs only have one DP port? Are they using TV silicon?

I know HDMI is used in some AV production setups, but that feels like a very small niche to justify having 2 HDMI ports on a display like this?

[I'd rather have 2 DP ports and only 1 HDMI… or no HDMI at all]

MPSimmons
0 replies
6h44m

This occurred because in 2021 the HDMI Forum restricted public access to its specifications

Oh, okay. Fuck the HDMI forum, then.

Kon5ole
0 replies
11h34m

Bummer - some TVs are tremendous value for money as computer monitors (small 8K TVs sometimes sell for sub-$1000) but they tend to only have HDMI.

I got 8K/60 working in Linux using an Nvidia card and a DP-to-HDMI adapter cable, but I have a feeling it's not meant to be supported (the same cable does not work in Windows).