HDMI Forum Rejects Open-Source HDMI 2.1 Driver Support Sought by AMD

Jun8
60 replies
1d20h

For the longest time (like, the past 15 years) I’ve wanted a cheap small box that can overlay custom info, e.g. text, on the TV. The reason such a thing does not exist, other than projects like Bunnie’s NeTV (https://makezine.com/article/technology/review-hardware-hack...), is that decoding HDMI signals is illegal if you’re not in the club. I think the ELI5 reason is that content providers are afraid of unauthorized copying of DVDs (try taking a screenshot of a frame from your DVD player on your laptop).

I’d be willing to pay up to $199 for such a box if it has an open API to input overlay text and icons.

thwarted
8 replies
1d20h

While reading your comment I was reminded of the much-touted key capabilities of the Video Toaster (late 80s), its character generator and chromakey, which were used to do exactly that: overlaying text on video, at a fraction of what professional video production cost. And now, 30+ years later, we still don't have an accessible way to do that using modern video tech/protocols, but now it's about rent seeking and licensing rather than technical capability.

wmf
6 replies
1d19h

There's tons of stuff out there. The ATEM Mini Pro can do keying and a bunch more for $300.

In general, audio and video production have never been cheaper or more accessible. There's no conspiracy of The Man keeping people down.

Jun8
4 replies
1d19h

Thanks for the suggestion; however, this is a video production device which aims for different functionality than the broad one I had in mind.

The simplest useful app would be: watching a movie together with friends, having them type their comments on their phones, and displaying them on my TV in a lower third. The chat part is trivial, but how do you get that from a server into the video stream? Maybe there’s a simple way to do it using a device like the Mini Pro, but when I search I don’t see it.

buffington
1 replies
1d18h

While it's not the same as what you're asking for, you could do this pretty easily with a Raspberry Pi.

That feels like such an obvious thing to say that I hesitated even saying it, but then I started thinking about how funny it is. A solution to HDMI being so locked down is to buy a $20-ish computer, that does all the things a computer can do, so that we can feed the TV the video stream we want. Over HDMI.

Given that it's so easy to circumvent now, I wonder why anyone even bothers with HDMI any more, aside from inertia.

iancmceachern
0 replies
1d18h

Totally, or one of those HDMI USB capture devices that are similarly cheap.

kalleboo
0 replies
1d13h

watching a movie together with friends, have them type their comments on their phones and display them on my TV, in a lower third

This is what twitch streamers do all day using OBS and twitch chat. Hang out with their followers and watch/react to YouTube videos and read chat comments.
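
To sketch how the chat half could feed such an overlay (a hypothetical, minimal version: the port number and the third-party Python websockets package are my choices, and the OBS browser-source page that would render the messages is left out):

    # Tiny fan-out relay: every message from any client (a friend's
    # phone) is re-broadcast to all clients, so an OBS browser-source
    # overlay page can subscribe and render comments as a lower third.
    # Requires: pip install websockets (v11+, single-argument handlers).
    import asyncio
    import websockets

    clients = set()

    async def handler(ws):
        clients.add(ws)
        try:
            async for message in ws:
                websockets.broadcast(clients, message)
        finally:
            clients.discard(ws)

    async def main():
        async with websockets.serve(handler, "0.0.0.0", 8765):
            await asyncio.Future()  # serve forever

    asyncio.run(main())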

iancmceachern
0 replies
1d18h

The Atem can definitely do this. There are youtube tutorials on it.

You use one of the many apps that can output your text, set it to output the background as green or whatever.

Then you can use the Atem to do one video stream inside the other, picture in picture, you can customize the size of the picture in the picture.

You can also have it chroma key out the green (or whatever) from the text app, so it will only display the text, and the background will be the main video stream.

Bam.

raxxorraxor
0 replies
1d3h

There certainly is a conspiracy of industry players to keep people down. It is well documented and sadly part of our devices.

jethro_tell
0 replies
1d20h

That brings back some good memories. I had access to a Video Toaster setup a couple days a week after school, as well as library checkouts of big VHS camera recorders. We'd get a stack of tapes, do the 1-day rental of a camera, and record all over the neighborhood; then after school we'd take the tapes in to the studio and mix them together.

All dumb shit, but you know we were kids.

tim--
6 replies
1d20h

From what I understand (and I could be very wrong here), the biggest issue is simply the amount of memory required to keep a copy of the current frame of an HDMI signal, plus the fact that to modify anything in the HDMI signal you need to decrypt it (if watching an HDCP-protected stream), decode each frame, and then finally re-encrypt the signal.
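
For a sense of the memory scale involved, here's a back-of-the-envelope sketch (my own numbers; a real overlay chip might get away with line buffers instead of whole frames):

    # Raw size of one uncompressed frame at a few common formats.
    def frame_bytes(width, height, bits_per_pixel):
        return width * height * bits_per_pixel // 8

    for name, w, h, bpp in [
        ("1080p 8-bit 4:4:4", 1920, 1080, 24),
        ("4K 8-bit 4:4:4",    3840, 2160, 24),
        ("4K 10-bit 4:4:4",   3840, 2160, 30),
    ]:
        print(f"{name}: {frame_bytes(w, h, bpp) / 1e6:.1f} MB")
    # -> 6.2 MB, 24.9 MB and 31.1 MB per frame respectively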

I was looking into what would be necessary to build an HDMI switch that allowed for seamless switching of video between different inputs, and basically there was no hardware for it. The closest chips that I could find were Analog Devices' ADV7626 and Lattice's SiI9777S.

Lattice CP9777 might be worth having a look at if you can understand anything about the datasheet :)

4rt
4 replies
1d19h

As far as I can see there's no need to re-encrypt the signal - just leave it as a non-HDCP stream?

There are HDMI splitters which silently strip HDCP to output to two screens at once - they don't advertise this feature but they do it anyway in order to function. In order to achieve the overlay you'd just need one of these and then a non-HDCP-aware overlay apparatus before the display.

https://www.reddit.com/r/VIDEOENGINEERING/comments/xme482/hd...

charcircuit
3 replies
1d16h

You won't be able to get keys for HDCP if your device isn't compliant

mrandish
2 replies
1d15h

I believe these kinds of cheap, no-name converter/adapter boxes use leaked HDCP keys.

userbinator
1 replies
1d14h

The master key was cracked long ago. You can generate as many of your own unique keys as you want.

charcircuit
0 replies
1d13h

This only works for old HDCP versions though

mjevans
0 replies
1d19h

It would be a LOT more seamless if the output didn't have any HDCP to re-synchronize. There are also the different frame timings between the inputs and outputs (genlock is a term you want to search for), so such a system might introduce presentation latency, which would have to be accounted for in the audio streams.

goosedragons
6 replies
1d20h

Is there a reason a RPi or whatever with a cheap USB HDMI capture card isn't a good option? I guess there might be a minor quality loss but it should do what you want.

MenhirMike
4 replies
1d20h

HDCP usually is an issue if you can't turn it off on the source. There are ways around that, though those are more than $199 if you're dealing with an HDMI 2.x source.

0cf8612b2e1e
2 replies
1d20h

I thought there were cheap converters that could go HDMI 2->1?

MenhirMike
1 replies
1d18h

Back in 2019 I paid $399 for a Vertex2, but if there are cheap options now in 2024, that would be awesome!

evilduck
0 replies
1d16h

I bought a random HBVALINK branded splitter on Amazon last month for $29 that works perfectly for this purpose.

treflop
0 replies
1d20h

I bought an HDMI 2.0 splitter that strips HDCP a while back for $20.

tim--
0 replies
1d18h

Usually the lack of support for the 4:4:4 colorspace, and for 4K video at anything over 30 Hz.

justsomehnguy
5 replies
1d20h

There are a lot of answers here already, but I would remind everyone: if your target is a 38-in-1 DVD/HD/whatever to resell for $5, the only thing you need is a $30 cam pointed at the display.

Yes, it's not pixel perfect. Thing is, the people who buy 38-in-1s do not care about a pixel-perfect picture. They just want an affordable way to spend an hour and a half with their family.

buffington
3 replies
1d18h

Can you describe what a "38-in-1" device is? Is it even a device?

throwaway201606
1 replies
1d17h

38-in-1 is some random storage media (USB stick, DVD, SD card, CD) with "38" (or 45, or 2, or 99; pick a number) media files (movies, TV shows, YouTube series, whatever) in various resolutions, formats, modes, etc., sold for next to nothing on the street.

Can be played on Android-type boxes or even on phones / tablets.

Essentially sneaker-net, but for watchable media.

Not something you will see much of in NA or Europe, but very common in Asia and Africa where bandwidth/internet and, more importantly, electricity are relatively expensive.

The 90 minutes with your family comment is spot on.

In NA and Europe, this is mostly because folks don't have access to continuous internet. You are in back-country or don't have service or whatever.

In the Africa / Asia case, it's because you don't have internet and/or power.

Most of the time, this media is viewed on TVs running on batteries, charged by solar, off a USB port. The electricity budget (because battery) says you get "90" mins of TV a day (not exactly 90: the power is shared between lights and charging phones; it would be more time, but because electricity is limited, "TV time" is limited).

So those "90" minutes are family time; we all watch the same thing together.

Point being, in that world, cam or SD or whatever works; no need for HD or UHD or 4K or 8K, that is completely worthless there. The screens the content is being watched on are 720p most of the time.

Something like this (edit replaced original link with a new one that explains what is happening better)

https://www.npr.org/sections/goatsandsoda/2021/11/10/1052926...

extraduder_ire
0 replies
1d16h

I think this is also the reason most "scene" releases of TV shows and movies were Xvid-encoded and capped at 175MB and 700MB respectively for so long: so that they wouldn't break compatibility with burned CDs on existing players.

justsomehnguy
0 replies
1d11h

The reply by @throwaway201606 is on point, though I would add that it was extremely popular in Eastern Europe too, until broadband/4G became ubiquitous.

I think the last time I saw a 9-in-1 DVD on sale there is somewhere around... 2018 probably?

Manabu-eo
0 replies
1d18h

Yep, no matter how hard they try, the a-hole (analog hole) will remain unplugged.

Justsignedup
5 replies
1d18h

HDMI's copy protection is under no illusion of being effective. It was cracked the first day it was released.

It's a mechanism to force you to pay a licensing fee to manufacture HDMI devices.

It's why DisplayPort is the way to go.

3abiton
4 replies
1d18h

I never gave DisplayPort any attention; what's the appeal? Open source?

weberer
1 replies
1d9h

Also higher speeds. HDMI 2.1 has a bandwidth of 48 Gbit/s. But because of driver shenanigans in the linked article, us pesky computer users are stuck with HDMI 2.0, which has a bandwidth of 18 Gbit/s.

The latest version of DisplayPort has a bandwidth of 80 Gbit/s, and you get drivers on day 1.

Justsignedup
0 replies
2h14m

Thinking about it more, there's no downside to DP. Every card manufacturer should spend an extra $2, ship with a DP -> HDMI adapter, and drop HDMI universally. That $2 will be well worth the licensing fees they don't have to pay and the software they don't have to write.

preisschild
0 replies
1d10h

It's a less confusing standard too. The versions are sane.

Also, I love that DP plugs have that latch thing so the cable always stays attached.

bluGill
0 replies
1d18h

HDMI without the legal issues. There are other differences, but the legal side is why it exists.

babypuncher
4 replies
1d19h

The really dumb thing is that nobody wants to copy DVDs or Blu-Rays via HDMI anyways. It's much more convenient to decrypt the discs directly with tools like MakeMKV.

However, I suspect that isn't the real holdup here since DisplayPort also fully supports HDCP

evilduck
3 replies
1d16h

Most of the stuff people want to preserve nowadays is never made available on a disc.

mfru
2 replies
1d13h

like?

crtasm
1 replies
1d

Content on streaming services - Netflix, AppleTV, Amazon, etc.

evilduck
0 replies
16h20m

Exactly. The same streaming services that have recently gained a reputation of straight up deleting entire movies and series.

refulgentis
2 replies
1d20h

It sounds like you've taken a deep long journey into the mess here: I'm curious, what does it take to be a member of the club?

I hope it's not just FAANGs and blessed hardware OEMs...but it also sounds likely it must be, you can't let just anyone in if you're worried about the specs leaking...but then again that sounds weird too, in that, they used to provide the spec more openly until 2.1?

metaphor
1 replies
1d20h

HDMI adopter registration details here[1]; in general, a $10k annual fee + royalties and a bunch of legalese. Prevailing list of 2451 adopters and affiliates here[2]; yes, AMD (listed as the unabbreviated Advanced Micro Devices) has been a registered Adopter since Aug 2006 and is currently listed as a 2.1b licensee.

[1] https://www.hdmi.org/register/adopterregister

[2] https://www.hdmi.org/adopter/adoptersaffiliates

gnu8
0 replies
1d17h

Is registering as an adopter the only way to gain participation in their decision making process?

bombcar
2 replies
1d20h

You can get devices that "hack" HDMI to various other unprotected signals such as SDI. So you get a Decimator or something, convert HDMI to SDI, and then do the needful.

Now that may not help you, as the devices that do SDI overlays run in the $1k range, but maybe something is out there like an ATEM SDI https://www.blackmagicdesign.com/products/atemsdi

mschuster91
1 replies
1d20h

Blackmagic's Mini Converters are at their core nothing more than SerDes chips and an FPGA that does the protocol translation. I _think_ that the recent models with USB-C power supply inputs also have the data pins connected to the FPGA.

Might be worth a try to check how hard BMD has locked down the FPGA bitstream.

bombcar
0 replies
1d19h

I know the Decimators would remove HDCP, as it's a known "trick" when trying to project from a Mac. But not sure which versions, and how well.

salawat
1 replies
1d20h

You are correct. Pay attention to the conspicuously lacking feature. Odds are there's industrial collusion to keep it that way.

hkgirjenk
0 replies
1d18h

Chinese manufacturers don't care; if there is a market, they will produce it.

echohack5
1 replies
1d20h

Maybe the closest thing to this is a Blackmagic box of some sort. I use the UltraStudio 4K to capture 4K60 output, pipe it through OBS, and pass it through as well.

Will run you $1K though. Corsair/Elgato has some solutions in your price range but the devil is in the details of precisely what you're trying to accomplish.

evilduck
0 replies
1d16h

An HDMI splitter (i.e. an HDCP down-converter/stripper) and a video capture dongle or card can be had for like $60, plus a computer running OBS. It’s not an elegant small box, but it’ll get the job done for lower-resolution needs.

dtx1
1 replies
1d20h

I don't know about quality but I guess a Raspberry Pi, HDMI Capture Card and OBS would do the trick and are within that price range.

buffington
0 replies
1d18h

If it's to watch a movie, why bother with the capture card at all? Why not throw together a way to stream the video from the Pi doing the text overlay to all the remote participants.

Is the GP commenter suggesting that they'd like to have people in remote locations each load a copy of their DVD into a player at the same time so they can comment on it together? If so, then yeah, you need a capture card I guess, but the notion of doing that seems a bit bonkers.

tverbeure
0 replies
1d16h

I’ve made such a box myself. It cost me $20 in hardware and many hours of effort.

https://tomverbeure.github.io/2018/04/23/Color3-HDMI-RX-to-H...

Right now it’s just a proof of concept (check out the video with a moving overlay rectangle and the red 8 that is converted into a partially green 8), but it wouldn’t take a whole lot of work to add support for subtitles.

There are still some of these boxes for sale on Amazon, but supply is limited.

sottol
0 replies
1d20h

I don't know if the ~$20 HDMI USB capture cards work on a Raspberry Pi, but connecting one to a Pi or laptop, capturing the signal to a v4l stream, overlaying in software, and then outputting on the built-in HDMI out might work?
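
Something along these lines, perhaps (an untested sketch: it assumes the stick shows up as V4L device 0 and that OpenCV is installed; the text and coordinates are placeholders):

    # Capture from a UVC HDMI stick, stamp a lower third on each frame,
    # and show it full screen on the Pi/laptop's own HDMI output.
    import cv2

    cap = cv2.VideoCapture(0)  # the USB capture stick's v4l device
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1920)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 1080)

    cv2.namedWindow("out", cv2.WINDOW_NORMAL)
    cv2.setWindowProperty("out", cv2.WND_PROP_FULLSCREEN, cv2.WINDOW_FULLSCREEN)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.putText(frame, "hello from the lower third", (60, 1000),
                    cv2.FONT_HERSHEY_SIMPLEX, 1.5, (255, 255, 255), 3)
        cv2.imshow("out", frame)
        if cv2.waitKey(1) == 27:  # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()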

nivenhuh
0 replies
1d19h

I have an HD Fury Arcana 2 VRR which overlays text (e.g. bitrate information) on top of my normal HDMI signal. (https://hdfury.com/)

I don't think this behavior is customizable.. (but maybe with some hacking).

joemelonyeah
0 replies
1d15h

There are some no name USB capture sticks that take any HDMI signal and expose it as a standard USB UVC camera, no drivers required. This could be a start of a DIY project.

SushiHippie
37 replies
1d21h

Is there any reason I'd want to use HDMI instead of DisplayPort?

Strom
11 replies
1d20h

If you want big resolutions and big refresh rates. For example I have a 4K screen with 144 Hz refresh rate. That sort of bandwidth can only be delivered right now via HDMI 2.1.

Yes there are DisplayPort standards that can reach that, but in practice all the top end GPUs have only the old DisplayPort 1.4 version. That means the only choices are to use HDMI 2.1 or to use video compression over DisplayPort 1.4.

I will say that VESA has been very successful with their Display Stream Compression (DSC). Not that the algorithm is something special, but in terms of marketing. You never see monitor reviewers talk about the fact that using DSC will give you artifacts. VESA markets this compression as visually lossless, which every non-expert reads as actually lossless. In reality, the losslessness is defined in the standard as when the observers fail to correctly identify the reference image in more than 75% of the trials. Beyond that, like any compression, it will fail at high-entropy images. Take for example the 2017 study titled Large Scale Subjective Evaluation of Display Stream Compression [1], which found that performance on some images was not visually lossless; however, those were challenging images with high entropy.

--

[1] https://www.researchgate.net/publication/317425815_Large_Sca...
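
In other words, the pass criterion boils down to something like this (a sketch of just the threshold as described above; the actual test protocol has more to it):

    # Forced-choice trials: pure guessing lands at 50%. The bar described
    # above counts as "visually lossless" as long as observers pick out
    # the reference correctly in at most 75% of trials.
    def visually_lossless(correct_picks, trials):
        return correct_picks / trials <= 0.75

    print(visually_lossless(60, 100))  # True: near chance, passes
    print(visually_lossless(90, 100))  # False: observers can tell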

zokier
4 replies
1d19h

AMD 7000 series has DP2.1

Strom
1 replies
1d19h

Looks like some DP2.1 monitors were announced at CES this year too, so things are finally coming together!

preisschild
0 replies
1d9h

My Samsung G95NC already supports DP2.1

paulmd
0 replies
1d18h

the ability to use it at full speed is segmented to the workstation cards though

Quekid5
0 replies
1d19h

Not that I'm looking for a replacement right now, but I guess that confirms why I'm committed to Sparkle Motion^W^W AMD :)

izacus
4 replies
1d20h

That's great, but have you actually seen those artifacts ever?

(DSC also has a great property of making the image delivery significantly more resilient to cable issues and thus much more reliable for most people using high res monitors.)

Sakos
2 replies
1d20h

DSC also has the great property of causing my RTX 3060 to crash if I try to watch videos on an external monitor and do the wrong thing on my main display. This issue has gone unresolved for over a year now. HDMI, DSC and NVIDIA can go eat a big bag of dicks.

Sakos
0 replies
1d4h

Nope, affects several different monitors. It's also not flickering, the GPU driver will crash and stop sending output to the external monitor. At times it's even hung my entire system.

Strom
0 replies
1d20h

That's great, but have you actually seen those artifacts ever?

No, I use HDMI 2.1 without DSC.

Otherwise though, yes I have very good vision and work with images professionally. I do spot artifacts when they are there.

I get it that not everyone needs perfect images. I mean, people watching YouTube certainly won't be able to tell if there is an additional artifact on top of the low bitrate video they're watching.

preisschild
0 replies
1d9h

The AMD RX 7000 series supports DP2.1 already and there are already monitors that do too (Samsung G95NC for example)

rodgerd
7 replies
1d21h

Good luck finding an AV receiver or TV with DisplayPort.

This pretty much locks any media players into proprietary drivers.

Chabsff
6 replies
1d20h

DisplayPort to HDMI converters trivially solve that. Let the TV consume HDMI all it wants while still "just" outputting DP from the computer.

krs_
3 replies
1d20h

Are there DP -> HDMI 2.1 adapters available that support 4K 120 Hz with VRR and HDR? I've not seen any, although I've not looked too hard.

krs_
1 replies
1d12h

Clicking through to the product page it claims that VRR/Gsync is not supported with that adapter unfortunately.

It is also a USB-C to HDMI adapter, not a pure DP adapter. I think only the 20-series of Nvidia GPUs and maybe some of the 5000-series AMD GPUs have DP capable USB-C outputs. And apparently the M1/M2 Macs that the first article is about.

CableMatters is a good brand so if anyone could do it it'd be them, but it appears it's not quite there yet. Maybe when DP 2.0 becomes more common on GPUs it'll be a solved issue.

simoncion
0 replies
1d8h

I think only the 20-series of Nvidia GPUs and maybe some of the 5000-series AMD GPUs have DP capable USB-C outputs.

Based on my experience, this doesn't matter. I have a video card with no USB-C outputs. I go from full-sized DisplayPort to a female <-> female DisplayPort coupler, to this [0] bidirectional DP <-> USB-C cable, which plugs into my monitor. It works great and even does 4K 60FPS uncompressed HDR no problem.

I see no reason why you'd be unable to then slap on a female <-> female USB-C coupler and then that USB-C -> HDMI adapter.

(There's also no reason you couldn't cut out the first DP cable and coupler and plug the DP <-> USB-C cable directly into the video card. I just have very long (fiber-optic) DP cables that the second cable plugs into.)

[0] https://www.monoprice.com/product?p_id=39240

harkinian
1 replies
1d19h

I'd rather not introduce another thing that can go wrong. Video outputs are already picky enough.

Chabsff
0 replies
1d3h

It's obviously far from ideal. I was just pointing out that "welp, proprietary drivers it is" is not the only way forward.

TedDoesntTalk
4 replies
1d20h

Does DisplayPort support audio?

izacus
0 replies
1d20h

Yes. (Although I'm not sure if it supports bitstreaming formats like Dolby Digital, Atmos and DTS.)

harkinian
0 replies
1d19h

Yes, with the caveat that some monitors that support both HDMI and DP will only take audio from HDMI inputs.

betaby
0 replies
1d20h

It does.

adrian_b
0 replies
1d20h

Yes, it does.

I am writing this in Linux, while looking at a Dell monitor connected through DisplayPort, and my loudspeakers are connected to an audio output of the monitor.

mianos
3 replies
1d21h

There are many more displays that support HDMI, and it is a simpler protocol. DisplayPort is better and more modern: it's packetized (like UDP), so the timing is not fixed by the source. HDMI is fixed-timing and used to be much simpler hardware.

tverbeure
2 replies
1d20h

HDMI is fixed timing and used to be much simpler hardware.

What do you mean by 'fixed timing'? The fact that the transmission data rate is proportional to the pixel clock?

We're talking HDMI 2.1 here, which uses FRL (fixed rate link) and thus has the link rate decoupled from the pixel clock just like DP, with data split into packets. There's not a lot of difference between DP and HDMI in terms of functionality and complexity.

mianos
1 replies
1d18h

Now that I've read the spec for 2.1, I see I am wrong and you are correct. I have only ever seen it from SERDES blocks with a 'continuous' clock.

Indeed, 2.1 seems similar to DP. It would require quite a bit more logic to do that.

I wonder how many TVs support that?

tverbeure
0 replies
1d16h

It is true that HDMI 2.1 requires more logic, but one of the annoying parts of the earlier HDMI versions is that a sink needs a PLL that covers a large, continuous frequency range. DP and HDMI FRL only need to support a few fixed frequencies, which is much easier for analog designers to design for.

Tade0
2 replies
1d20h

In some laptops the HDMI port is managed by the integrated GPU and the DisplayPort by the discrete GPU, so ultimately it affects which GPU you're using.

ZiiS
1 replies
1d19h

This seems very unlikely. There may be laptops where the HDMI can use either depending on the power profile.

Tade0
0 replies
1d6h

Yes, and they achieve that by routing the data through the integrated GPU.

I had a problem with Windows and AMD drivers where in this configuration the integrated GPU would run full throttle despite not doing anything serious, making the system run hot.

I "solved" it by using the DisplayPort - 10°C difference on the CPU and, more importantly, no throttling.

thomastjeffery
0 replies
1d20h

Most displays that are marketed as a "TV" instead of a "monitor" are exclusively HDMI. This is why many TVs from ~2020 don't have any 4K@120 input, despite actually displaying at 4K@120 (from the internal GPU that does frame interpolation), because HDMI 2.0 is limited to 4K@60.

fnordpiglet
0 replies
1d21h

Your device doesn’t have display port interfaces I guess. So for instance if you have a raspberry pi, you’re using HDMI one way or another.

alexsereno
0 replies
1d21h

Masochism

AzzyHN
0 replies
1d17h

A 65-inch OLED isn't going to have DisplayPort input. Most laptops don't have DisplayPort output, just HDMI or Type-C, which may or may not support DP Alt Mode, and even then that caps at 40 Gbps.

mikece
30 replies
1d21h

"Needless to say, open-source Linux advocates should try to use DisplayPort instead if at all possible."

And at this point that's one of the several (many?) protocols that runs over a USB-C cable, right?

conradev
24 replies
1d21h

DisplayPort is packet-based and can be multiplexed with other USB-C traffic through a hub

HDMI is not packet-based and so when it uses USB-C it takes over the entire cable (alt-mode)

baby_souffle
17 replies
1d21h

DisplayPort is packet-based and can be multiplexed with other USB-C traffic through a hub

This is part of why DP is $$$ compared to HDMI. I would love to see DP start eating HDMI's lunch after this and the absolute shit show that was the HDMI 2.0 rollout, but cheaper to implement is almost certainly going to be the driving factor for consumer-grade TVs/displays, and no console or other set-top box maker is going to bother putting DisplayPort on their device if nobody's got a TV that can use it.

babypuncher
11 replies
1d19h

It won't happen unless VESA gets serious about adding home theater specific features to DP that HDMI has had for 20 years.

kiwijamo
10 replies
1d17h

For example...? My experience is that DP is exactly like HDMI from the end user's perspective, but I'm happy to learn more.

ThatPlayer
9 replies
1d17h

ARC and CEC come to mind. ARC probably isn't too important, but CEC is nice being able to control various boxes with the same remote. Or being able to control the TV's volume with another remote.

babypuncher
8 replies
1d16h

ARC is actually pretty important, especially now that eARC is a thing. It finally decouples your receiver's HDMI capability requirements from your TV.

simoncion
7 replies
1d6h

I see no __technical__ reason why you couldn't do CEC over the AUX channel... all that's required is for the software on either end to be updated to squirt and interpret the bits, and it's a low-speed protocol, so it'll fit just fine in the AUX channel.

"ARC" seems to be "Audio over HDMI using CEC for discovery and control", which (if you've bothered to run CEC over the DisplayPort AUX channel) you get the rest automatically with DisplayPort.

However, because both CEC and ARC are HDMI standards, you bet your biscuits that the HDMI Consortium will bar anyone who wants to use HDMI ports on their devices from shipping official firm/software that does the braindead-simple thing of running CEC over DP AUX, and having ARC-compatible firm/software.

As is nearly always the case (and when it's not, it's not for long) DisplayPort is totally capable of everything HDMI is, but the HDMI Consortium stands in the way of having commercially-distributed DisplayPort "home theater" products that are compatible with the nice-to-have HDMI features.
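
For a sense of how little the AUX channel would need to carry, here's a sketch built from the public CEC tables (the bit-level EOM/ACK handshaking is ignored):

    # Raw bytes of a CEC "User Control Pressed: Volume Up" message.
    def cec_frame(initiator, destination, opcode, *operands):
        # Header block: high nibble = initiator logical address,
        # low nibble = destination logical address.
        header = (initiator & 0xF) << 4 | (destination & 0xF)
        return bytes([header, opcode, *operands])

    PLAYBACK_1 = 0x4            # e.g. a Blu-ray player
    AUDIO_SYSTEM = 0x5          # e.g. an AV receiver
    USER_CONTROL_PRESSED = 0x44
    VOLUME_UP = 0x41

    msg = cec_frame(PLAYBACK_1, AUDIO_SYSTEM, USER_CONTROL_PRESSED, VOLUME_UP)
    print(msg.hex(" "))  # -> "45 44 41": three bytes for a whole command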

jasomill
4 replies
22h49m

eARC is important not only because of cable management and CEC, but because it's the only high-bitrate digital audio output on most TVs: the only other commonly available digital audio output, TOSLINK, is limited to two-channel PCM[1], AC3, and DTS.

[1] While the physical TOSLINK cable is able to support higher channel counts via ADAT[2], I'm not aware of any TVs with ADAT support.

Similarly, while 192 kHz / 24-bit TOSLINK support is common in pro audio and audiophile gear, the standard only requires 48 kHz / 20-bit. I imagine most TVs output 48 kHz / 20-bit, if only for the sake of configuration simplicity: TOSLINK is strictly unidirectional, so automatically negotiating format support beyond mandatory minima is impossible.

[2] https://en.wikipedia.org/wiki/ADAT_Lightpipe

simoncion
3 replies
21h28m

...[eARC is] the only high bitrate digital audio output on most TVs...

Great news! DisplayPort supports:

1–8 channels, 16 or 24-bit linear PCM; 32–192 kHz sampling rate; maximum bitrate 36,864 kbit/s (4,608 kB/s)
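
That quoted maximum is just the worst-case PCM stream multiplied out; a quick check (my arithmetic):

    # 8 channels of 24-bit linear PCM at 192 kHz hits the quoted cap.
    channels, bit_depth, sample_rate = 8, 24, 192_000
    bits_per_sec = channels * bit_depth * sample_rate
    print(bits_per_sec / 1000, "kbit/s")  # -> 36864.0 kbit/s
    print(bits_per_sec / 8000, "kB/s")    # -> 4608.0 kB/s
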
babypuncher
2 replies
21h0m

But it doesn't support any of the common bitstreamed audio formats like Dolby Atmos or DTS.

eARC lets your TV pass those along to your compatible receiver even if the TV itself can't make heads or tails of the format.

simoncion
0 replies
3h59m

eARC lets your TV pass those along to your compatible receiver even if the TV itself can't make heads or tails of the format.

DisplayPort Multi Stream Transport (MST) serves that role. Given that you'd only be passing along the audio data, rather than the video data, you could save money by putting the slowest available hardware (only RBR-capable) into the receiver.

Or, the TV could "just" pass along the audio stream to any plugged-in receiver. You don't gotta have a standard for that... just a standardized way to ask the TV to do it (assuming that you want to control the behavior and not have the conditional be "Is there a receiver plugged in? Pass along the audio and don't send it to the TV speakers.").

simoncion
0 replies
17h10m

But it doesn't support ... Dolby Atmos or DTS.

No, I think you're wrong about that.

DisplayPort v1.2 also adds new audio enhancements including the following:

- Audio Copy Protection and category codes
- High definition audio formats such as Dolby MAT, DTS HD, all Blu-Ray formats, and the DRA standard from China
- Synchronization assist between audio and video, multiple audio channels, and multiple audio sink devices using Global Time Code (GTC)

MAT claims to require an Atmos-capable decoder, so that sure seems like Atmos to me. I dunno what DTS HD is, but it sounds like a souped-up DTS. Also take note of the "High definition audio formats such as" statement... that list of formats is incomplete.

The quote comes from: <https://vesa.org/press/vesa%C2%AE-introduces-displayporttm-v...>

babypuncher
1 replies
1d

When I say DisplayPort needs these features, I don't necessarily mean they need the HDMI Consortium's specific implementation of them. I see no reason VESA couldn't implement their own alternatives to CEC, ARC, Auto Lipsync, etc. These HDMI features solve a host of problems that are unique to home theater setups, so any HDMI alternative that wants to supplant it needs to also solve them as well.

Like you point out, there's no technical reason DisplayPort cannot provide similar features. The issue is the lack of any standards for them built into the DisplayPort specification. Some of these features, like CEC, are 20 years old and could easily be improved upon in an ecosystem that doesn't have to worry about backwards compatibility.

simoncion
0 replies
21h9m

When I say DisplayPort needs these features, I don't necessarily mean they need the HDMI Consortium's specific implementation of them.

Ah, okay.

Well, DisplayPort handles EDID, DDC/CI, E-DDC & friends... that's the CEC-equivalents handled. VESA does have standards for remote control of displays... it turns out that that's a thing that people want to be able to do.

The issue is the lack of any standards for them built into the DisplayPort specification.

Nah. DisplayPort already supports everything (or just about everything) CEC can do.

The reason you don't see this stuff often making its way into TVs is because the HDMI Consortium gets in the way of folks who want to add DP ports to their TVs. The reason you don't see explicit support for HDMI Consortium protocols such as CEC in VESA standards is -again- because the HDMI Consortium gets in the way of folks who want to add DP ports to their TVs... so why bother? (Especially when actually supporting the protocol is trivial... squirt exactly what you'd send over the HDMI cable over the DP AUX channel, instead.)

If it was actually politically possible to have DP ports on TVs, then you'd see some of the more esoteric aspects of CEC (like manipulating a TV tuner) be quickly standardized into the VESA equivalents... assuming that VESA didn't just say "Oh yeah, y'all just go talk CEC to these new DP-equipped TVs. You already know how.".

tverbeure
3 replies
1d20h

This is part of why DP is $$$ compared to HDMI.

Why is that more expensive?

In terms of complexity, implementing DP vs HDMI 2.1 is not materially different. They both have fixed rate links, packets, Reed-Solomon forward error correction, DSC, etc.

But IIRC DP is royalty free, and HDMI is not.

refulgentis
2 replies
1d20h

It sounds like it might be more about "the market will pay more for the packet-based approach, and we shall charge what the market will bear"

tverbeure
0 replies
1d20h

Speaking about the 'packet-based approach': the way it's presented in the popular press, as if it's similar to Ethernet, is a pretty gross distortion; the impact is not nearly as big as people seem to think.

You can check that in this presentation (https://www.vesa.org/wp-content/uploads/2011/01/ICCE-Present...), pages 32 and 33.

The majority of DP traffic is still brute force video data, interspersed with heavily packetized secondary data.

Over the years, I've spent many hours wading through DisplayPort data debug traces, and I've always wondered what people were smoking when they called it 'packetized like Ethernet'. It's just not true. (And FWIW: even old HDMI can transport secondary data packets just the way you can with DP. It's how audio-over-HDMI is done...)

autoexecbat
0 replies
1d19h

Perhaps there is just more scale for HDMI making it easier to recoup r&d for new components that implement it

kimixa
0 replies
1d16h

And HDMI is pretty much a direct digital translation of analogue CRT signalling, blanking and timing weirdness included. That is also not how modern LCDs implement their internal screen driving, so it needs a similar level of complexity to decode and convert.

I think the biggest difference in DP vs HDMI cost is simply scale - there's probably orders of magnitude more HDMI chips sold than DP.

wolfhumble
3 replies
1d20h

HDMI is not packet-based and so when it uses USB-C it takes over the entire cable (alt-mode)

What does this exactly mean? I have a USB-C 4-in-1 hub with HDMI that works at the same time as 2x USB-A 3.0 and 1x USB-C. Thanks!

izacus
2 replies
1d20h

It means that your hub uses DisplayPort to talk to the computer and then has a converter chip to convert to HDMI.

Pretty much no one used HDMI Alt Mode on USB-C, and it is now deprecated IIRC.

wolfhumble
0 replies
1d20h

Okay, thanks!

justsomehnguy
0 replies
1d20h

Reminds me of Ethernet over HDMI.

cesarb
1 replies
1d19h

HDMI is not packet-based and so when it uses USB-C it takes over the entire cable (alt-mode)

AFAIK, when HDMI is used over USB-C, it's actually DisplayPort over USB-C; while a HDMI alt mode was specified, nobody actually implemented it, everyone instead implemented the DisplayPort alt mode and then used a DP-to-HDMI converter chip whenever an HDMI output was required.

DisplayPort is packet-based and can be multiplexed with other USB-C traffic through a hub

That's the case only for USB4 (and AFAIK earlier Thunderbolt 3). Other USB-C ports with DisplayPort alt mode simply use some of the USB 3.x pairs for raw DisplayPort, and whenever the use of all four USB 3.x pairs is required for DisplayPort due to the target resolution, then only USB 2.x can be available (USB 2.x has its own dedicated pair in the cable).
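
Restating that last point as a lookup (a sketch of the parent comment's claim, not anything from a spec table):

    # What DP Alt Mode lane use leaves over for USB data. USB 2.0 always
    # survives, since it has its own dedicated pair in the cable.
    ALT_MODE = {
        "2-lane DP": {"usb3": True},
        "4-lane DP": {"usb3": False},
    }

    for config, left in ALT_MODE.items():
        usb = "USB 3.x + USB 2.0" if left["usb3"] else "USB 2.0 only"
        print(f"{config}: {usb} remains available")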

altairprime
0 replies
1d18h

Actual example of this:

Back in the Thunderbolt 3 era, it was up to each motherboard manufacturer to decide how many pairs they routed to USB alt-mode, so it's not necessarily a safe bet to depend on it working.

So, in one specific example, 4k60 over USB alt-mode DisplayPort is not supported on Apple's $4999 iMac Pro, as USB alt-mode is only assigned two pairs; however, 4k60 over Thunderbolt DisplayPort is supported, as Thunderbolt is assigned four pairs. The only way to get 4k60 out of that device is to use a Thunderbolt-only DisplayPort adapter, that has no USB mode at all.

This was resolved by USB 4 and Thunderbolt 4 both incorporating a more modern DSC (display stream compression), among other things as described above, but your mileage will vary much more wildly with USB 3.

whalesalad
0 replies
1d21h

I am using a DisplayPort-to-USB-C cable to connect an AMD GPU to an Apple display, and it works well.

gamepsys
0 replies
1d20h

You can run DP over USB-C. It's a use case they specifically support. However, if you are on the cutting edge you will be using actual DP cords. DP 2.1 goes up to 80 Gbit/s, and Thunderbolt 3 USB-C is only 40 Gbit/s. Once you start pushing high resolutions (at least 4K) and high framerates, the entire system will be bottlenecked by the cable. Top-end displays are always designed to take advantage of the latest DP port and use the entire bandwidth.

USB4 will push 120 Gbit/s. That will be enough for 8K 120 Hz / 4K 240 Hz, but not much more. This will probably be enough for high-end consumer displays and GPUs for a decade out or so.

fuzzy2
0 replies
1d20h

Actually, it is the only (display) protocol. HDMI Alt Mode, while specified, never really manifested and is effectively dead. DisplayPort is the only alternative and is more flexible anyway.

Its packet-based nature is not relevant for DisplayPort Alt Mode, by the way, because it gets dedicated pins. Sometimes enough for two lanes, sometimes enough for four. It's hilariously consumer-unfriendly. Only Thunderbolt leverages the packet stuff and can transport up to eight lanes worth of DisplayPort.

macOS does not support MST.

Who is supposed to get this? I don't know.

TedDoesntTalk
0 replies
1d20h

You can run HDMI over ethernet CAT-6 cables, too.

Cu3PO42
0 replies
1d21h

Yes. You can run "DisplayPort Alt Mode" over USB-C. You can also transport DP over Thunderbolt over USB-C.

HDMI Alt Mode also was a thing, but is now deprecated.

lucasyvas
17 replies
1d21h

This seems like a bigger deal than it looks: this flat out kills the legitimacy of HDMI in my mind. There's no reason to keep it.

baq
13 replies
1d21h

You’re right but I don’t think tv manufacturers care.

lallysingh
9 replies
1d21h

There are lots of TV manufacturers to choose from, thankfully.

olyjohn
6 replies
1d21h

Well if you find one that sells TVs with DisplayPorts, let us know.

gorjusborg
5 replies
1d20h

With all the garbage that comes on TVs now, this is just another reason to buy a monitor with an audio-out instead.

shiroiushi
4 replies
1d9h

I've never heard of an affordable 65+" monitor.

gorjusborg
3 replies
21h19m

I've never needed 65" TV :)

baq
2 replies
20h27m

Me neither, then I got one, now I need one ;)

shiroiushi
1 replies
17h30m

Yep, same here. Not going back.

I'm sure people said similar things when indoor bathrooms were a new thing, but there's always some luddite who says "we never needed those things before!"

gorjusborg
0 replies
3h57m

Everything is a trade-off.

The fact that you are comparing a larger television to having indoor plumbing shows a lack of perspective or good-faith argument on your part.

whoopdedo
0 replies
1d20h

Are there? This seems to be one of those illusion-of-choice industries where the different brands are only relabelings of the same factory reference designs.

Hamuko
0 replies
1d20h

How many TV manufacturers actually use DisplayPort?

akira2501
2 replies
1d20h

I remember when markets were driven by the will of consumers. I miss those days.

harkinian
1 replies
1d19h

To most consumers, DisplayPort is just "that thing that won't fit into my HDMI port for some reason." DP is royalty-free, but it seems like HDMI is cheaper to implement in the end because it's on every low-end device.

simoncion
0 replies
1d7h

Part of that is certainly kickbacks and other preferential deals offered by the HDMI Consortium.

It's certainly true that the majority of TV manufacturers don't actually pay royalties/licensing fees to the Consortium.

riehwvfbk
2 replies
1d21h

HDCP

Cu3PO42
1 replies
1d20h

DisplayPort supports HDCP.

riehwvfbk
0 replies
1d19h

In theory, but I think it's common for devices to only enable it on HDMI ports.

silisili
14 replies
1d21h

Serious question, what if they just ignore the Forum and do it anyways?

xnyan
4 replies
1d21h

HDMI(®) is a trademark as well as a proprietary protocol. The only way to legally use the HDMI label is with approval of the forum.

toast0
3 replies
1d21h

Just call it a 19-pin display connector in that case. Or only put in dual-mode DisplayPort connectors (DP++) and let people get passive adapters (maybe bundle them for a bit; you can probably label the passive adapters as HDMI).

notpushkin
0 replies
1d20h

Just call it a 19-pin display connector in that case

Flipper × Raspberry Pi did exactly that just a couple days ago: https://blog.flipper.net/introducing-video-game-module-power...

Video Out port: DVI-D signal in 640×480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations :shush: The first letter is H, and the last one is I.

0xedd
0 replies
1d18h

It's funny how baby mentality drives these companies. Really, writing "HDMI" is an issue? But if I write HDMl, then that's fine? Come on. Find a real way to generate revenue rather than wasting everyone's time.

0xcde4c3db
0 replies
1d20h

This is pretty much an actual strategy. The academic/crowdfunded ULX3S FPGA development board went with calling it "GPDI" (general-purpose digital interface). Other strategies include labeling the connector with a little TV/monitor icon, calling it "digital video", or just having a little diagram in the marketing/documentation material with it connected to a TV with a vaguely HDMI-shaped cable.

Crosseye_Jack
4 replies
1d21h

At best: get kicked out of the Forum and never be able to support HDMI products going forward.

At worst: get sued for breaking the NDA, lose, pay a metric fuck-ton in damages, and get kicked out of the Forum and never be able to support HDMI products going forward.

amlib
3 replies
1d20h

What if _someone_ makes an unofficial patch with HDMI 2.1 support? You could at least compile your own damn kernel with the damn thing supported.

Crosseye_Jack
2 replies
1d19h

If _someone_ released an unofficial patch for the official OSS driver it might get by, or it might get "DMCA'ed", though the DMCA might not be the correct takedown method for patent violations (I'm presuming this _someone_ released only their own code and nothing from AMD, so AMD wouldn't have a copyright claim over the code itself), but hey, that's never stopped companies misusing the DMCA in the past! To file a valid counter notice to a DMCA, that _someone_ would have to give details which the Forum could then use to sue the publisher of the patch.

But yeah, even if the patch somehow was made public and it wasn't nuked from orbit, ongoing support and bug fixes would be a pain in the ass. (Because, as an example, no one from AMD would be allowed to touch the "patch code".)

<edit> If AMD's planned patch was leaked, for example: as AMD had not officially released it, it's not yet "open source" and, because of that, not yet public, and I'm sure there will be a clause in the terms that states that AMD would have to go on the offensive to get their code removed from any public repos. </edit>

When it comes to lawfare, The Forum wouldn't even have to be in the right (in a legal sense), just have a big enough war chest to make everyone else's life a pain in the ass!

amlib
1 replies
1d18h

Someone else in the comments has suggested that an entity or someone in a country like France, where copyright laws are less strict, might be able to write and maintain the patch without fear of prosecution, but as you said, those guys may somehow still make life a pain in the ass :(

userbinator
0 replies
1d14h

China would be a much better place for that, and Russia probably a close second.

rodgerd
1 replies
1d21h

"HDMI" is a trademark. You can't claim to offer HDMI in any of your products.

The patent pools around approximately all the codecs used for media delivery are heavily cross-licensed. That includes HDMI and HDCP, but also h.264 and h.265. Most likely AMD can't legally use hardware decoding or encoding of any popular codecs at that point. Good luck with game video, streaming, or playing media discs.

So it would cost AMD - for example - their entire PlayStation/XBox business. At a minimum.

dsp_person
0 replies
1d19h

What if they just replaced "HDMI" with "Mystery Port" in the specs and marketing materials?

mperham
0 replies
1d21h

They will get sued and lose.

baby_souffle
0 replies
1d21h

Serious question, what if they just ignore the Forum and do it anyways?

Probably makes it tricky to get the HDMI Forum's blessing for any future devices. AFAIK, the HDMI standards are public-ish, so anybody can create a device that is HDMI compatible, but you're only allowed to put "HDMI" on the packaging / marketing material with the Forum's blessing.

Cu3PO42
7 replies
1d21h

This is upsetting on principle alone. But I'm also upset because I have my PC hooked up to my TV for couch gaming and was hoping to do that on Linux sooner rather than later when HDR works properly. I wasn't aware HDMI 2.1 was another problem... I guess Windows is sticking around for that.

boomskats
2 replies
1d19h

HDMI 2.0 is capable of doing 4K @ 120 Hz, but only up to YCbCr 4:2:0 8-bit instead of the full YCbCr 4:4:4 10-bit (though I'm not sure how close Wayland is to 10-bit support). It will be fine for gaming, just not for everyday PC use.
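
A back-of-the-envelope check of why (my assumptions: ~14.4 Gbit/s of effective HDMI 2.0 payload after 8b/10b TMDS encoding, blanking overhead ignored):

    # Which 4K120 pixel formats fit in HDMI 2.0's effective payload.
    EFFECTIVE_GBPS = 14.4

    def rate_gbps(width, height, fps, bits_per_sample, subsampling):
        # Average samples per pixel: 4:4:4 = 3, 4:2:2 = 2, 4:2:0 = 1.5.
        samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[subsampling]
        return width * height * fps * bits_per_sample * samples / 1e9

    for bits, sub in [(10, "4:4:4"), (8, "4:4:4"), (8, "4:2:0")]:
        r = rate_gbps(3840, 2160, 120, bits, sub)
        verdict = "fits" if r <= EFFECTIVE_GBPS else "does not fit"
        print(f"{sub} {bits}-bit: {r:4.1f} Gbit/s, {verdict}")
    # -> 29.9 (no), 23.9 (no), 11.9 (yes)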

FWIW I also didn't realise this until just now. I've been running my desktop at 4k@120hz recently for the buttery smooth neovide, but have been noticing that text rendering, especially syntax-highlighted text, looks awful. I'd seen the same oled panel render text way better in WSL/Windows (both using a custom pixel geometry[0] via Mactype, but also without), so I spent more time than I'm willing to admit to wrapping my head around custom pixel layouts and hinting in freetype. But no, turns out it was this all along.

If you want to see the effect of 420 vs 444 chroma subsampling on text rendering, this writeup[1] has some great test images and is well worth a read. Also, if you happen to have an LG OLED panel, you can get a little debug window that confirms your signal format and refresh rate, by pressing the green button on the remote 7-8 times.

[0]: https://github.com/snowie2000/mactype/issues/720 [1]: https://www.rtings.com/tv/learn/chroma-subsampling

Cu3PO42
0 replies
1d13h

I once had a bad cable, which forced a 4k@60 monitor into chroma subsampling. Or rather Windows decided it would rather enable Chroma subsampling than drop resolution or framerate.

I immediately noticed there was something wrong. I agree it's terrible for desktop use. I would probably also have blamed freetype if it only happened on Linux, though.

macNchz
1 replies
1d20h

FWIW I have my Linux desktop hooked up to a 4k/120hz TV via a $30 CableMatters DisplayPort->HDMI dongle that seemingly supports most of the relevant functionality, as far as I can tell. It even works very smoothly over a 50 foot active fiber optic HDMI cable. There is a ton of discussion about these dongles in the issue thread linked in the article.

Cu3PO42
0 replies
1d13h

That's an excellent point. Coincidentally I also have a 15m active fiber optic HDMI cable. If and when this is the last blocker, I'll definitely look at available adapters!

0xedd
1 replies
1d18h

There are different cable converters available. That said, if the hardware is user-hostile, don't use it. What are birthdays for, if not throwing out trash and replacing it?

Cu3PO42
0 replies
1d13h

What is the hardware I would be replacing here? My TV? I'm not aware of any options with DP support. My GPU? I only just replaced my 3070 Ti with a 7900 XTX because of the open source drivers and much better Linux support. The converters are an excellent point though. Not sure why I didn't think of them...

nimish
4 replies
1d19h

Hopefully someone leaks the spec. Maybe it's already available?

Note that the latest VESA specs are also restricted.

FWIW, it's easily possible to sniff the control channel of an HDMI or DP connection. At that point one could attempt to reverse engineer the enabling features.

Manabu-eo
3 replies
1d18h

There was an 80 GB leak of Nvidia driver and firmware source code by Lapsus$ not too long ago. Isn't an HDMI 2.1 implementation part of it?

indrora
2 replies
1d17h

Acknowledging these documents can taint your mind for life if you're not careful.

In the emulator scene, there is a set of documents that describe, at deep and intimate levels, the inner workings of the N64, released by a disgruntled SGI employee somewhere on USENET. It is common knowledge that reading those documents taints you from working on basically any graphics or game related source for the rest of your life.

This doesn't just apply for leaked things. There's people who've worked on the Deep Parts of Windows, MacOS, etc. that they are basically barred from making contributions to certain open source projects (e.g. Wine, AsahiLinux) as anything they do would likely involve secrets that are tainted with knowledge from their former employer.

Every graphics, emulator, game engine, and embedded guru on the planet has watched the Gigaleaks out of Nintendo with caution, as they now have to be VERY careful where some things come from. If someone reads code from The Gigaleak, then contributes code to an emulator, the emulator may be tainted.

This came to a head when the PowerVR SGX drivers were leaked ( https://www.phoronix.com/news/MTg0NTQ ) and several developers eyes were burned as a result.

userbinator
0 replies
1d14h

Only the truly stupid would do this stuff under their real identities. Then again, a lot of OSS contribution is for ego anyway.

This came to a head when the PowerVR SGX drivers were leaked

...and some far-East modding communities managed to make unofficial Windows 9x and XP drivers using that. They will of course not tell you who they really are.

firen777
0 replies
1d17h

I understand the "clean room" reason behind avoiding tainting one's mind, but in court how do people distinguish plagiarism from "convergent evolution"?

extraduder_ire
4 replies
1d16h

Does anyone know what the HDMI forum want to achieve by locking down the spec like this? I can't remember any reasons given back when they decided to restrict publication of future spec updates.

weberer
2 replies
1d9h

Well Microsoft is one of the members...

TechnicalVault
1 replies
23h28m

Microsoft is pretty pro-Linux these days. I'd say it's some lawyer being overly paranoid.

prmoustache
0 replies
7h11m

acknowledging you can't make a dent in all markets without linux support != being pro-linux

justinclift
0 replies
1d16h

Does anyone know what the HDMI forum want to achieve ...

Wild arse guess, but it's probably a) as close to absolute control as they can get, and b) security (of some variety) through obscurity

betaby
3 replies
1d20h

Since, say, France has no software patents, can that work be done in France and released thereafter? (Thinking of the VLC example now.)

regularfry
0 replies
7h23m

I think it's unlikely to be a patent issue. More likely a contractual one. If you sign up to the HDMI Forum to get access to the spec, you will have to sign away certain rights. Those contract terms will have been put in place by the same people who think HDCP is a good idea.

If that's the case, someone needs to either leak the spec or reverse engineer the driver. One or the other will happen eventually.

nextaccountic
0 replies
1d20h

I think that nouveau can do that. But AMD can't touch this code because it is a US company.

colordrops
0 replies
1d20h

Would probably involve retaliation outside of France if they did this.

coryfklein
2 replies
1d19h

Can AMD fork HDMI, add in driver support, and call it their own standard that just so happens to be compatible with HDMI?

I'm sure they wouldn't make such a move lightly, but is there something fundamental here about the laws or specifications that prevents them from doing this?

justinclift
0 replies
1d16h

Might be a patent mine field?

dvdkon
0 replies
1d19h

I think they could, simply by IP laws, but they have a contract with the HDMI Forum that probably doesn't allow that. They could cancel it (and reverse-engineer HDMI instead of going by the NDA'd spec), but I don't think they'd do that just for Linux support.

thomastjeffery
1 replies
1d20h

The HDMI forum ought to be subject to anti-trust action.

justinclift
0 replies
1d16h

Racketeering and/or cartel behaviour maybe?

megous
1 replies
1d19h

Is this about the movie industry again?

esarbe
0 replies
1d19h

Yes.

anticensor
1 replies
1d11h

At that point, just send DP signal over HDMI cable...

paulmd
0 replies
10h40m

The funny thing is that this exists: FreeSync over HDMI works via precisely this mechanism (sending a DP signal over an HDMI cable), since it predates the official introduction of HDMI Forum VRR, lol

water9
0 replies
1d18h

This is what makes me wanna pirate things

userbinator
0 replies
1d14h

Take their closed-source driver, run it through a decompiler and get an LLM to "understand" it, then write a "cleanroom" implementation? Might be one of the few things this new AI stuff is actually good for.

nojvek
0 replies
22h44m

TIL: decoding HDMI signals is illegal. I had no idea. I assumed it was an open protocol.

But I guess I learnt something new today.

dusted
0 replies
1d20h

wow, HDMI really wants to die that badly?

cultureulterior
0 replies
1d20h

Seems like you could have a separate shell organization that built this functionality by looking at existing drivers, and just provide the patch to the kernel.

brandonr49
0 replies
1d3h

And just like that I never purchased HDMI based products again.