
Show HN: OBS Live-streaming with 120ms latency

Anunayj
30 replies
1d14h

Why is low-latency livestreaming so hard, while at the same time cloud gaming tech like Nvidia GameStream can deliver such a flawless experience?

I've used Moonlight + Nvidia GameStream with ~40ms RTT and couldn't feel a difference in competitive shooters, so total latency must be pretty low.

Does it have something to do with the bandwidth requirements? (1 stream vs. potentially hundreds)

Hikikomori
8 replies
1d10h

There's no way people can't tell the difference; I can, with various streaming methods from my PC to my Shield/TV, wired, in the same house. Mouse-to-photon latency of a good PC will be in the range of 10-20ms; best case you're doubling or tripling that.

booi
7 replies
1d9h

I can feel 40ms for sure and there’s no way you play a competitive shooter with a 40ms delay. Hell even a Bluetooth mouse gets annoying.

Maybe if you’re playing Microsoft Office it’s ok.

Guillaume86
4 replies
1d7h

Nah, to be fair it's fine for a lot of games, which are also played on old-gen consoles with terrible gamepad-to-TV latency. Sure, twitchy multiplayer games are definitely not among them. I'm not big on competitive multiplayer, only Rocket League, and I can't do that over local streaming. Pretty much everything else I play is OK though.

ravenhappy
3 replies
1d7h

You, my dear Internet friend, are confidently expressing your lack of experience. No one who has played multiplayer LAN games, or low-latency Internet games, could or would ever say that streamed gaming, such as the dead Stadia, or Moonlight, whatever, is comparable to the alternative. Nah, they couldn't.

the-smug-one
1 reply
1d6h

I don't think that I could feel the difference between 40ms and 10ms RTT when playing something like DOTA2 or AoE2.

Hikikomori
0 replies
1d6h

Most online games use client-side prediction, so any input made by the client happens almost instantly on the client, which feels really good, and can be rolled back if the server disagrees. If you stream your game remotely with 40ms it will add 40ms to your input, and that just feels bad (not to mention jitter, especially if you're on semi-congested wifi), but it's not unplayable or even that noticeable in many games. Would I play some casual Dota like that? Sure. But not high-ranked games.
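The prediction-and-rollback loop described above can be sketched in a few lines. This is a toy model; names like `Player` and `apply_input` are illustrative, not from any real engine:

```python
# Toy model of client-side prediction with rollback. `Player` and
# `apply_input` are illustrative names, not from any real engine.

class Player:
    def __init__(self):
        self.x = 0

def apply_input(state, dx):
    state.x += dx

state = Player()
pending = {}  # seq -> input, kept until the server acknowledges it

# The client applies each input immediately for instant local feedback.
for seq, dx in enumerate([1, 1, 1]):
    pending[seq] = dx
    apply_input(state, dx)

# Later the server acknowledges inputs up to seq 1 with its authoritative
# position. The client rolls back to the server state and replays the
# inputs the server has not seen yet.
server_ack_seq, server_x = 1, 2
state.x = server_x
for seq in sorted(pending):
    if seq > server_ack_seq:
        apply_input(state, pending[seq])

assert state.x == 3  # local view converges with unacked inputs reapplied
```

Remote game streaming gets no benefit from this trick: the video frame itself travels the network, so the full RTT lands on every input.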

Guillaume86
0 replies
1d1h

You're conflating local streaming with internet streaming, and I specifically excluded twitchy multiplayer games...

toast0
0 replies
1d2h

Maybe if you’re playing Microsoft Office it’s ok.

You're not going to be able to do the best combos with that kind of latency, but I guess it's ok for mid-level play.

fx1994
0 replies
1d5h

Yeah, I just feel the lag, even on monitors claiming 1ms. I can feel it while playing FPS games and it's really annoying; if a game isn't fluid I won't play it.

user_7832
7 replies
1d13h

Slightly tangential, do you think Moonlight (with say Sunshine) is good enough for proper work? I've used a few "second screens" apps like spacedesk on my iPad but generally when the resolution is good enough for text, it's too laggy for scrolling (and vice-versa).

(For more details, I'm planning to stream from an old laptop after turning it into a hackintosh. I'm hoping staying on the home network's going to help with latency.)

aappleby
3 replies
1d13h

Absolutely yes, especially if you use a GPU with a recent video encoder.

I have all my PCs connected to each other via Moonlight+Sunshine and the latency on the local network is unnoticeable. I code on my Linux workstation from my laptop, play games on the gaming PC from my workstation, etcetera and it is all basically perfect.

user_7832
2 replies
1d12h

Thank you! When you say GPU with a recent video encoder, do you mean them separately (i.e. don't use software/cpu streaming; and use an efficient encoder), or do you mean use a GPU that supports recent encoders? I'm afraid my Intel HD 520 isn't particularly new and likely doesn't support modern encoders.

aappleby
1 reply
1d11h

A GPU with a dedicated encoder for x264 or preferably x265/AV1. You can do without it, but you'll spend a core or two on software encoding and the overhead will add a few tens of ms of lag.

With full hardware capture and encode (default on Windows, can require tweaking on Linux) it's virtually free resource-wise.

ThatPlayer
0 replies
1d7h

You're off by one letter: the codec is h264/h265. x264 and x265 are the CPU software encoders for those codecs.

tymscar
2 replies
1d12h

Without a doubt! I use Moonlight ten+ hours a week. I never use it for gaming, and it has never failed once.

user_7832
1 reply
1d12h

Thanks, that's great to hear/know! Would you be okay sharing your hardware (CPU/GPU) setup on the server (Sunshine) side? Thanks!

tymscar
0 replies
1d12h

Yes. I host both on my NixOS desktop that has a 13900KF and an RTX4080 (using nvenc and AV1), as well as from my MacBook Pro M3 Pro.

est
6 replies
1d12h

The GPU does the transcoding, builds the network packets, and copies data via PCIe, all in hardware, avoiding memory copies.

OBS+WebRTC is mostly software doing the heavy lifting.

Imagine if the camera could build WebRTC UDP packets directly and zero-copy them to the NIC; that would lower latency quite a bit.

KeplerBoy
4 replies
1d9h

I wouldn't be surprised to learn that Nvidia is doing exactly that in their cloud: compressing the video on the GPU using NVENC, building a packet around it, then passing it to a NIC under the same PCIe switch (Mellanox used to call that PeerDirect) and sending it on its way.

The tech is all there, it just requires some arcane knowledge.

kierank
1 reply
22h50m

This is premature optimisation. The bus bandwidth and latency needed to get a few Mbps of compressed video to the PC is microscopic. It's completely unnecessary to lock yourself into NVIDIA just to create some UDP packets.

KeplerBoy
0 replies
13h55m

I was talking about Nvidia's Cloud gaming offer (GeForce Now). For them it's certainly not a premature optimization.

imtringued
1 reply
1d8h

"arcane knowledge" is too strong of a phrase. You need someone who is familiar with Nvidia hardware and is willing to write software that only works on Nvidia hardware.

KeplerBoy
0 replies
1d8h

It is arcane in the sense that information on how all of this works on their specific hardware is not publicly available, but it's probably widespread internally.

sharpshadow
0 replies
1d6h

Exactly this: with "…NVIDIA GPUDirect for Video, IO devices are fully synchronized with the GPU and the CPU to minimize wasting cycles copying data between device drivers".[1]

1. https://developer.nvidia.com/gpudirectforvideo

spongebobstoes
1 reply
1d13h

TL;DR: there are a lot of moving pieces, but people are working on it at the moment. I'll try to summarize some of the challenges below.

Bandwidth requirements are a big one. For broadcasts you want your assets to be cacheable in the CDN and on the device, and without custom edge code + client code + a custom media package, that means traditional URLs, each containing a short (e.g. 2s) mp4 segment of the stream.

The container format used is typically mp4, and you cannot write the mp4 metadata without knowing the size of each frame, which you don't know until encoding finishes. Let's call this "segment packaging latency".

To avoid this, it's necessary to use (and typically invent) a new protocol other than DASH/HLS + mp4. You also need cache logic on the CDN to handle this new format.

For smooth playback without interruptions, devices want to buffer as much as possible, especially for unreliable connections. Let's call this "playback buffer latency".

Playback buffer latency can be minimized by writing a custom playback client, it's just a lot of work.

Then there is the ABR part, where a manifest is fetched that contains a list of all available bitrates. This needs to be updated, and devices need to fetch it and then fetch the next content. Let's call this "manifest RTT latency".

Lastly (?) there is the latency from video encoding itself. For the most efficient encoding / highest quality, B-frames should be used. But those are "lookahead" frames, and a typical 3 frame lookahead already adds ~50 ms at 60fps. Not to mention the milliseconds spent doing the encoding calculations themselves.
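The lookahead figure above checks out with simple frame-duration arithmetic:

```python
# At 60 fps each frame lasts 1000/60 ≈ 16.7 ms, so a 3-frame B-frame
# lookahead delays output by ~50 ms before any encoding work is counted.
fps = 60
lookahead_frames = 3
lookahead_ms = lookahead_frames * 1000 / fps
print(round(lookahead_ms))  # → 50
```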

Big players are rewriting large parts of the stack to have lower latency, including inventing new protocols other than DASH/HLS for streaming, to avoid the manifest RTT latency hit.

mannyv
0 replies
1d5h

For HLS you can use MPEG-TS, but mp4 is also an option (with the problem you talk about).

IMO one of the issues is that transcoding to lower resolutions usually happens on the server side. That takes time. If the client transcoded, that latency would (mostly) go away.

wmf
0 replies
1d13h

Cloud gaming is streaming from a server in a data center to one nearby client. Twitch-style live streaming is from a client, to a data center, to a CDN, to multiple clients.

redox99
0 replies
1d13h

A lot of it is buffering to work around crappy connections. Cloud gaming requires low latency so buffering is kept to a minimum.

numpad0
0 replies
1d12h

Because there are so many middlemen in series buffering frames, and also because the access circuit between the user terminal and the nearest CDN is jittery too. The latency must be a few times the max jitter for a par-for-the-course user experience.
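A rough sketch of that rule of thumb; the safety factor of 3 here is an assumed value for illustration, not from any spec:

```python
# Rule-of-thumb sizing: the playout buffer must absorb a few multiples
# of the worst-case jitter, or frames arrive after their play deadline.
def playout_delay_ms(max_jitter_ms, safety_factor=3):
    return max_jitter_ms * safety_factor

# e.g. 15 ms of worst-case jitter implies ~45 ms of buffering delay
# before network RTT, encode, and decode latency are even counted.
assert playout_delay_ms(15) == 45
```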

mavamaarten
0 replies
1d12h

In my head the answer is simple: Moonlight is a one-to-one stream, while broadcasting to many clients at once is a whole different setup.

Sean-Der
27 replies
1d6h

I created Broadcast Box originally as a reference server to test OBS against. It made it way easier for people to test my WebRTC/WHIP PRs. Seeing people use it, I'm seeing the benefits/excitement more.

* Low latency means you have a relationship with your audience. These intimate broadcasts are a new medium.

* Simulcast means it is way cheaper to run a streaming site. No more running ffmpeg/generating transcodes server side.

* AV1/H265/Opus means users with lower bandwidth can now broadcast. Users with enough bandwidth can stream at quality levels they couldn’t before

* UDP gives us IRL/Roaming streams. No custom setup for re-connects.

* Multi-track lets you send multiple video feeds or languages at once

* E2E Encryption means that P2P distribution could be a thing

dangoodmanUT
8 replies
1d5h

Just poring through the code, one thing I think it's missing is transcoding to multiple (lower) resolutions as well, so weaker connections can watch.

Is that correct? I took a brief look at the WHIP protocol and it seems like maybe it's just a matter of converting the frames before writing?

Sean-Der
6 replies
1d4h

This is all done client side. OBS sends up multiple renditions, the server is just in charge of forwarding the specific layer. I think this is better in a few ways.

* Lower latency - The extra decode + encode adds enough latency that you start to lose the real-time latency.

* Better Quality - You get generational loss from transcoding; you get better quality encoding only once. Streaming services also optimize for cost. Having broadcasters control the encoding quality of everything makes for a better experience.

* Security/Trust - Servers shouldn't be able to modify video at all. I would like for broadcast services to eventually offer E2E encryption. It feels wrong to me that a streaming service is able to modify video however it pleases.
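The layer-forwarding idea above can be sketched as follows. This is a toy model with made-up rendition names and bitrates, not Broadcast Box's actual logic:

```python
# Toy model of simulcast layer forwarding in an SFU: the broadcaster
# uploads several pre-encoded renditions; the server never transcodes,
# it only chooses which rendition's packets to forward to each viewer.

LAYERS = {  # made-up rendition names and bitrates (kbps)
    "high": 6000,
    "mid": 2500,
    "low": 800,
}

def pick_layer(viewer_kbps):
    """Highest-bitrate layer that fits the viewer's estimated bandwidth."""
    fitting = [name for name, kbps in LAYERS.items() if kbps <= viewer_kbps]
    return max(fitting, key=LAYERS.get) if fitting else "low"

assert pick_layer(10_000) == "high"
assert pick_layer(3_000) == "mid"
assert pick_layer(500) == "low"  # below every layer: send the smallest
```

Because the server only selects among already-encoded streams, no decode/encode step (and none of its latency or generational loss) appears in the forwarding path.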

ohthatsnotright
3 replies
1d3h

It feels wrong to me that a streaming service is able to modify video however it pleases.

This would significantly complicate services such as Twitch and YT Live moving to server-side ad insertion if the source video were E2E encrypted. I think server-side ad insertion is likely the only method available to providers that can't be circumvented by ad-block or DNS-blocking plugins.

spockz
1 reply
21h2m

Why? The advertisements could still be added as separate video layers. Sure, they'd also be easier to circumvent. But I'd rather support something like this than end up with real-time edited video to insert ads, or even worse have "influencers" promote something without ever having really tried it.

ohthatsnotright
0 replies
7h36m

If it's a separate layer or different server you can bet that someone will figure out how to remove it from the DOM, hide it, or prevent it from even loading.

pjc50
0 replies
6h28m

Again, I think that's a "feature" a lot of users would prefer to design out.

pjc50
0 replies
6h26m

I would like for broadcast services to eventually offer E2E encryption

Is this even conceptually possible? I suppose you could sign the stream, but if you want to hide it how would you prevent the server from simply adding itself as a viewer?

(also, if you do this, start a countdown for getting raided as a CSAM distributor)

mannyv
0 replies
23h23m

There is by definition a layer of trust between the broadcaster, the server, and the client.

That said, you could use TLS from the server to the client, which is what many services do (since it's actually quite difficult to use http these days for streamed content due to browser policies).

mannyv
0 replies
1d5h

I believe that's handled on the OBS side.

oarsinsync
7 replies
1d2h

Is there a reason why I'd want to use Broadcast Box instead of mediamtx, which features RTMP, RTSP, WebRTC/WHIP, and (LL)HLS in a single small binary?

Sean-Der
5 replies
1d

mediamtx is a great project! Use whatever works best for your use case. I just encourage people to use:

* Free Software

* Project is operated/managed by individuals

* Use Open Standards/Protocols

thnkman
2 replies
23h7m

I think this is a highly underrated comment. It seems today people won't create for free any more, just for the joy of creating and distributing something they care about. It's always "please donate x to y", or "subscribe to get access to premium content".

Sorry for hijacking your comment :)

tekknik
0 replies
7h42m

this seems like a reasonable comment on its face, but then it reads selfishly. why should a developer donate time when a house isn’t donated to that programmer?

jack_pp
0 replies
18h46m

The economy is shit, what do you expect? Very few humans have the financial luxury to release free software and not try to monetize it at all.

oarsinsync
1 reply
10h45m

mediamtx is a great project! Use w/e works best for your use case.

I agree, I'm just struggling to differentiate what BB offers that MMTX does not, so I can identify if there's a USP. If it's a passion project to scratch a personal itch, that's also great!

Also, can you share your latency measuring methodology?

jprjr_
0 replies
3h36m

I think Broadcast Box may have implemented WHIP/WHEP before MediaMTX so it tends to be one of the first results when looking up a WHIP-capable server. Plus it has a public instance so if you want to test WHIP real quick, it's pretty easy to just point OBS somewhere instead.

MediaMTX used to be "rtsp-simple-server" which really undersold what it was capable of (even back then it could ingest RTMP and output HLS and WebRTC).

Overall I think MediaMTX has more features and can do everything that Broadcast Box can.

pjc50
0 replies
6h0m

I didn't know this existed, so thanks.

breck
2 replies
1d1h

These intimate broadcasts are a new medium.

You hit the nail on the head here. I mean, live streaming is already quite big, but it seems like it's going to go up 10x-100x in the next few years.

It's really an amazing thing that makes the Internet more human.

tekknik
1 reply
7h37m

I feel the opposite. I’ve watched streaming as my primary form of media for over 10 years now, and it seems a majority of the small, “intimate” broadcasters that were fun to watch have had to get real jobs, or are sailing the world, or what have you. To me, it seems like it’s dying in favor of short form doomscroll videos.

breck
0 replies
39m

I’ve watched streaming as my primary form of media for over 10 years now

My goodness, you are living in the future. What else are you into? Where can I follow you (blog, twitter, warpcast, etc?)? Love to follow early adopters.

Flockster
2 replies
1d3h

How does it work with H265? My last info was that WebRTC only supports H264 and VP8. It would be great to stream H265 via WebRTC.

Sean-Der
0 replies
1d3h

H265 support is available in Chrome now! Launch with these flags

`--enable-features=WebRtcAllowH265Receive --force-fieldtrials=WebRTC-Video-H26xPacketBuffer/Enabled`

I haven't used it myself yet.

SahAssar
0 replies
1d3h

WebRTC is codec agnostic as far as I know, it's up to the peers to negotiate the codecs used. Of course browsers will have more narrow support, but can sometimes activate more codecs via flags or you can use a non-browser client.

mapcars
1 replies
20h12m

Hi, this looks great but I tried to do the setup as described in the readme using OBS and streaming to your server and I saw 3-4 seconds delay, how exactly can I reach sub-second?

Sean-Der
0 replies
19h10m

What are your encoder settings? Mind joining the Discord? Easier to debug; I don't actively check HN, sorry :/

mannyv
1 reply
1d5h

I always wondered why broadcasters don't transcode on the client side.

RTMP can't handle it, but SRT (and apparently WebRTC) can. It would reduce latency for sure. Of course, it does require a good network connection on the client end, but so does streaming.

pjc50
0 replies
6h25m

I suspect it's a desire to minimise CPU/GPU usage, especially if you're trying to stream a game from a PC through OBS.

kragen
11 replies
1d11h

you know, almost every time i try to talk with my family on jitsi, there's some kind of glitch. they can't see my screen, or i can't see theirs, or they can see it but only in super low resolution, or they have the camera turned on but i can't see it, or we all get kicked off, or something. can broadcast box allow us to use obs studio (or some other free, open-source software) to stream to each other, without relying on a proprietary server? i don't need 100k+ clients, i'd be satisfied with reliable connectivity between 2–4 clients! and i could run a server outside of nat

i'm not going to get 120ms latency though. i'm in argentina, they're mostly in the usa, and i have 200+ milliseconds of latency over the internet to anything in the usa

if broadcast box isn't what i'm looking for, is there something else? i already know about zoom, google, and teams, but those all make us vulnerable to proprietary servers

taskforcegemini
3 replies
1d11h

is your jitsi self-hosted? if so, where is it located?

kragen
2 replies
1d11h

i'm using jitsi.milliways.info, but i don't host it

jonathantf2
1 replies
1d3h

looks like that's hosted in DE - might not be the best latency? would try self hosting it on a box in Florida or something to try and get the best connection to you both

kragen
0 replies
1d2h

latency is totally tolerable, so i'm pretty sure the audio and video are flowing directly between our machines and not via .de

Sean-Der
3 replies
1d6h

Lots of options! More and more WebRTC SFUs are adding WHIP support. Check out https://galene.org/ - it lets you do OBS in and has things like chat. Might be better if you are streaming to family and want chat. I would try and find a VPS that is best geographically located.

Cloudflare if you want this quick. Even though it is proprietary you have no vendor lock in, just change your WHIP URL in OBS if things aren’t going right.

kragen
2 replies
1d2h

apparently 'an sfu' is 'is a media server component capable of receiving multiple media streams and then deciding which of these media streams should be sent to which participants [of which the] main use is in supporting group calls and live streaming/broadcast scenarios.' according to https://bloggeek.me/webrtcglossary/sfu/

i suspect webrtc implementations in browsers are the source of many of the problems but i don't know how to debug that

mannyv
1 reply
21h6m

In HLS the client decides what stream to use. I presume in WebRTC it’s a negotiation.

Now I have to read the protocol. I was thinking the SFU was just a proxy, but it sounds more like it also handles some signaling.

kragen
0 replies
19h5m

i think in webrtc your code can decide what stream to use or to transmit in a turing-complete way; it doesn't define anything like that

bubblesnort
1 reply
1d11h

Jitsi is actually one of the better options. My only beef is that it got bloated with features I don't need.

kragen
0 replies
1d11h

i suspect that the problems with jitsi are maybe problems with webrtc in general, because i've experienced very similar things on the proprietary services. my limited experience with obs studio, though, has been flawless, and it seems to be the main daily driver for lots of people who make their living streaming?

gorbypark
0 replies
1d11h

Having discovered this 15 minutes ago I can't really speak in absolutes, but firing up OBS and using their test server, I think this would allow you to do exactly what you want. It's trivial to get OBS set up to stream to this, and while I haven't tried it, the docs say that it supports multiple streams to the same stream key. I guess it would just show each stream side by side in the client?

I think you could run your own server, run an instance of the front end and distribute the instructions to setup OBS to your family.

gkhartman
11 replies
2d11h

As someone unfamiliar with video broadcasting latencies, how does this compare to alternatives? Also, what are the hardware specs used to achieve the 120ms measurement?

Sean-Der
10 replies
2d2h

The alternative today is ~a few seconds.

This is from my T420 ThinkPad on Linux. The only significant input I can think of is my RTT to the server.

```
Reply from 165.227.221.230: bytes=32 time=37ms TTL=49
Reply from 165.227.221.230: bytes=32 time=37ms TTL=49
```

Terretta
7 replies
2d

Some firms were broadcasting live feeds with sub-second (~200ms) latencies to broad audiences not long after the dot com bust.

There've always been (a majority of) broadcasters that had seconds-to-half-minute (!) delays. Very few understood, or even today understand, how to tune every single touchpoint (including what's safe to do, and what one must not do).

Having an ultra low latency “video toaster” / broadcast mixer is a critical piece for sure.

For us the motivations to work on this started with "backstage feeds" synced with live TV that needed to look simultaneous to the home cable viewer (think World Wrestling Entertainment) as well as Wall Street calls that needed to be (perceptually) in sync with dial-ins. And of course, Times Square NYE ball drops!

In reality, almost nothing matters that way. For the vast majority of content, the viewer is consuming only your stream, without some more real-time channel at the same time. In most cases, the viewer cares far more about visual quality than about delay.

Sean-Der
2 replies
2d

It drives me crazy that we have had the technical ability to do these things, but just not the demand. I believe it is a chicken/egg problem. It is one of those things that you don't get until you try it yourself.

I hope if more broadcasters (technical and non-technical) realize what is possible they will ask for it more!

For vast majority of content, the viewer is consuming only your stream

Agree! I think it is that way because you can't have intimate/interactive streams yet. When it becomes possible I hope to see more streaming to smaller/more connected audiences.

willis936
0 replies
1d5h

You see it everywhere you look. Audio is my favorite pet example. We hit the limit of human perceptibility in the 1980s. The marketing continued to push for two more decades. Now we have a situation where the average consumer is less educated and the enthusiasts are the least educated of all.

We push technical advance until it hits a cliff in returns. We will never get there. There will be no 10 million mile drivetrain in a car. There will be no multi-generational fridge. There will be no house that withstands a tornado.

Software feels like it should be different because it's "free", but really any advance beyond the cliff is a great gift and not something to take for granted.

throwup238
0 replies
1d13h

I don't think it's a chicken and egg problem, it's an ease of use problem on the side of the provider.

An HLS stream is so simple to set up with passable performance that few other protocols can compete.

ta1243
0 replies
1d1h

If I'm watching the World Cup final penalty shootout and I hear my neighbours cheering before I'm watching the player run up, that basically breaks the entire game.

Some streaming services are 30 seconds plus behind OTA; at that level Twitter and half a dozen news apps have pushed the fact that the goal went in while the ball is still in the other half.

mannyv
0 replies
1d13h

The advent of live betting means that they have to ensure that live viewers don't have too much of an edge vs stream viewers.

kierank
0 replies
1d6h

Backstage feeds go in uncompressed via HD-SDI and have a latency of lines (microseconds to single digit milliseconds). Totally different ballgame.

dylan604
0 replies
1d14h

Cable has a large latency too, in the seconds range, compared to OTA broadcasts.

Even live news has a large latency with remote interviews. Some of them are so bad it's uncomfortable watching the anchor have the patience to wait for a response before stepping on the remote feed.

jjbinx007
1 reply
1d13h

Vdo.ninja is also webrtc and low latency but I'm not sure of exact numbers.

Sean-Der
0 replies
1d5h

vdo.ninja is a fantastic project, can’t recommend it enough.

It made the dream of ‘co-streaming’ available to home users and not just those with hardware/pro setups.

freedomben
8 replies
1d2h

Much more of a meta question, but why is live-streaming preferred by some people?

Live streaming just seems to have so many downsides to me:

1. Requires real-time presence

2. No editing (meaning less efficient use of time for the viewer)

3. No client-side speeding up/skipping irrelevant parts

4. No possibility of index or table of contents

What are some use cases where live streaming is better?

fwip
1 reply
1d2h

Interaction with viewers is the primary one.

There is also the "event" nature of streaming, where the audience is excited to react to things together in real time, like when people tweet while watching the Super Bowl or a show's season finale. Even for mundane streams, the "chat" will happily talk amongst themselves, reacting to whatever the streamer is doing, which can feel like hanging out with friends.

There are also times when the streamer is discussing something that is happening now or just happened (world events, the Super Bowl, a game update, etc.), where viewers are excited about the content right now and don't want to wait for the traditional record->edit->release cycle.

freedomben
0 replies
1d2h

Ah thank you, this is the missing piece! That makes a lot of sense.

soulofmischief
0 replies
1d2h

Some viewers prefer livestreaming, and we see streamers catering to those audiences. It depends on the streamer, but oftentimes there is a much stronger sense of community when there is a live, topic-oriented chat, especially when the streamer engages with it. This is not something you can satisfactorily replicate with prerecorded videos.

Streaming is also much cheaper to produce; editing can often represent an unwanted source of complexity and a loss of creative control.

ratedgene
0 replies
1d2h

Live interaction with a community is pretty big.

pjc50
0 replies
6h12m

The very important thing in modern internet media: parasociality with the audience. Only by being live can you have a conversation with "chat", whether that's individual chatters or the hivemind of very large audience streams.

There is a special skill of streamers who can play an action game on one half of the screen and read enough of chat on the other half to make and respond to jokes, at the same time.

Another consideration: youtube and twitch apply different music licensing considerations to live, so almost anything that involves music MUST be live and unarchived.

jmyeet
0 replies
1d1h

Think of the difference between watching a live sporting event or a concert vs watching a recording of the same event. The latter might satisfy you but there's something different about the former.

Live streaming is inherently more interactive, and there's a shared experience simply because you can't speed it up.

I guess you could say that live-streaming is experience-based and VODs are results-based. Not strictly the case, but there's a trend.

femtozer
0 replies
1d2h

I see one significant upside: combined with a chat, you can ask the content creator questions directly.

bookofjoe
0 replies
23h40m

FWIW my YouTube live streams that go on for hours sometimes attract a total of 10-50 viewers who click in for even a second.

No one EVER interacts.

My 15-second YouTube Shorts of my cat get around 400 views, sometimes as many as 10,000!

Go figure.

filleokus
7 replies
1d12h

What's the state of the art in distributing WebRTC to 100k+ clients?

When I was more into the low latency streaming space a few years ago, it felt like WebRTC was there when it came to << 1 second latency, but the infrastructure to actually distribute it was not really. I think Cloudflare (and maybe some other vendors) were working on creating some standard, has it landed? Can I run my own horizontally scalable WebRTC broadcaster (are there open source implementations)?

Something like Low-Latency HLS or CMAF was at like < 5 second latency, but was on the other hand stupidly easy to distribute widely (just static files on a plain old CDN / http server).

disqard
2 replies
1d12h

I'm unaware of WebRTC being used for this sort of "multicast" context, especially with such a high "fanout" -- all I've used/built are p2p contexts like video/data.

Not knowledgeable here, but also interested if someone actually knows the answer to your question...

rattt
0 replies
14h9m

Microsoft's now-shut-down Mixer (formerly Beam.pro) primarily used WebRTC and at one point had streams with up to 100k viewers, all via WebRTC, but sadly I can't find much info about it anymore.

gigachadbro
0 replies
1d6h

I was previously involved in a service where WebRTC was used for a fairly large number of N streams to M viewers: ~75 streams total and 500+ viewers. The solution space was public-sector incident command (think police officers with multiple body cameras streaming to a command center).

Sean-Der
2 replies
1d6h

Open Source doesn’t have an instant scalability offering yet. I hope someone takes it on!

Quite a few companies offer this a service though. Cloudflare, Twitch’s IVS, Dolby Millicast, Phenix RTS…

LiveKit did a write up on how they built theirs https://blog.livekit.io/scaling-webrtc-with-distributed-mesh...

mannyv
1 reply
1d4h

Interesting that "a particular session can only be hosted by one instance".

Why is that? Is that a protocol issue with WebRTC, or an implementation issue with the way WebRTC servers are written?

There are lots of ways to share session state across servers.

Sean-Der
0 replies
1d4h

To build this quickly I would chain Broadcast Box instances. Have it look like an n-ary tree and spin up another Broadcast Box instance when you need it.

`OBS -> Broadcast Box -> Broadcast Box -> Broadcast Box -> Viewer`

To start I would do WHIP between the servers as well. More optimization can be done, but it would be a good quick start.
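The capacity of such a relay tree grows geometrically with depth. A quick sketch; the fanout of 100 per instance is an assumed figure, not a measured Broadcast Box limit:

```python
# A relay tree with per-instance fanout f and depth d reaches f**d
# viewers at the leaves, at the cost of d forwarding hops of latency.
def leaf_viewers(fanout, depth):
    return fanout ** depth

assert leaf_viewers(100, 1) == 100        # one server, direct viewers
assert leaf_viewers(100, 2) == 10_000     # one layer of relays
assert leaf_viewers(100, 3) == 1_000_000  # two layers of relays
```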

gorbypark
0 replies
1d11h

I think the state of the art in this space is WHIP/WHEP, which is what this project implements. I'm guessing 100k+ clients would be pretty difficult for Broadcast Box (out of the box), however Cloudflare has a beta stream service that has the tagline "Sub-second latency live streaming (using WHIP) and playback (using WHEP) to unlimited concurrent viewers." https://developers.cloudflare.com/stream/webrtc-beta/

ranger_danger
6 replies
1d3h

IMO this has the same problem inherent to WebRTC itself: a large percentage of users in the world are behind symmetric or CG-NAT, so TURN relay servers are needed, but they are never actually defined by the application because there are no good free ones. Personally I've never been able to use any WebRTC-enabled service for this reason, nor has anyone else I know.

Sean-Der
3 replies
1d3h

My public instance runs on a world addressable host and doesn't have TURN. I have no problem using it from my phone and it is behind a CG NAT. No TURN/Relay servers are needed.

I've never been able to use any WebRTC-enabled service for this reason, nor anyone else I know

Is this a browser/agent issue? You aren't able to use Google Meet or Zoom in the browser?

ranger_danger
2 replies
1d3h

Live-streaming in one direction doesn't need a relay because the users aren't uploading... And Google/Zoom use their own relays.

Sean-Der
1 replies
1d2h

Google doesn't use any STUN/TURN servers; I just went and confirmed. The following is the RTCConfiguration used:

```
https://meet.google.com/***-***-**, { iceServers: [], iceTransportPolicy: all, bundlePolicy: max-bundle, rtcpMuxPolicy: require, iceCandidatePoolSize: 0 }
```

ranger_danger
0 replies
1d2h

It may not use one if it isn't explicitly needed; there could be out-of-band detection that adds a TURN server or some other OOB relay when required. I've also seen comments online saying that Chrome in particular can support TCP for WebRTC, which would negate the need for a relay (normally only DTLS over UDP is used). But based on my understanding of how WebRTC and NAT work, the typical UDP approach to bidirectional communication over symmetric NAT or CGNAT absolutely should not work, barring some other method of NAT traversal such as a browser-based UPnP client.
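To make the NAT-traversal machinery concrete: a STUN Binding Request (RFC 5389) is what a WebRTC agent sends to discover its public mapping, and on a symmetric NAT that mapping differs per destination, which is exactly why STUN alone fails there. A minimal sketch that only builds the packet (no network I/O):

```python
import os
import struct

STUN_BINDING_REQUEST = 0x0001
MAGIC_COOKIE = 0x2112A442  # fixed value defined by RFC 5389

def build_binding_request() -> bytes:
    """Build a STUN Binding Request with an empty attribute body.
    The server's response carries XOR-MAPPED-ADDRESS, i.e. the public
    ip:port your NAT assigned to this socket."""
    transaction_id = os.urandom(12)  # 96-bit random transaction ID
    # Header: type (2 bytes), body length (2 bytes), magic cookie (4 bytes),
    # then the transaction ID -- 20 bytes total.
    return struct.pack("!HHI", STUN_BINDING_REQUEST, 0, MAGIC_COOKIE) + transaction_id

pkt = build_binding_request()  # send this over UDP to a STUN server
```

On a full-cone NAT the discovered mapping is reusable toward any peer; on a symmetric NAT it is only valid toward the STUN server itself, which is the failure mode described above.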

toast0
1 replies
1d2h

Good, free TURN servers probably don't exist because the protocol is basically an open proxy as a service --- who wants to run that without limits?

But something like this is a selective forwarder, so as long as you can find someplace to host it that can get an ip:port to listen on, all your users/compatriots should be able to connect to it, and you're ready to go.

ranger_danger
0 replies
1d2h

There are free VPN services though. But yes, you're right; I forgot to qualify my statement: I was referring to bidirectional WebRTC... which I had assumed the author was referring to as well, since they explicitly mentioned "WebRTC comes with P2P technology".

imtringued
1 replies
1d8h

My ancient latency claims of at most ~200ms (in practice ~100ms) are based around this test:

https://humanbenchmark.com/tests/reactiontime

Yes. Me clicking the mouse, the click being transmitted to a datacenter, software rendering the web browser using AVX2, encoding the stream, sending it to the local browser, decoding it on screen, photons reaching my eyes, and me clicking the button a second time (which also needs to be transmitted to the datacenter) gets me around ~400ms over WebRTC on the reaction-time benchmark vs ~200ms on the local computer. I'm not even trying. It's a janky-as-hell solution that is about to fall apart the moment you look at it funny.
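A rough way to reason about that gap is as a per-stage budget. The numbers below are illustrative placeholders, not measurements from the setup above:

```python
# Illustrative round-trip budget for a remote-rendered session. These
# per-stage figures are made-up placeholders for reasoning, not
# measurements from any real system.
budget_ms = {
    "input capture + uplink": 20,
    "server-side render": 16,
    "video encode": 8,
    "downlink delivery": 20,
    "client decode": 8,
    "display scan-out": 8,
}

def added_latency(budget: dict[str, int]) -> int:
    """Extra delay the streaming path adds on top of playing locally."""
    return sum(budget.values())

print(added_latency(budget_ms), "ms added on top of local latency")
```

Each stage looks small in isolation; it's the sum of the whole pipeline, plus jitter on every hop, that produces the difference you feel on the benchmark.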

Also, I hate ffmpeg for streams that last longer than a day. The latency creep of streams lasting weeks is horrible.

ta1243
0 replies
1d1h

I've had streams lasting years without any latency creep using professional encoders

half-kh-hacker
1 replies
1d4h

We've shown that many measurements of latency [...] ignore the full capture and playback pipeline

In the repo linked in OP is a screenshot showing a wall clock next to its playback on the streaming site -- that's end-to-end to me. So how is this relevant?

kierank
0 replies
1d2h

Because latency is a distribution and these photos are often selected at the best-case P0 end of all the encode/decode processes whereas actually what matters is the worst case P99.

A proper implementation will make sure the worst-case latency is accounted for and not cherry-pick the best case.
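The distribution argument is easy to demonstrate with synthetic data: the best-case frame can look great in a photo while the tail is several times worse.

```python
import statistics

# Synthetic per-frame glass-to-glass latencies in ms: mostly steady,
# with an occasional retransmit/keyframe spike in the tail.
samples = [60] * 90 + [120] * 8 + [450] * 2

p0 = min(samples)                               # the cherry-picked photo
p50 = statistics.median(samples)
p99 = statistics.quantiles(samples, n=100)[98]  # 99th percentile

print(f"P0={p0}ms  P50={p50}ms  P99={p99}ms")
```

A wall-clock photo samples a single point from this distribution, and nothing stops that point landing at P0.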

to11mtm
0 replies
16h27m

Can you show a connection between this site and the (AFAIK separate from UK OBE/OBS) Open Broadcaster Software itself?

While the page provides some insightful comments, I don't see how it relates to implementation details beyond "they did not build a hardware card".

thih9
2 replies
1d7h

Assuming I run this on a vps, what would be the bandwidth cost?

cheschire
1 replies
1d7h

Eight.

Could you add more to your question? Where did you already do research? What providers are you considering? How much bandwidth is “free” with the tier of VPS you’re paying for? Etc

This is a question that sales reps spend lots of work hours trying to help clients work out.
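For a back-of-envelope answer, egress scales as bitrate × viewers × duration. The bitrate and $/GB below are placeholder assumptions, not any provider's real pricing:

```python
def egress_gb(bitrate_mbps: float, viewers: int, hours: float) -> float:
    """Approximate outbound traffic in GB for one live stream
    (decimal units, ignoring protocol overhead)."""
    megabits = bitrate_mbps * hours * 3600 * viewers
    return megabits / 8 / 1000  # megabits -> megabytes -> gigabytes

# e.g. a 5 Mbps stream watched by 10 viewers for 2 hours
traffic = egress_gb(5, 10, 2)  # ~45 GB
cost = traffic * 0.01          # at an assumed $0.01/GB egress rate
```

Since every viewer gets their own WebRTC stream from your VPS, viewer count multiplies the bill directly; that's why the "free bandwidth" tier of your VPS plan matters so much.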

blitzar
0 replies
1d4h

Clearly the answer is 43.

monocularvision
2 replies
1d3h

I see the following statement in the README:

“You could also use P2P to pull other broadcasters into your stream. No special configuration or servers required anymore to get sub-second co-streams.”

I currently have this setup for doing a co-stream with a friend and it is terrible:

1. Friend runs OBS to capture his gameplay.

2. Friend's OBS streams to a Raspberry Pi running at my house.

3. The Raspberry Pi runs nginx configured to accept the RTMP stream.

4. I run OBS on another machine to capture my gameplay, add overlays, etc.

5. My OBS has an input source using VLC to capture the stream from the Raspberry Pi.

The setup is awful. Video is pretty delayed and it often just stops working. I would love to look into this project but after reading through the README, I am unclear how I would use this for my setup. Any pointers?

Sean-Der
1 replies
1d3h

I would have your friend stream to Broadcast Box. You can run it on the Raspberry Pi!

Then I would pull that source into your OBS. You have two options

* Pull Broadcast Box as a browser source

* Use my PR that adds WHEP Sources https://github.com/obsproject/obs-studio/pull/10353

For now I would use the Browser Source (it's easier, no custom builds needed). In the future, once WHEP is merged, that will be the way to go. If you get stuck, jump in the Discord and I'm happy to help debug! https://discord.gg/An5jjhNUE3

monocularvision
0 replies
1d

Sounds great! We will give that a try. Already hopped on the Discord, I will let you know how it goes.

mckirk
2 replies
1d5h

If this does what it says it does, I'll be a very happy user. Playing RPGs 'together' with somebody over the internet isn't much fun if they are a second or more behind what's going on. I actually looked for a solution to this problem (low-latency P2P streaming) quite some time ago and couldn't get it to work with just OBS because of strange bugs and other issues, so I really appreciate you including this use case :)

mrguyorama
0 replies
1d5h

Steam has this functionality built in, including streaming keyboard, mouse, and controllers over the open internet. It's called "Remote Play Together".

Sean-Der
0 replies
1d4h

I am excited for you to use it!

If you run into bugs or have any cool ideas join the discord and would love to chat https://discord.gg/An5jjhNUE3

eqvinox
2 replies
1d9h

Sub-second Latency

Broadcast Box uses WebRTC for broadcast and playback. By using WebRTC instead of RTMP and HLS you get the fastest experience possible.

Nothing in RTMP prevents you from achieving low latency; it's the software stack around it that determines latency for both RTMP and WebRTC. Only HLS has some built-in deficiencies that cause extra latency.
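The HLS deficiency is easy to quantify: classic (non-low-latency) HLS players start playback a few target durations behind the live edge, so segment length sets a latency floor. A sketch with typical, assumed defaults:

```python
def hls_latency_floor_s(segment_s: float, buffered_segments: int = 3) -> float:
    """Rough glass-to-glass floor for classic (non-low-latency) HLS:
    players begin playback several target durations behind the live edge."""
    return segment_s * buffered_segments

print(hls_latency_floor_s(6))  # classic 6 s segments
print(hls_latency_floor_s(1))  # aggressive 1 s segments
```

Even aggressive tuning leaves a multi-second floor, which is why sub-second targets push people to RTP-based transports like WebRTC instead.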

imtringued
0 replies
1d8h

WebRTC is basically just signalling around RTP in a browser-compatible manner. The problem with RTMP is that nobody uses Flash-based video players anymore, so for browser playback the "RTMP ingest plus HLS delivery" bundle is non-negotiable.

Sean-Der
0 replies
1d6h

I agree, if you are in a perfect network! If you have any packet loss, changes in throughput, or roaming it breaks down.

For me, RTMP is preventing that next generation of use cases.

piyushtechsavy
0 replies
1d9h

Broadcasting over WebRTC can work well when there is a single source. But in a scenario with multiple sources, say a conference with 100 participants, will it still be smooth?

nubinetwork
0 replies
1d6h

So can I use this to record multiple streamers, add commentary and tracker overlays, and stream it back to Twitch? I've been wanting to make something like this for a while...

iod
0 replies
1d2h

People interested in this project might also be interested in Cloudflare's WebRTC streaming service¹ as a cloud-hosted solution to this same problem: "Sub-second latency live streaming (using WHIP) and playback (using WHEP) to unlimited concurrent viewers." Using the same OBS WHIP plugin, you can just point to Cloudflare instead. Their target pricing model is $1 per 1000 minutes,² which equates to $0.06 per hour streamed.

¹ https://developers.cloudflare.com/stream/webrtc-beta

² https://blog.cloudflare.com/webrtc-whip-whep-cloudflare-stre...
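Quick sanity check of that conversion:

```python
# Sanity-check the quoted pricing: $1 per 1000 delivered minutes.
price_per_minute = 1 / 1000
price_per_hour = round(price_per_minute * 60, 2)
print(f"${price_per_hour:.2f} per streamed hour")
```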

Laesx
0 replies
1d11h

I've been using OvenMediaEngine for years, which seems to do the same thing as this, but this looks pretty promising. I'll give it a shot.