So that is "432 Mbit/s per laser, and 9000 lasers total". I don't know about you guys, but I find that statement much more relatable than "42 PB/day". Interestingly, they also say each laser "can sustain a 100Gbps connection per link" (although another part of the article even claims 200 Gbit/s). That means each laser is grossly underused on average, at 0.432% of its maximum capacity. Which makes sense, since 100 Gbit/s is probably only achievable in ideal situations (e.g. two satellites very close to each other), so these laser links are used in bursts, and each link stays established only for a few tens of seconds or minutes, until the satellites move apart and are no longer within line of sight of each other.
And with 2.3M customers, that's an average of 1.7 Mbit/s per customer, or about 550 GB per customer per month, which is kinda high. The average American internet user probably consumes less than 100 GB/month. (HN readers are probably outliers; I consume about 1 TB/month.)
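For anyone who wants to sanity-check that arithmetic, here is a quick Python back-of-envelope. The inputs are just the figures quoted above (42 PB/day, 9000 lasers, 2.3M customers), so treat the outputs as rough:

    # Back-of-envelope check of the figures quoted above.
    PB = 1e15                       # petabyte (decimal)
    daily_bytes = 42 * PB           # "42 PB/day" from the article
    lasers = 9000
    customers = 2.3e6
    laser_peak_bps = 100e9          # claimed 100 Gbit/s per link

    total_bps = daily_bytes * 8 / 86_400            # average aggregate throughput
    per_laser_bps = total_bps / lasers
    per_customer_bps = total_bps / customers
    per_customer_month_gb = daily_bytes / customers * 30 / 1e9

    print(f"aggregate:    {total_bps / 1e12:.2f} Tbit/s")
    print(f"per laser:    {per_laser_bps / 1e6:.0f} Mbit/s ({per_laser_bps / laser_peak_bps:.3%} of 100 Gbit/s)")
    print(f"per customer: {per_customer_bps / 1e6:.2f} Mbit/s, ~{per_customer_month_gb:.0f} GB/month")

That comes out to roughly 3.9 Tbit/s aggregate, 432 Mbit/s per laser (0.432%), and 1.7 Mbit/s or ~550 GB per customer per month, matching the numbers above.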
Netflix uses 3-7 GB an hour. The average person spends 4-5 hrs a day watching TV. I'd say most are above 100 GB/month.
But that’s me.
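The same kind of quick multiplication supports that claim; the per-hour and hours-per-day figures here are just the ones quoted above:

    # Rough monthly usage range from the Netflix figures quoted above.
    gb_per_hour = (3, 7)        # Netflix usage per hour
    hours_per_day = (4, 5)      # average daily viewing
    low  = gb_per_hour[0] * hours_per_day[0] * 30
    high = gb_per_hour[1] * hours_per_day[1] * 30
    print(f"~{low}-{high} GB/month")   # ~360-1050 GB/month, well above 100 GB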
Who has 4-5 hrs a day to watch television? ..or am I completely out of touch?
_actively_ watch? Probably not many. Having it on as background noise however? 5 hours is pretty easy
Is that still a thing with young people? I associate leaving the TV on in the background as an older generation thing.
I'm middle aged. This kind of background noise sounds terrible to me.
Maybe I just grew up in a quiet place.
John Von Neumann liked to do math with the TV on as background noise. Genius.
I heard all those stories about Von Neumann working like that. According to a biography, his wife once designated a room as his office and he became very angry about that since it was too quiet for him to work there.
Personally I need almost complete silence in order to get anything done, his abilities in this regard always fascinated me.
Same and same and same, but I know exactly why I won't leave the telly on - I'm very susceptible. It grabs me. Even though I have no interest in ads or even 95% of programming. It's not a pleasant feeling.
I grew up being accustomed to having the TV as background noise but stopped watching it when I moved out. Now, when I visit my parents, it's honestly quite difficult for me to focus on conversation - there's a machine in the corner making deliberately attention-grabbing sights and sounds. So I think your experience is normal & I empathise with the generation that complained about TV ruining family life.
When you have permanent background noise in the form of Tinnitus already it's an improvement.
It is terrible, yet quite a few people would rather have that distracting noise than hear their own thoughts.
For some it is just the illusion of having more people around them, though.
There are people who like to hear other humans blathering on all around them. Then there are sane people.
We're outnumbered.
If Millennial still = young, then yeah: YT or something in the background on the TV, doing something on the laptop (dev, photo editing, or other), and occasionally the phone over the laptop as well to reply to chats and stuff.
I would kill for some decent high res wide fov AR glasses.
Millennials are currently in their late 20s to early 40s.
I mean I use YouTube let's plays or Twitch streams for that, but yes, it's still a thing.
It's a normie thing coping with unbearable ringing emptiness of mind. My sister and niece (7yo) do it.
watch, or leave running as background noise …
Any recommendations for shows that make good background noise? I wish they had more concerts.
For me it depends on what I'm doing. During working hours I have Soma FM playing at a low volume. Otherwise I'll probably have cooking videos or history documentaries playing as the background noise.
the ones i see most often are: The Office, Futurama, Simpsons, Friends, Brooklyn 99, How It's Made (my favourite)
kids these days mostly use youtube or twitch for background noise i think
Happy British baking children! I don't know what it's called, but it is on Netflix and they are indeed happy, British, and they bake. Or just put on Asianometry if you need to focus a bit more. I must have been through his back catalog a dozen times at this point. There is something about that man's voice that helps me relax and focus like nothing else.
Shouldn't audio (radio) suffice for that?
"What's a radio?" the kids ask.
According to historical Nielsen data[1] from 1991 to 2009: most Americans.
Even back to 1950, for per household data, it was above 4 hours.
[1] https://www.nielsen.com/insights/2009/average-tv-viewing-for...
That's per household, not per person. That's different. And households have also tended to get smaller.
From 1991 on, the data was 4+ hours per person (older than 2), and 7-8 hours per household.
They didn't have per person for the 1950 to 1990 data, only household (pdf in the link).
Children, sadly.
Families sharing an internet connection. Kids watch 1 or 2 hours each, mom and dad another hour each.
Yep, but that data originates from the provider's network and never leaves the provider's network, so they probably don't count it towards your usage the same way.
I don't think that breaks net neutrality either, which the FCC seems to be reimplementing
Edit: see https://openconnect.netflix.com/en/
This obviously has no relevance for starlink which does not have local datacenters for cdn purposes. All that bandwidth is going through the satellites right before it reaches the user.
I wouldn't be surprised if Starlink at least experiments with making the satellites a big bunch of CDN nodes.
Imagine they put 10TB of flash memory on the satellites and run virtual machines for the big CDN companies (cloudflare, Google, Netflix etc).
I reckon that 10TB is still big enough to service a good little chunk of internet traffic.
I guess the problem is that, most of the time, the most useful bits of that 10 TB are going to be somewhere far away from the target audience.
You have to share that 10 TB with everything on that satellite's orbit.
Definitely sounds like a no-brainer / reasonable next step.
Most ISPs have CDN appliances in their racks to save on uplink bandwidth. And from a satellite perspective, the uplink (in this scenario: the downlink from the satellite to the gateway) definitely is the expensive bottleneck.
You want to avoid congestion and every bit of caching could be helpful.
Then it comes down to the mass and power budget (and the reliability of flash drives in space) - but that doesn't seem too terrible.
All my data usage is over LTE and NR. On one line it mostly gets used for streaming video (YouTube,plex,twitch) and averages around 500GB/mo. I rent a line to a friend and he's doing over 10TB/mo on mostly machine learning stuff and astronomy data.
T-Mobile absolutely counts all data used over the network, my voice lines go QCI 9 (they are normally QCI 6) when over 50GB of any kind of data usage each month, the home internet lines are always QCI 9. I don't have congestion in my area so it does not affect my speeds. This is QoS prioritization that happens at physical sector level on the tower(s).
They absolutely count it the same way. Comcast just gives me a number for bytes used, with a limit of (IIRC) 1.2TB above which they start metering. Our family of four dances around hitting that basically every month. The biggest consumer actually isn't video; it's my teenage gamer's propensity for huge game downloads (also giant mod packs that then break his setup and force a reinstall of the original content).
I think a few hundred GB for a typical cord-cut household is about right.
Do you have a source on the 4-5 hrs?
https://www.statista.com/statistics/420791/daily-video-conte...
300+ minutes a day for TV + vMOD (streaming services). Since no one actually watches TV anymore, at least not through traditional TV, I summed them.
I think the average Instagram or TikTok user must be using more than 100GB/month. And if you count YouTube and Netflix, it's probably more than that.
Is resolution going to peak? Like speeding on a highway are there diminishing returns? On the other hand, bandwidth availability seems to also drive demand...
It should. At some point you are beyond any difference a human eye can detect on a tv or monitor you’re sitting less than 10ft away from.
It probably won’t though because capitalism means there has to be a reason to sell you a new widget and 3D was an utter failure.
Not for a while. Apple Vision / Oculus will stream (4K/8K) 3D movies.
https://developer.apple.com/streaming/examples/
Two things:
Resolution is always determined by angular resolution at viewing distance, even for analog TVs (they were smaller and further away). And also,
Video on the Internet is always heavily compressed - the "resolution" is just the output size passed to the decoder and the inverse of the smallest pattern size recorded within, technically not related to data size. Raw video is h * v * bpp * fps, and has always been anywhere from a few to a dozen Gbps (rough numbers sketched below).
Just my bet: the bandwidth may peak or see a plateau, but resolution could continue to grow as needed, e.g. for digital signage video walls that wrap around buildings.
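For a feel of the raw vs. streamed gap, here is a rough calc. The 24 bpp / 60 fps / streaming-bitrate numbers are illustrative assumptions, not figures from the article:

    # Raw (uncompressed) video bitrate: width x height x bits-per-pixel x frame rate.
    def raw_bitrate_gbps(width, height, bpp=24, fps=60):
        return width * height * bpp * fps / 1e9

    print(f"1080p60 raw: {raw_bitrate_gbps(1920, 1080):.1f} Gbit/s")   # ~3 Gbit/s
    print(f"2160p60 raw: {raw_bitrate_gbps(3840, 2160):.1f} Gbit/s")   # ~12 Gbit/s
    # Typical 4K streaming sits on the order of 15-25 Mbit/s,
    # i.e. hundreds of times smaller than the raw signal.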
Sure, but "4k" is still being used as a differentiator for streaming companies in how much they charge. Even then they serve up some pretty compressed streams where there's room to do less of that for a noticeable notch in quality.
There's of course a limit. The "native" bitrate equivalent of your retina isn't infinite.
Next step though is going to be lightfield displays (each "pixel" is actually a tiny display with a lens that produces "real" 3D images) and I assume that will be a thing, we shall see if it does better than the last generation of 3D TVs/movies/etc. That's a big bump in bitrate.
There's also bitrate for things like game/general computing screen streaming where you need lots of overhead to make the latency work, you can't buffer several seconds of that.
The next gen sci-fi of more integrated sensory experiences is certainly going to be a thing eventually too. Who knows how much information that will need.
When more bandwidth becomes available, new things become possible, sometimes that are hard to imagine before somebody gets bored and tries to figure it out.
When I'm futzing around with ML models, I'm loading tens of gigabytes from disk into memory. Eventually something like that, and things orders of magnitude larger, will probably be streamed over the network like nothing. PCIe 4.0 x16 is, what, 32 GB/s? Why not that over a network link for every device in the house in 10 years?
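For what it's worth, the ~32 GB/s figure checks out from the nominal PCIe link parameters (per direction, ignoring protocol overhead above the line encoding):

    # Sanity check of "PCIe 4.0 x16 is ~32 GB/s" from nominal link parameters.
    gt_per_s = 16            # PCIe 4.0: 16 GT/s per lane
    encoding = 128 / 130     # 128b/130b line encoding
    lanes    = 16
    gb_per_s = gt_per_s * encoding / 8 * lanes
    print(f"PCIe 4.0 x16: ~{gb_per_s:.1f} GB/s per direction")   # ~31.5 GB/s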
This is being downvoted but it's probably about right.
My smart TV used 483 GB in the last 30 days
Most customers aren't served by lasers, their data goes up to the satellite and down to the nearest gateway. Lasers serve customers out of range of a downlink gateway, and the traffic probably travels the minimum hops needed to get to one.
But with lasers, it makes sense to route your packets via space. For example, traffic to a different continent would be faster (and cheaper) through space. Furthermore, I assume the lasers have more capacity than the gateways, so they could increase the capacity of one satellite by bundling it with more gateways.
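The "faster through space" part is plausible from simple speed-of-light math. Here is a sketch with made-up but representative numbers (a ~5,600 km hop, 550 km altitude, fiber route 20% longer than the great circle, ignoring switching delays and hop geometry):

    # One-way latency, intercontinental fiber vs. a LEO laser path (illustrative numbers).
    C_VACUUM = 299_792   # km/s, laser links in vacuum
    C_FIBER  = 200_000   # km/s, roughly 2/3 c in glass

    ground_km, altitude_km, earth_r_km = 5_600, 550, 6_371
    fiber_km = ground_km * 1.2                                             # real routes aren't great circles
    space_km = 2 * altitude_km + ground_km * (earth_r_km + altitude_km) / earth_r_km

    print(f"fiber: {fiber_km / C_FIBER * 1000:.0f} ms one way")    # ~34 ms
    print(f"laser: {space_km / C_VACUUM * 1000:.0f} ms one way")   # ~24 ms

The gap only grows with distance, since the vacuum vs. glass speed difference is multiplicative.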
I thought that Starlink always "landed" to a base station back in the same jurisdiction? I think relaying through space could open a regulatory can of worms.
What kind of worms?
All countries have strict regulations on radio waves, whether that's sending or receiving. The UK for example requires a license for base stations that stipulates things like geographical boundaries, etc.
You can't freely blast radio waves into a country without falling subject to its varying regulations, but the regulations for "pre Starlink" satellite broadband/phones/etc are fairly well established.
Well, maybe it makes sense for US customers to send their traffic down from Starlink in Canada and then via fiber to the USA? I don't really see the problem if the traffic is encrypted and forwarded.
Bypassing spying, geofencing and other regulatory stuff, perhaps? Also curious what the can of worms might be.
Unfortunately, the routing to make this feasible doesn't exist yet. Users need a single IP address from a range that's homed at a single PoP. Starlink doesn't support user-to-user connections through the mesh; you need to go all the way out to your PoP, then over to the other user's PoP, then back through Starlink to that user.
The way Starlink satellites are in orbit, the same satellites will remain "ahead" and "behind" you in the orbital plane. Those laser links (specifically!) will remain relatively persistent. This arrangement is similar to Iridium FYI.
FTA: "in some cases, the links can also be maintained for weeks at a time"
I think there is a lot of variance. The article also states about 266,141 “laser acquisitions” per day, which, if every laser link stayed up for the exact same amount of time, with 9000 lasers, means the average link remains established for a little less than an hour: 9000 (lasers) / 266141 (daily acquisitions) * 24 * 60 = 49 minutes
So some links may stay established for weeks, but some only for a few minutes?
I would guess that the links between satellites in the same orbit stay up for weeks, but the ones that cross between orbits have to be constantly re-established.
Correct.
I believe Starlink (like Iridium) doesn't even try to establish connections "across the seam," ie the one place the satellites in the adjacent plane are coming head on at orbital speed.
This makes side-linking easier because the relative velocity is comparatively low, but in general you unavoidably still need to switch side-link satellites (on one side) twice per orbit. Hence the 49 minutes: this average must be calculated per connection, not per second, so the long-lived front/back links (plus random noise) count less, and only drag the average from 45 minutes up to 49 minutes.
The slide showing the multiple possible paths traffic can take seems to disagree with this statement?
The first slide says "9000+", suggesting that the number of space lasers is slightly over 9000. I feel like that's an important distinction.
Most likely it's a reference to the "it's over 9000!" meme.
Data might get counted multiple times as it takes many laser hops to reach its destination.
Good point.
Even if they seem grossly underused, there are probably plenty of other non-ideal constraints, like power usage for instance.
Thermal management is also a tremendous problem in space. All power generated must be radiated away, and satellites effectively sit inside a vacuum insulator.
I'd be interested in what the sustained power/thermal budget of the satellites is.
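I haven't seen published numbers either, but the Stefan-Boltzmann law gives a feel for why thermal management is hard. All the inputs here are hypothetical: 5 kW of dissipated power, radiator held at 300 K, emissivity 0.9, radiating to deep space, ignoring solar and Earth heat input:

    # Rough radiator sizing via Stefan-Boltzmann: P = emissivity * sigma * A * T^4
    SIGMA = 5.67e-8            # W / (m^2 K^4)
    power_w, temp_k, emissivity = 5_000, 300, 0.9
    area_m2 = power_w / (emissivity * SIGMA * temp_k ** 4)
    print(f"radiator area needed: ~{area_m2:.0f} m^2")   # ~12 m^2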
That means each laser is grossly underused on average, at 0.432% of its maximum capacity. Which makes sense since 100 Gbit/s is probably achievable in ideal situations (eg. 2 satellites very close to each other), so these laser links are used in bursts and the link stays established only for a few tens of seconds or minutes, until the satellites move away and no longer are within line of sight of each other.
I think I agree that each laser is grossly underused on average, but if you read the article, there are quotes about the uptime of these links. They're definitely not just "used in bursts [of] a few tens of seconds or minutes".
Yeah 1TB seems average for anyone in IT who is really into data.
I'm kinda pissed there is no local ISP competition in my area... and I've tried reaching out to companies with little success... or they say "we're expanding to your area soon" but will not say when.
10 Gbit symmetric fiber isn't hard. Hell, I'd use more bandwidth if I could, but I'm stuck with no fiber atm.
Don't forget that every communication protocol has fixed and variable overhead.
The first is a function of the packet structure. It can be calculated by simply dividing the payload capacity of a packet by the total number of bits transmitted for that same packet.
Variable overhead is more complex. It has to do with transactions, negotiations, retries, etc.
For example, while the theoretical overhead of TCP/IP is on the order of 5%, actual overhead can be as high as 20% under certain circumstances. In other words, up to 20% of the bits transmitted are not data payload but rather the cost of doing business.
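The ~5% fixed-overhead figure is easy to reproduce for the common case, assuming a full 1500-byte Ethernet MTU, IPv4 and TCP headers with no options, and no VLAN tag:

    # Fixed per-packet overhead for a full-size TCP/IP packet over Ethernet.
    preamble_ifg = 8 + 12      # preamble/SFD + inter-frame gap (on the wire)
    eth_header, eth_fcs = 14, 4
    ip_header, tcp_header = 20, 20

    mtu     = 1500
    payload = mtu - ip_header - tcp_header                  # 1460 bytes of application data
    on_wire = preamble_ifg + eth_header + mtu + eth_fcs     # 1538 bytes actually transmitted

    efficiency = payload / on_wire
    print(f"efficiency: {efficiency:.1%}, fixed overhead: {1 - efficiency:.1%}")
    # ~94.9% / ~5.1%, in line with the ~5% theoretical figure above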
My parents moved in and, being old, stream TV all day (instead of cable) and end up using about 40 GB per day with 1080p. We keep hitting our max of 1.2 TB set by our cable company (because there are others in the home!).
I should probably see if my router can bandwidth-limit their MAC addresses...
Dead internet theory (alive and well!)
There's probably redundancy in the links. In other words, A sends 1 MB to B, which sends it to C; that's 1 MB of information transmitted to customers but 2 MB of laser transmission.
I'd have guessed they count "delivered bytes", not "transmitted bytes", and then you need to take into account each leg of the transfer. Which for Starlink is at least two (for the simple bent-pipe situation) and up to potentially something like ?20? (for a "halfway around the globe, totally Starlink" connection). The latter is probably statistically negligible, but even the factor of two would give ~2% utility. Which, taking into account that at least 2/3 of the orbit time is spent out of reach of anywhere useful, would give something like 1 in 10 possible bytes being transmitted. Which is much better than I'd have guessed if asked blindly.
Where did you get that 100GB/mo number from? 4K streaming eats up data transfer quickly. Comcrap & friends knew what they were doing making arbitrary data caps that sounded like a big number at the time. Wireline data caps should be illegal.
There is one key issue: keeping the lasers aligned for long durations between satellites, and even between a satellite and a ground station. There are vibrations in satellites, and even a tiny bit of that vibration translates into beam misalignment. I am not an expert though. That could explain the bursts.
So it's hard to sustain the theoretical 100 Gbps connection for hours, let alone days, across two endpoints that are in constant motion.
If you stream, you use a lot more than 100 GB/month. I use around 1 TB with a family of 3.
"Customer" may refer to households, not individuals, in which case it could be numerous internet users soaking up data per customer.
I think even more relatable is how many customers they can handle at, say, 200 Mbps.