
Isn't 2.55 Megabits/s for Google Fiber a bit low? I imagine Netflix throttles the bandwidth. After all, there's no point downloading the whole movie if a person decides to switch to something else after 10 minutes.

If throttling is happening, what is Netflix testing then?




My guess is that Netflix is reporting the average stream bandwidth accessed by customers of each ISP. So if video X has streams at 500kbps, 1Mbps, 2Mbps, and 5Mbps, then a higher percentage of Google Fiber users being able to reliably stream the 5Mbps feed instead of just one of the lower-bitrate ones would result in a higher average.

So there would be a built-in ceiling of the highest bitrate Netflix offers. And based on this data, I'm guessing that for a bunch of their content, their streams max out around 2-3Mbps (which is a reasonable bitrate for 720p film/TV content).
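To make that reading concrete, here's a toy calculation of what a per-ISP "average delivered bitrate" could look like. The tier bitrates and viewer shares below are invented for illustration, not actual Netflix data:

    # Toy model of the "average delivered bitrate" interpretation above.
    # Tier bitrates and viewer shares are made-up numbers, not Netflix's.
    tiers_kbps = [500, 1000, 2000, 5000]
    isp_mix = {
        "fast_isp": [0.05, 0.10, 0.25, 0.60],  # most viewers hold the top tier
        "slow_isp": [0.30, 0.35, 0.25, 0.10],  # most fall back to lower tiers
    }
    for isp, mix in isp_mix.items():
        avg_kbps = sum(rate * share for rate, share in zip(tiers_kbps, mix))
        print(f"{isp}: average delivered stream ~{avg_kbps / 1000:.2f} Mbps")
    # fast_isp: ~3.63 Mbps, slow_isp: ~1.50 Mbps

Note how the average is capped by the highest tier on offer, no matter how fast the ISP is.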


The Ars article biffed; it should be megabytes per second (MBps vs Mbps). It's kind of confusing on the Netflix graphic, because the headings are in all caps, so you just have to use context/knowledge to differentiate.


From the mention of a "variety of encodes" affecting their numbers, my interpretation of this figure is that it's the average bandwidth used to stream a video to a subscriber, and in that case 2.55MBps sounds shockingly high for an online streaming video bitrate. 2.55MBps would put it a bit above over-the-air broadcast MPEG2 HD rates, whereas not everything on Netflix is HD, and I would assume they're using more efficient codecs than MPEG2. Especially since, from my past experience, I know they have HD streams that are playable on sub-6Mbps connections.

EDIT: and as tedchs points out, the other provider numbers definitely reinforce the interpretation of it being Mbps: I don't think I've ever seen AT&T advertise a non-Uverse DSL speed of above 6Mbps, which wouldn't let it hit 1.5MBps.
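For anyone double-checking the units, the arithmetic is simple but easy to trip over. This is my own back-of-the-envelope conversion, not something from the article:

    def mbytes_to_mbits(x):
        # 1 byte = 8 bits, so MByte/s -> Mbit/s is just a factor of 8
        return x * 8

    print(mbytes_to_mbits(2.55))  # 20.4 Mbps -- above ATSC's ~19.4 Mbps MPEG-2 HD
    print(mbytes_to_mbits(1.5))   # 12.0 Mbps -- more than a 6 Mbps DSL line can carry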


It doesn't have to be 2.55MBps averaged over the entire stream. It could be 2.55MBps during bursts of buffering.

That being said... Listing the headers in all caps (without so much as clarifying in the accompanying post) was a mean thing to do...


I'd argue that deciding to name 8 bits a byte and making the abbreviation of the latter simply the capitalization of the former was an ill-considered choice, and has led to much confusion, consternation, and gnashing of teeth over the years.


We should adopt the telecom term octet, and just get rid of the term byte.

Then again "Megaoctet" doesn't quite flow, but there would be no confusion between a bit and an octet however you abbreviate it. We're still left with deciding whether to multiply by 1000 or 1024 for each magnitude, though.


If we could just agree to use only one or the other, we'd be fine!


Read my reply to tedchs; I believe he [assuming] is still reading the numbers wrong, including the mobile numbers.

I'm not sure about your numbers, as I'm finding a bunch of conflicting information online. From what I've seen, an HDTV quality stream would be around 2 MBps, but I've also seen numbers closer to what you're reporting.

It's conceivable to me that Netflix could do a quick speedtest that would exceed the video bitrate when testing your connection for HD-ness.

In any case, I might be wrong. Netflix should really clarify this. There are people arguing about the same thing in the comments to that article.


A plea from all those who try and understand which acronym is being used - we will not think less of you if you use Mbits/sec and MBytes/sec - and I suspect the rate of confusion will be somewhat reduced.

(There are those who aren't aware that Mbits/second uses the "SI" sense of "Mega", and think that there are 1,048,576 bits in a Mbit, but we'll leave that for another day)


> (There are those who aren't aware that Mbits/second uses the "SI" sense of "Mega", and think that there are 1,048,576 bits in a Mbit, but we'll leave that for another day)

To be fair, that 2.4% KB/KiB delta isn't terribly interesting until it gets into the "giga" or "tera" prefixes (where it compounds to roughly 7-10%). Given you see way more than that in bandwidth fluctuation anyway, it's even less relevant.

Plus we always measure our files in byte-suffixed units rather than -bit, which increases the confusion by a factor of 8.

That's how I "lost" half a terabyte in my NAS :( Knew that going in, but that's still a crazy amount of storage to apparently disappear due to a naming convention.
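For the curious, the gap is easy to compute yourself. The 3 TB drive below is just an assumed example size, not anyone's actual NAS:

    # How the decimal/binary prefix gap compounds, plus the "lost" capacity
    # on a drive marketed in decimal terabytes (example size is assumed).
    for power, name in enumerate(["kilo", "mega", "giga", "tera"], start=1):
        gap = (1024 ** power) / (1000 ** power) - 1
        print(f"{name}: {gap:.1%}")  # 2.4%, 4.9%, 7.4%, 10.0%

    marketed_tb = 3  # hypothetical drive sold as "3 TB" (decimal)
    shown_tib = marketed_tb * 1000**4 / 1024**4
    print(f"A {marketed_tb} TB drive shows up as ~{shown_tib:.2f} TiB")  # ~2.73 TiB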


Actually, my read is that megabits-per-second is correct, when you consider that the mobile connections at the bottom are going to get approx. 700 kilobits per second (kbps) consistently. Google's performance here reflects the fact that most people do not have gigabit connections all the way to their Netflix-streaming devices, for example if they are using Wi-Fi.


No, neither WiFi nor your cheap local home router plays into this. The only possible interpretation is that Google Fiber allows Netflix to deliver, on average, higher bitrates than competitors. The reason you don't see higher speeds is that there is no point in streaming at a higher bandwidth than the content actually offers.

All in all, these are terribly misleading statistics. At best, they suggest that Google's competitors can't get their shit together to even allow for one HD stream.


It's not uncommon for people to try and stream 2 Netflix movies at the same time through a single internet connection.


I believe you are still getting the numbers wrong.

There's no way Google Fiber only hits 2.55 Mbps consistently. That's slower than the slowest speed advertised by crappy AT&T DSL, which would barely be fast enough to watch Netflix at all.

Regarding the mobile speeds, 700 KBps sounds about right for someone on an LTE or HSPA (T-Mobile) connection. I just did a test right now, and I get about 900 KBps.

Also, regarding your last point, most people probably at least have 802.11g by now, which maxes at 54 Mbps, waaay above 2.55 Mbps.


Lots of content on Netflix isn't encoded in HD, which means the average stream rate includes many streams that can't go above 1.x Mbps. And their top stream rate for HD, last time I checked, was something like 6 Mbps - clearly ruling out anyone's average stream from Netflix being 2 MBps.


No way it's megabytes per second. WiFi G mode provides a theoretical cap of 6.75MByte/s, and we all know you'll never see that. Protocol overhead combined with interference and range, and you're lucky if you can keep it above 1MByte/s.

Yes, WiFi N is proliferating, but it still has a host of caveats. A single-channel N device with a 20MHz span, instead of 40MHz, still only gets 9MByte/s theoretical max.

Yes, I know, you can use a wired connection. You can have dual-channel N and you can use a 40MHz span, but we're talking about average rates. Most folks I've seen don't use wired. Most N devices are single-channel, and I read again and again that most 2.4GHz networks will actually perform better in 20MHz mode (due to interference).
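Rough numbers, if anyone wants to sanity-check. The PHY rates are the usual theoretical maxima; the 50% haircut for overhead is just a guess on my part:

    # Theoretical Wi-Fi PHY rates converted to MByte/s, with a crude 50%
    # reduction for protocol overhead/interference (an assumption, not a spec).
    phy_rates_mbps = {
        "802.11g": 54,                    # -> 6.75 MByte/s theoretical
        "802.11n, 1 stream, 20MHz": 72,   # -> 9 MByte/s theoretical
        "802.11n, 1 stream, 40MHz": 150,
    }
    for mode, mbps in phy_rates_mbps.items():
        theoretical = mbps / 8
        print(f"{mode}: {theoretical:.2f} MByte/s max, ~{theoretical / 2:.2f} after overhead")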


The highest rate for Netflix streaming is ~600 kB/s (5 Mbps). Since many movies are not in HD, and compression efficiency varies with image complexity, that average sounds about right. You're not going to use the other 995 Mbps for video streaming unless you send uncompressed 4k frames. Even monstrous 8k video (7680x4320) compresses to ~600 Mbps.
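To put numbers on the "uncompressed frames" point, here's a quick calculation with assumed parameters (24-bit color, 30 fps), just for scale:

    # Raw, uncompressed video bitrates for comparison (parameters assumed).
    def uncompressed_mbps(width, height, fps=30, bytes_per_pixel=3):
        return width * height * bytes_per_pixel * 8 * fps / 1e6

    print(f"4K uncompressed: ~{uncompressed_mbps(3840, 2160):,.0f} Mbps")  # ~5,972
    print(f"8K uncompressed: ~{uncompressed_mbps(7680, 4320):,.0f} Mbps")  # ~23,888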


I found it odd too that it does only slightly better than the others, but I think that's the max speed Netflix needs on average. If, say, Netflix were to stream 4K content, then the Google Fiber numbers would show something like 100 Mbps, and the others would be left in the dust, since even at their max speeds they couldn't handle that.



