> Requests to Facebook's servers accounted for 1.5 MB of the 2.4 MB transferred by the ENTIRE page. 87 network requests, 35 javascript files injected, and it didn't even load all the comments! (I had to click on a "Load more comments" button to load the rest of the comments.)

This is an issue for a really small segment of users. Not really a cause for concern in 2017. Especially when most of that stuff is already cached.

> Why the hell do you need 37 javascript files and 1.5 MB to load three comments?

That's oversimplifying it. You are loading 37 assets to support a social commenting platform.




> This is an issue for a really small segment of users.

Not really. I'm from India; while my home network connection is pretty good, when I'm travelling anywhere outside the metros, the mobile network speed is abysmal. However, quite a few news outlets employ Facebook's comment system. You can imagine what a nightmare it'll be for users to load all that on a 2G connection.

> You are loading 37 assets to support a social commenting platform.

37 Javascript files out of which, I'm willing to bet, most are tracking agents employing every trick in the book to record all activity. Wasn't there a recent article that discovered that FB continuously sent home the position of your cursor when you're scrolling through your newsfeed so that they can place ads better?


Your experience on 2G networks is already degraded; removing an asynchronous comment system will not magically make it better.

> 37 Javascript files out of which, I'm willing to bet, most are tracking agents employing every trick in the book to record all activity

You would lose that bet.


My solution is much easier. I block all known FB IPs via provisioning rules on my phone, and never have to see (or pay for) any of that crap.


You're being downvoted because a large segment of HN is engineers who prefer sites like http://motherfuckingwebsite.com and http://evenbettermotherfucking.website

Basically, what it comes down to is that if all 37 requests are done in parallel, then you're only loading the page as slowly as the slowest request. Sadly, this is rarely the case. Most browsers will only make 6 concurrent requests from a server, which you can see in Firefox via the about:config page and the network.http.max-persistent-connections-per-server preference. Worse, some of that JS is probably downloading more JS, which means you will have several sequential rounds of requests going out.
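
If you want to see the per-host behaviour on a real page, something like this, pasted into the devtools console, gives a rough picture. It's only a sketch built on the Resource Timing API; cross-origin entries served without a Timing-Allow-Origin header expose coarser timings, so treat the peaks as approximate.

    // Group Resource Timing entries by host and report, per host, the
    // request count and the peak number of requests in flight at once.
    // On HTTP/1.1 that peak tends to cap out around the per-host limit (~6).
    const byHost = new Map();
    for (const e of performance.getEntriesByType("resource")) {
      const host = new URL(e.name).host;
      const list = byHost.get(host) || [];
      list.push(e);
      byHost.set(host, list);
    }

    for (const [host, list] of byHost) {
      let peak = 0;
      for (const a of list) {
        // Count entries whose [startTime, responseEnd) interval overlaps this one.
        const inFlight = list.filter(
          (b) => b.startTime < a.responseEnd && b.responseEnd > a.startTime
        ).length;
        peak = Math.max(peak, inFlight);
      }
      console.log(`${host}: ${list.length} requests, peak concurrency ~${peak}`);
    }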

On mobile, this means you're running 8-30 second delays for each "group" of requests, especially if you have poor reception. The good news is that after that initial latency, the actual transfer should go fairly fast.

On the other hand, a social commenting program should need, at a minimum, 1 JS file, 1 CSS file, and 1 request for the actual data (3 comment texts, 3 base64-encoded profile pics at a small resolution). So FB is running at ~30x the optimum (87 assets, of which 35 are JS).
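
For a sense of scale, the data half of that could be a single fetch and a dozen lines of rendering. The endpoint, element id and field names below are invented for illustration; the point is only how small "three comments plus avatars" can be.

    // Hypothetical minimal comment widget: one request for the data
    // (comment text plus small base64 avatars), one render pass.
    async function renderComments(container) {
      const res = await fetch("/comments.json");        // made-up endpoint
      const comments = await res.json();                // [{ author, text, avatar }, ...]
      for (const c of comments) {
        const img = document.createElement("img");
        img.src = "data:image/jpeg;base64," + c.avatar; // tiny avatar, e.g. 48x48
        img.width = 48;
        const p = document.createElement("p");
        p.textContent = c.author + ": " + c.text;
        container.append(img, p);
      }
    }

    renderComments(document.getElementById("comments"));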

If you think it's not that big a deal, I'd suggest taking a trip abroad, or just out to some smaller towns in your area, and really experiencing why users feel the internet has not really gotten faster since the 90s (keeping in mind users don't care about data size, they care about latency).


Agreed with everything you said except

> keeping in mind users don't care about data size, they care about latency.

Users do care about data size, since there are still a lot of people on data-limited contracts, even in the UK, which has a fairly healthy mobile market.

What people don't necessarily realise is the connection between the two. I'd quite like it if mobile browsers showed a running total of the data transferred in a tab on each new visit/refresh.

I'm a programmer and I still don't know which pages are heavy and which aren't (given fixed bandwidth, the time to interaction would be a clue, but mobile internet latency is all over the map, so you can't tell whether it's a big site/page or just the mobile internet shitting the bed).


You are quite correct. Data caps do make users concerned with the amount of data transferred, but only post hoc, or as a proxy for latency.

Thinking about it, there's got to be a plugin for Firefox/Chrome that shows the data size of each page. For mobile, you can install Firefox, which allows you to install plugins (pretty much essential for blocking ads when browsing on mobile).
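
Even without a plugin, something along these lines in the devtools console gives a rough per-page total. It's just a sketch: it relies on the Resource Timing transferSize field, and cross-origin resources served without a Timing-Allow-Origin header report 0, so it undercounts.

    // Rough page-weight counter: sum transferSize over the navigation entry
    // and every resource entry recorded so far (lower bound, see above).
    const entries = performance
      .getEntriesByType("navigation")
      .concat(performance.getEntriesByType("resource"));
    const totalBytes = entries.reduce((sum, e) => sum + e.transferSize, 0);
    console.log("~" + Math.round(totalBytes / 1024) + " KB transferred so far");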


I use the Chroma variant of Android, and it has the option to show inbound/outbound traffic on the status bar, updated a few times per second (and, finally, battery state as an actual percentage; that stupid little battery picture is bloody useless: hmm, it's just below halfway, so around 45% I guess... checks... 27%... bad UX).

For me that is enough: if I load a page and see it sit at 200 kb/s for more than a second or two, I'll often nuke the page. I have 1 GB of data on my phone package (I simply don't use mobile data much; I have unlimited fiber at home and at the office).


> Basically, what it comes down to is that if all 37 requests are done in parallel, then you're only loading the page as slowly as the slowest request

Doesn't apply here. Facebook loads the widget asynchronously and serves a downgraded experience to mobile clients.

> Most browsers will only make 6 concurrent requests from a server, which you can see in Firefox via the about:config page and the network.http.max-persistent-connections-per-server preference.

The limitation is per host, not per server.

> On mobile, this means you're running 8-30 second delays for each "group" of requests, especially if you have poor reception.

We are talking about an asynchronous implementation.

> On the other hand, a social commenting program should need, at a minimum, 1 JS file, 1 CSS file, and 1 request for the actual data (3 comment texts, 3 base64-encoded profile pics at a small resolution). So FB is running at ~30x the optimum (87 assets, of which 35 are JS).

That's the payload for desktop clients. Furthermore, try finding a website that follows the 'optimum' standard you are describing.

> If you think it's not that big a deal, I'd suggest taking a trip abroad, or just out to some smaller towns in your area, and really experiencing why users feel the internet has not really gotten faster since the 90s (keeping in mind users don't care about data size, they care about latency).

If you are still using '90s technology, you won't be able to consume most current websites.

An asynchronous widget will not be the 'breaking point'; far from it.

The scenarios you are describing will not trigger the loading of these widgets.


> Per host not per server

You are correct, though to be more precise: it's per hostname (domain). There is also an overall max connections limit, which is generally 10-17.

2014: http://sgdev-blog.blogspot.com/2014/01/maximum-concurrent-co...

Today: http://www.browserscope.org/?category=network&v=top


I know, it's not like huge numbers of people view web pages on low-powered devices with limited battery life connected via an unreliable, metered wireless internet connection.


> I know, it's not like huge numbers of people view web pages on low-powered devices with limited battery life connected via an unreliable, metered wireless internet connection.

Love the sarcasm. What devices are you talking about here?

Name ten popular websites that will function on low-powered devices with limited battery life connected via an unreliable, metered wireless internet connection.

Heck, give me a site that will properly work on a Motorola Razr.



