Comments on: Web pages are getting more bloated, and here’s why
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/

By: Anirudh Mathur (Sat, 21 Dec 2013 08:47:45 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2019

Extremely informative article. Thanks a lot for providing answers, so simply and effectively, to the questions I had been pondering for a long time!

By: PBX (Tue, 24 Jul 2012 07:35:55 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2018

In reply to Lennie.

I agree with you on this.

By: Andrew Johnstone (Tue, 27 Dec 2011 13:16:42 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2017

The interesting part is that increasing broadband connection speeds will not dramatically increase the speed at which pages load and render. The problem is the expense of establishing TCP connections in the first place. Additionally, all browsers limit the number of simultaneous connections they will make, as highlighted by the table below.

Browser                   HTTP/1.1   HTTP/1.0
IE 6, 7                       2          4
IE 8                          6          6
Firefox 2                     2          8
Firefox 3                     6          6
Safari 3, 4                   4          4
Chrome 1, 2                   6          ?
Chrome 3                      4          4
Chrome 4+                     6          ?
iPhone 2                      4          ?
iPhone 3                      6          ?
iPhone 4                      4          ?
Opera 9.63, 10.00alpha        4          4
Opera 10.51+                  8          ?

http://www.stevesouders.com/blog/2008/03/20/roundup-on-parallel-connections/

There are a couple of techniques that attempt to reduce TCP connection costs, and both apply to the above. Firstly, HTTP keep-alives reduce TCP connection overhead by reusing one connection for several requests; secondly, HTTP pipelining reduces the round-trip time for HTTP requests by sending them without waiting for each response. (A short keep-alive sketch follows the links below.)

http://en.wikipedia.org/wiki/HTTP_persistent_connection
http://blog.amt.in/what-is-http-pipelining-why-is-it-important-f
http://www.blaze.io/mobile/http-pipelining-big-in-mobile/
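
To make the keep-alive point concrete, here is a minimal sketch using Node’s built-in http module; the host and paths are placeholders, and maxSockets is set to mirror the per-host limits in the table above.

    // Sketch only: reuse a small pool of TCP connections for several requests.
    const http = require('http');

    // keepAlive: true keeps sockets open between requests, skipping a fresh
    // TCP handshake for each resource; maxSockets caps parallel connections
    // per host, much like the browser limits above.
    const agent = new http.Agent({ keepAlive: true, maxSockets: 6 });

    ['/a.css', '/b.js', '/c.png'].forEach((path) => {
      http.get({ host: 'example.com', path: path, agent: agent }, (res) => {
        console.log(path, res.statusCode);
        res.resume(); // drain the body so the socket can be reused
      });
    });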

The other aspect is that the analysis is not very descriptive about what it is based on. For example, of those sites, how many were correctly using ETags and gzip compression?
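
That question is easy to check for any one site. As a rough sketch (Node again, with a placeholder host): request a page with Accept-Encoding set and inspect the response headers.

    const http = require('http');

    http.get(
      { host: 'example.com', path: '/', headers: { 'Accept-Encoding': 'gzip, deflate' } },
      (res) => {
        // A correctly configured server should send both of these headers.
        console.log('ETag:             ' + (res.headers['etag'] || '(none)'));
        console.log('Content-Encoding: ' + (res.headers['content-encoding'] || '(none)'));
        res.resume(); // only the headers matter here
      }
    );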

By: Pete Markiewicz (Fri, 09 Dec 2011 21:03:40 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2016

I’d be VERY interested to find historical data on web page downloads going back further than the HTTP Archive (about a year). It seems to me that the greater efficiency afforded by HTML5 and CSS3 might paradoxically lead to increased resource use, as described by “Jevons’ Paradox” for oil and other energy consumption in the real world. Does an increase in efficiency lead to greater resource use?

Comparing the web of 10, even 20 years ago to today would be VERY instructive. Was the “world wide wait” of the 1990s worse than today? Or have we developed UX that hides increased downloads (as described in this article: http://www.uie.com/articles/download_time/)?

By: Roger Burke (Tue, 06 Dec 2011 23:53:44 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2015

Interesting stats from Pingdom, and they would be a problem if these were static pages being sent from the server. What we now have is Ajax, and while that may increase the JavaScript load, it will actually decrease the traffic.

Also, the browser is now being used more to build pages through JavaScript, which again reduces the traffic.

So while the individual page size may be increasing, it would be interesting to see how long pages stay up before being replaced.
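
As a toy illustration of that trade-off (the endpoint and element id are invented, and the modern fetch API is used for brevity): once the page shell has loaded, an Ajax update fetches a small JSON payload instead of a whole new HTML page.

    // Sketch: update part of the page instead of reloading all of it.
    fetch('/api/latest-comments')          // small JSON payload, not a full page
      .then((res) => res.json())
      .then((comments) => {
        document.getElementById('comments').textContent =
          comments.map((c) => c.text).join('\n');
      });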

By: Lennie (Sun, 04 Dec 2011 00:54:58 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2014

Something else I noticed, which in theory is a good thing but obviously impacts performance if you’re not using a CDN: HTTPS is used more for external resources now.

By: Lennie (Sun, 04 Dec 2011 00:37:04 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2013

What I really hate are the large numbers of redirects (like URL shorteners) and how all these extra JavaScript libraries are loaded from so many different domains.

Many sites now pull from 10+ domains on every single page; that is just stupid. It doesn’t make any sense to me.

Also, I have an improvement request for Pingdom: add an (optional) Accept-Encoding: gzip, deflate header to your monitoring service. It would be more realistic.

By: Brian Manning (Fri, 02 Dec 2011 13:47:17 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2012

One interesting thing to see would be the relative size change of videos over that time. That’s a lot of bandwidth, and people are serving higher-quality videos, yet at the same time codecs are advancing and becoming more efficient. It would just be an interesting additional study for someone.

By: Ted Goas (Fri, 02 Dec 2011 13:37:54 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2011

Has anyone mentioned MicroJS yet? Popular core JS libraries are big (jQuery and jQuery UI without customization are over 100 KB each), and a lot of sites don’t use everything included in them.

Using MicroJS can help ensure site authors only add what they need and don’t ship too much extra code.
http://microjs.com/
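
As a rough sketch of the “only what you need” idea (the class names are placeholders): if a page only selects elements and toggles classes, a few lines of vanilla JS can stand in for a 100 KB library.

    // Tiny stand-in for the small slice of jQuery many pages actually use.
    var $ = function (sel) { return document.querySelectorAll(sel); };

    $('.menu-item').forEach(function (el) {
      el.classList.toggle('active');
    });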

By: Dan (Fri, 02 Dec 2011 13:32:04 +0000)
https://www.pingdom.com/blog/web-pages-getting-bloated-here-is-why/#comment-2010

About the JS libraries: sure, shims and polyfills are bad in a sense because they create overhead that shouldn’t be there, but that is just a fact of life if you are using a new technology. I enjoy writing jQuery; yes, I can write vanilla JS. The one big thing you do with JS is interact with the DOM, so to me $() versus document.getElementById() is not a huge difference, until you do it 50 times, and then it does become a huge difference. That is only one example of “write less, do more.”

So I wouldn’t really blame libraries, especially if you use the CDN version: it is probably cached anyway, even for a new user, so the overhead is next to nothing, since you won’t be spending an HTTP request, unlike yoursite.com/site.js, which wouldn’t be cached for a new user.

Unless you really need something you can’t get with a library, why would you bother using more KB of download time as well as more development time? If I really need what vanilla JS brings, I could do a large amount of it with a server-side language, which in turn won’t hit me on initial download or processing power; granted, it will hit me for an HTTP request later on, but it may be 2 or 3 in small succession instead of 20. Thanks.
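
The CDN-caching point is often paired with a local fallback, so a failed CDN fetch doesn’t break the page. A common sketch of the pattern (the local path is hypothetical):

    <script src="https://code.jquery.com/jquery-1.7.1.min.js"></script>
    <script>
      // If the CDN copy failed to load, fall back to a locally hosted copy.
      window.jQuery || document.write('<script src="/js/jquery-1.7.1.min.js"><\/script>');
    </script>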
