
This web page is fat

Tim Berners-Lee, creator of the World Wide Web. Photo Attribution: / CC BY-SA 2.0

Do we really need to keep calling them web pages? What started out as an open document format for the exchange of data between researchers has quickly evolved into a new application platform where most of the computing happens in the cloud instead of on your desktop computer. The World Wide Web was created in 1990 by Tim Berners-Lee while he was working at CERN. His objective at the time was to enable his scientific colleagues to exchange research information electronically in a commonly consumable format. The Web allowed researchers to link their documents to one another via hyperlinks, which acted as active cross-references within the documents themselves. No longer would a person reading one researcher's paper need to dig up another paper referenced in it; they could just click the link and go straight to it.

For a couple of years the Web developed quietly. Pages were small and purposeful; browsers were few and eventually dominated by Mosaic. But in 1994 Netscape Navigator showed up at the party with some seed money and a business plan. It would take a year or two to unseat Mosaic, but eventually it enjoyed the widest user base on the Internet. The party, for Navigator, lasted until 1997. By that time the average web page was about 44 kilobytes, according to a survey conducted at Georgia Tech. During its reign, Netscape introduced on-the-fly rendering, which drew the page as its elements were downloaded rather than waiting until the whole page had arrived. This let a user read the text of a page while its graphics were still loading, a major breakthrough for the mostly dial-up user base.

As browser support grew, web page developers began to add external supporting files beyond graphics, such as style sheets and JavaScript. Internet Explorer 4, considered by some to be a major breakthrough in browsers, helped solidify the Document Object Model (DOM) as a standard for manipulating and styling pages, giving authors the ability to create early in-browser applications that could process some of their data on the client rather than the server. Page sizes doubled to nearly 100 kilobytes by 2003, and the rise of broadband probably owes some thanks to this fact.
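A minimal sketch of the kind of early DOM scripting this enabled might look like the following; the element id "status" and the styling are hypothetical examples, not taken from any particular page of the era.

```javascript
// Sketch of early client-side DOM manipulation (hypothetical element id).
function highlightStatus() {
  var el = document.getElementById("status"); // look the node up in the DOM
  if (el) {
    el.style.backgroundColor = "yellow";      // restyle it on the client,
    el.innerHTML = "Updated in the browser";  // no round trip to the server
  }
}
```

Small scripts like this ran entirely in the browser, which is what made the first in-page applications possible without server involvement.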

Ajax

By 2004 the first browser war was over; Microsoft claimed victory and stopped upgrading its browser. But hidden inside Internet Explorer 5 and 6 was a little gem of an object called XMLHttpRequest. This API allowed scripting on the page to request data from a server and render it to the page without reloading the page itself. This major innovative leap was quickly adopted by the new batch of major browsers as Ajax development became the basis for the modern Web. As with previous innovations, this caused the web page to bloat up again, reaching a nearly 300-kilobyte download for the average page by 2007.
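The pattern described above can be sketched roughly as follows; the URL and the onDone callback are hypothetical, and production code would also handle network errors and other status codes.

```javascript
// Sketch of the classic XMLHttpRequest pattern behind Ajax.
// The URL passed in (e.g. "/api/items") and the onDone callback are
// hypothetical examples.
function fetchFragment(url, onDone) {
  var xhr = new XMLHttpRequest();
  xhr.open("GET", url, true);                 // asynchronous request
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText);               // hand the new data to the page
    }                                         // without reloading it
  };
  xhr.send();
}
```

The key point is that the page itself never reloads: only the requested data crosses the wire, and the script decides how to render it.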

Ajax is full of benefits and drawbacks. On the one hand, the browser must download the code that performs Ajax requests in order to have a functional application; on the other, the browser doesn't have to download all of the data that application might need during the initial page load. The Web is being loaded up with larger and larger data sets, and Ajax may actually be stifling the growth of web page size. Combined with slimmer data formats, such as JSON, this new paradigm is slimming down the bloat of data transfers. Browsers, however, still struggle to render the data through several layers of interpretation, the first being the web application developer's very own. This has the effect of stalling the browser on occasion as new data is brought into the page and presented to the user. The latest browser developers are striving to produce the fastest engines to execute these in-browser applications and the fat code that accompanies them. Since the pages of these applications are often never fully reloaded, it seems the average page download, now and into the future, is effectively infinite.
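To illustrate how JSON slims the payload relative to an equivalent XML fragment, consider the same made-up record in both encodings; the field names and values here are invented for the example.

```javascript
// The same hypothetical record encoded as XML and as JSON.
const xmlPayload  = "<user><id>42</id><name>Ada</name></user>";
const jsonPayload = '{"id":42,"name":"Ada"}';

// JSON parses straight into a usable object; no document traversal needed.
const user = JSON.parse(jsonPayload);
console.log(user.name);                              // logs Ada
console.log(jsonPayload.length < xmlPayload.length); // JSON is the smaller wire format
```

The savings compound when an application polls the server repeatedly, which is exactly the usage pattern Ajax encourages.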
