Pruning Web Sites for Higher Yield and Easier Maintenance

Recently I was chatting with a friend who works with trees. The topic of tree suckers came up, and as he described their destructive nature and the pruning required to remove them, I could not help thinking of web development. According to HttpArchive.org the average web page now weighs around 1.7MB and makes over 90 HTTP requests. My personal surveys show an even worse reality. Web sites are too large, and the nutrients needed to make a site flourish are being drained away. Study after study shows that faster sites have better conversion rates, which means they make more money.

tree suckers

My arborist friend explained that a tree sucker forms as an offshoot from the tree's root, springing up from the ground around the tree's base. Because the suckers are attached to the root, they drain vital nutrients from the tree, and the tree's top is deprived of the sustenance it needs to grow and stay healthy. Web sites suffer from the same problem, in the form of unnecessary resources. Healthy sites load fast and are easy to maintain.

Duplicate Library References

One of the many jokes in the JavaScript space is that if one jQuery isn't enough, add another 2 or 3 jQuery references. Sadly this happens more often than you would think. I routinely examine web sites, including high profile, high traffic ones, that include multiple jQuery references, meaning the page loads the jQuery library more than once. To make matters worse, the page often loads different versions from different locations.

Too many jQuery References

The problem, of course, is loading the same thing more than once. Just because the browser has already loaded the library does not mean it will skip the additional reference. The browser loads all the instances, so the content is downloaded a 2nd or 3rd time, and every time a script is loaded the browser evaluates it. Every script, used or not, takes time to evaluate, and that is time taken away from the page being usable. Right now jQuery is hovering just over 90kb minified, which is a heavy load when doubled or tripled by excess references.

Loading different jQuery versions is also problematic because the last version loaded is the version used, unless your dependent code is loaded and executed before the next jQuery version arrives. This makes troubleshooting difficult because it is hard to know which version your function actually ran against. A common issue is loading the latest jQuery version, only to later overwrite the newer version, containing important bug fixes, with an earlier one.
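The "last one loaded wins" behavior can be sketched in a few lines. This is a hypothetical simulation, not real jQuery; the version numbers and the fakeWindow object stand in for the browser's global scope:

```javascript
// Simulates how each jQuery <script> tag reassigns the same global,
// so the last copy evaluated silently replaces any earlier one.
const fakeWindow = {};

function evaluateJQueryBuild(version) {
  // A jQuery build assigns itself to window.jQuery when evaluated.
  fakeWindow.jQuery = { fn: { jquery: version } };
}

evaluateJQueryBuild('1.11.1'); // master layout loads a current version
evaluateJQueryBuild('1.4.2');  // a component later loads a stale copy

// Code running after both scripts sees only the stale copy.
console.log(fakeWindow.jQuery.fn.jquery); // prints "1.4.2"
```

Any code that ran between the two script tags used 1.11.1; everything after used 1.4.2, which is exactly why these bugs are so confusing to reproduce.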

As a developer it can be frustrating to have code work in a controlled environment and then fail when transferred to the application. You think you are using the same foundation (jQuery version) when in reality you are using a version from 2 years ago. I see this happen in large team environments where different teams are responsible for different components. Team A 'trusts' the page's master layout to include the core jQuery library. Team B adds another jQuery reference to their local component for a variety of reasons. Team B's code ships without review, is never updated, and a year or two later problems crop up. This is not only a performance issue, it is a maintenance nightmare.

Another product of poor code review, testing and communication is referencing the same library from multiple locations. Let's modify the previous example. The master layout references the latest jQuery from Google's CDN. Team B's code references the latest version from the jQuery CDN. To complicate matters, team C references the latest version on the Microsoft CDN. Not only are three instances loaded and evaluated, they are all loaded from different locations. None of the references benefits from HTTP caching, and who is to say these three CDNs all host the exact same version at the same time?
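A quick audit can catch this class of problem before it ships. The sketch below scans a hypothetical list of script URLs (the CDN paths are illustrative, as is the naive matching pattern) and flags a library referenced more than once:

```javascript
// Hypothetical list of the script URLs a page loads, gathered from
// the page source or a crawl of the rendered document.
const scripts = [
  'https://ajax.googleapis.com/ajax/libs/jquery/1.11.1/jquery.min.js',
  'https://code.jquery.com/jquery-1.11.1.min.js',
  'https://ajax.aspnetcdn.com/ajax/jQuery/jquery-1.11.1.min.js',
  'https://example.com/js/app.min.js'
];

// Naive pattern: any file that looks like a jQuery build.
const jqueryRefs = scripts.filter(src => /jquery[-.\d]*(\.min)?\.js/i.test(src));

if (jqueryRefs.length > 1) {
  console.log('Found ' + jqueryRefs.length + ' jQuery references:');
  jqueryRefs.forEach(src => console.log('  ' + src));
}
```

Running a check like this as part of a build or review step costs minutes and would have caught all three teams' references in the example above.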

Too Many HTTP Requests

Another very common nutrient draining practice is making too many HTTP requests. When I push an application to production it has a single JavaScript file. The script is built as part of my development process and deployed to the production environment. While developing I load dozens of tiny JavaScript files, which makes debugging easier, but when I push to production those files are bundled and minified into a single file. This blog, for example, has a single JavaScript file containing about 35kb of minified JavaScript, built from 22 script references in development. More often than not, when a web site is pushed to production the 22 script references are kept. This means the browser must make 22 different HTTP requests and stop 22 times to evaluate each script before loading and evaluating the next.

Today it is easy to include a bundle and minify step in your project's build. I prefer GruntJS, which even has a watcher extension that continually runs your Grunt tasks, saving you from running them manually. I have a whole chapter dedicated to using tools like Grunt to build your web application in my new book, High Performance Single Page Web Applications.
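As a sketch of what that build step can look like, here is a minimal Gruntfile using the grunt-contrib-concat and grunt-contrib-uglify plugins. The source and destination paths are placeholders for your own project layout:

```javascript
// Gruntfile.js — assumes grunt, grunt-contrib-concat and
// grunt-contrib-uglify are installed as dev dependencies.
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      dist: {
        src: ['src/js/**/*.js'],   // the dozens of development files
        dest: 'dist/js/app.js'     // bundled into one file
      }
    },
    uglify: {
      dist: {
        src: 'dist/js/app.js',
        dest: 'dist/js/app.min.js' // minified for production
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('default', ['concat', 'uglify']);
};
```

With a config like this, `grunt` produces the one production file, and the watcher extension mentioned above can rerun it on every save.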

too many JavaScript Requests

A common scenario is an individual reference to every jQuery plugin used in the site. Many sites use dozens of plugins, creating dozens of extra requests. These requests are often only 1-2kb each, so the HTTP overhead is large relative to the payload. Plugins tend not to depend on each other, just on jQuery, so if you need a place to start, my advice is to bundle and minify all your plugins into a single production file. Run some synthetic tests with each browser's network waterfall or the performance timing API. If your application is public, use a tool like http://webpagetest.org to compare the unbundled and bundled versions. I think you will notice a performance bump.
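For the performance timing measurement, a few lines in the browser console using the Resource Timing API will show how many script requests the page really makes and what they cost. (This is browser code; run outside a page, the entry list is simply empty.)

```javascript
// Count the page's script requests and their combined duration
// using the Resource Timing API.
const scriptEntries = performance.getEntriesByType('resource')
  .filter(entry => entry.initiatorType === 'script');

const totalMs = scriptEntries.reduce((sum, entry) => sum + entry.duration, 0);

console.log(scriptEntries.length + ' script requests, ' +
            totalMs.toFixed(0) + 'ms combined duration');
```

Run it before and after bundling and compare the two numbers; the request count should collapse to one or two.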

not bundling JavaScripts

Unminified JavaScript

I have already mentioned it, but all production JavaScript should be minified. Minification removes unnecessary comments and white space, and a good minifier also optimizes the code by shortening long variable names to single letters, removing unused variables, and so on. My development or debug scripts are often cut in half for production. The average web page now includes around 300kb of JavaScript, and I often see large brands loading 1MB or more. Assuming the JavaScript is not minified, the average site could save roughly 150-500kb per page load, which can improve perceived load times by several seconds.
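To make the savings concrete, here is a deliberately naive illustration of the easy wins: stripping comments and collapsing white space. This is not a real minifier (it would mangle string literals, and real tools like UglifyJS also shorten variable names and remove dead code); it only demonstrates where the bytes go:

```javascript
// Toy "minifier": strips comments and collapses whitespace.
// Do not use on real code — for illustration only.
function naiveMinify(source) {
  return source
    .replace(/\/\*[\s\S]*?\*\//g, '') // remove block comments
    .replace(/\/\/[^\n]*/g, '')       // remove line comments
    .replace(/\s+/g, ' ')             // collapse runs of whitespace
    .trim();
}

// A hypothetical debug-style source file.
const debugSource = `
// Toggle the navigation menu
function toggleMenu() {
  /* grab the menu element */
  var menu = document.getElementById('menu');
  menu.classList.toggle('open');
}
`;

const minified = naiveMinify(debugSource);
console.log(debugSource.length + ' bytes -> ' + minified.length + ' bytes');
```

Even on this tiny snippet the comments and indentation are a large share of the bytes; across 300kb of production script the same ratio is real money.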

not minifying JavaScript

Summary

These are just three examples of web suckers, nuisance growths that drain a web site's health. They cause pages to load slower and add to the long term maintenance overhead. They attach themselves to the site through lazy or careless programming practices. Each one is easy to remove, but they are often ignored by development teams. Unless you look for them you may not see them, even though they are in plain sight. These are not jQuery issues, they are developer and DevOps issues. There are many other forms of web suckers; these are just three places every development team should spend a few hours pruning today. Failure to clean up the web site's base can cause the site to start dying from the top (the user experience) down, and when the site dies, so do the jobs needed to build and maintain the business.
