Pruning Web Sites for Higher Yield and Easier Maintenance
Recently I was chatting with a friend who works with trees. The topic of tree suckers came up. As he described their destructive nature and the pruning you must perform to remove them, I could not help but think of web development. According to HttpArchive.org the average web page now weighs around 1.7MB and makes over 90 HTTP requests. My personal surveys show an even worse reality. Web sites are too large, and the nutrients needed to make a site flourish are being drained. Study after study proves that faster sites have better conversion rates, which means they make more money.
My arborist friend explained that a tree sucker forms as an offshoot from the tree's root. Suckers spring up from the ground surrounding the tree's base. Because they are attached to the root, they drain vital nutrients from the tree, and the tree's top is deprived of the sustenance it needs to grow and maintain a healthy life. Web sites suffer from the same problem in the form of unnecessary resources. Healthy sites load fast and are easy to maintain.
Duplicate Library References
Loading multiple jQuery versions is problematic because the last version loaded is the version used, unless your dependent code is loaded and executed before the next jQuery version loads. This makes problems hard to troubleshoot, because you cannot easily know which version your function actually ran against. A common issue is loading the latest jQuery version, only to overwrite that newer version, with its important bug fixes, with an earlier one later in the page.
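The last-loaded-wins behavior can be sketched without loading jQuery at all; each script tag that loads jQuery simply reassigns the same global. (The scope object and version numbers below are illustrative.)

```javascript
// Simulated sketch (no real jQuery): each <script> tag that loads jQuery
// reassigns the same global, so the last one loaded "wins".
const globalScope = {};

function loadJQuery(version) {
  // Stand-in for what each jQuery <script> does: overwrite the global.
  globalScope.jQuery = { fn: { jquery: version } };
}

loadJQuery('3.7.1');   // master layout loads the latest version
const $early = globalScope.jQuery;          // code that grabbed jQuery right away
loadJQuery('1.8.3');   // a component later loads an old version

console.log($early.fn.jquery);              // '3.7.1' - closed over early
console.log(globalScope.jQuery.fn.jquery);  // '1.8.3' - everyone else gets this
```

Any code that runs after the second load, including plugins registered against the first copy, silently operates on the older version.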
As a developer it can be frustrating to have code work in a controlled environment and then fail when you transfer it to an application. You think you are using the same foundation (jQuery version) when in reality you are using a version from two years ago. I see this happen in large team environments where different teams are responsible for different components. Team A 'trusts' the page's master layout to include the core jQuery library. Team B adds another jQuery reference to its local component for a variety of reasons. Team B's code ships without being updated or code reviewed, and a year or two later problems crop up. This is not only a performance issue; it is a maintenance nightmare.
Another product of poor code review, testing, and communication is referencing the same library from multiple locations. Let's modify the previous example. The master layout references the latest jQuery from Google's CDN. Team B's code references the latest version from the jQuery CDN. To complicate matters, team C references the latest version on the Microsoft CDN. Not only are three instances loaded and evaluated, they are all loaded from different locations, so none of the references can share the browser's HTTP cache. Plus, who is to say these three CDNs all host the exact same version at the same time?
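One defensive option is to have each component check for an existing jQuery before adding its own reference. The sketch below is a hypothetical guard, not an established API: `scope` stands in for `window`, and `loadScript` stands in for whatever script loader the team uses.

```javascript
// Hypothetical guard a component could run before loading its own jQuery copy.
// `scope` stands in for window; `loadScript` is an illustrative loader callback.
function ensureJQuery(scope, loadScript) {
  if (scope.jQuery) {
    // Reuse the copy the master layout already provided.
    return scope.jQuery.fn.jquery;
  }
  // Only request jQuery when the page truly lacks it (URL is illustrative).
  loadScript('https://ajax.googleapis.com/ajax/libs/jquery/3.7.1/jquery.min.js');
  return null; // caller waits for the script to finish loading
}

// Simulated usage:
const requested = [];
const already = ensureJQuery({ jQuery: { fn: { jquery: '3.7.1' } } },
                             url => requested.push(url));
console.log(already);          // '3.7.1' - nothing new requested
console.log(ensureJQuery({}, url => requested.push(url))); // null
console.log(requested.length); // 1 - exactly one load was triggered
```

A guard like this papers over the symptom; the real fix is agreeing on one canonical reference in the master layout and enforcing it in code review.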
Too Many HTTP Requests
Today it is easy to include a bundle and minify step in your project's build. I prefer GruntJS, which even has a watcher extension that continually runs your Grunt scripts, saving you the step of running them manually. I have a whole chapter dedicated to using tools like Grunt to build your web application in my new book, High Performance Single Page Web Applications.
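As one possible starting point, a minimal Gruntfile using the grunt-contrib-concat, grunt-contrib-uglify, and grunt-contrib-watch plugins might look like the sketch below (file paths are illustrative).

```javascript
// Minimal Gruntfile.js sketch. Assumes:
//   npm install grunt grunt-contrib-concat grunt-contrib-uglify grunt-contrib-watch
module.exports = function (grunt) {
  grunt.initConfig({
    concat: {
      plugins: {
        src: ['js/plugins/*.js'],     // every jQuery plugin the site uses
        dest: 'dist/plugins.js'       // one combined file
      }
    },
    uglify: {
      plugins: {
        src: 'dist/plugins.js',
        dest: 'dist/plugins.min.js'   // one minified production file
      }
    },
    watch: {
      scripts: {
        files: ['js/plugins/*.js'],
        tasks: ['concat', 'uglify']   // re-run the bundle on every save
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-concat');
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.loadNpmTasks('grunt-contrib-watch');
  grunt.registerTask('default', ['concat', 'uglify']);
};
```

Running `grunt` produces the single production file, and `grunt watch` keeps it up to date as you edit.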
A common scenario is an individual reference to every jQuery plugin used on the site. Many sites use dozens of plugins, creating dozens of extra requests. Often these requests are only 1-2KB each, which is very inefficient given the fixed overhead of every HTTP request. Plugins tend not to depend on each other, just on jQuery, so if you need a place to start, my advice is to bundle and minify all your plugins into a single production file. Run some synthetic tests with each browser's network waterfall or with the performance timing API. If your application is public, use a tool like http://webpagetest.org to compare the unbundled and bundled versions. I think you will notice a performance bump.
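For the performance timing measurements, a small helper over the (legacy) Navigation Timing API might look like the sketch below. In a browser you would pass `window.performance.timing`; since the function only reads plain numeric fields, the example feeds it illustrative timestamps instead.

```javascript
// Sketch: summarize page-load timing from the Navigation Timing API.
// `t` is expected to have the shape of window.performance.timing.
function loadSummary(t) {
  return {
    ttfb: t.responseStart - t.navigationStart,            // time to first byte
    domReady: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoad: t.loadEventEnd - t.navigationStart          // full load time
  };
}

// Example with illustrative millisecond timestamps:
const summary = loadSummary({
  navigationStart: 1000,
  responseStart: 1240,
  domContentLoadedEventEnd: 1900,
  loadEventEnd: 2600
});
console.log(summary); // { ttfb: 240, domReady: 900, pageLoad: 1600 }
```

Capture these numbers before and after bundling and the difference in `pageLoad` shows what the pruning bought you.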
These are just three examples of web suckers, nuisance weeds that drain a web site's health. They cause pages to load slower and add to the long-term maintenance overhead. They attach themselves to the site through lazy or careless programming practices. Each one is easy to remove, yet they are often ignored by development teams. Unless you look for them you may not see them, even though they are in plain sight. These are not jQuery issues; they are developer and DevOps issues. There are many other forms of web suckers; these are just three places every development team should spend a few hours pruning today. Failure to clean up the web site's base can cause the site to start dying from the top (the user experience) down. When the site dies, so do the jobs needed to build and maintain the business.