5 Things That Don't Have to Be Hard in Single Page Applications

Last week I came across a blog post by Sourcegraph's Quinn Slack on 5 painful things about client-side JS, or single page web applications. As I read Quinn's points and comments I could not help but think yes, he is right, but wrong at the same time. I think he originally titled the article with Angular in the title, which lit a fire under the framework's apologists, prompting him to change the title. I have opinions about Angular and the other popular MV* frameworks, but I will reserve those for another post or three.

I want to respond to Quinn's points one by one. I think each one reveals common mistakes developers new to the single page application experience make. I hear from and talk to developers all the time who make similar comments. I also consult with many companies about their current SPA investments and what is going wrong with them. These five points are just some of the common problems I observe. Each can be easily fixed, but they require a perspective adjustment. In fact, developing a good mobile, touch-first, performant web application requires a complete mind-shift from traditional web techniques. This is why I wrote my latest book, High Performance Single Page Web Applications.

Bad Search Ranking and Twitter/Facebook Previews

How do you ensure search engine spiders can read your content when using a Single Page Application architecture? Fortunately Google gave us the guidelines with their _escaped_fragment_ policy. The Sourcegraph article talks about the technique of generating a static or server-side rendered version of each page. This is effectively what Google wants. Quinn seems unsure of Google's guidelines:

"And if Google deems your alternate site to be too different from your main site, it will severely penalize you. You won’t know you’ve crossed the line until your traffic plummets."

While I am one of the last people you would see putting faith in Google's honesty, I have to take them at their word here because this is the only guideline they have. After hours and hours of evaluating options I think this technique holds the best technical merit as well. Like he said, this can be a complicated process to author and manage. As I think about the requirements I understand what he is saying, but I want to respectfully disagree based on my experience.

Let me define my perspective on this first. Getting to the point I am currently at was not obvious at first, and the journey did take some time of tinkering. A good server-side MVC architecture and a rich framework like ASP.NET or NodeJS helps accomplish the task. Chapter 8 in my latest book is dedicated to the server's role in a SPA. A portion of the chapter is dedicated to supplying the traditional server-generated content we have been producing since the beginning of the web. That should be a clue: we have been doing this since the beginning of the web.

The trick is detecting when the _escaped_fragment_ querystring parameter is present and whether the browser supports a few modern standards. If the querystring parameter is present and some basic modern browser APIs are not supported, then serve the core site. If no parameter is present and the browser supports these key APIs, then give them the SPA; otherwise redirect to the best version. This is called serving a core site, a term I first heard coined in a presentation by The Guardian's Andy Hume. In his talk he discusses how The Guardian detects if a browser cuts the mustard, that is, supports the basic APIs I previously described. Here is the test and redirect logic I use in my SPAs.

var ef = "?_escaped_fragment_=";
// browsers that do not cut the mustard get the core site
if (!('querySelector' in document)
    || !('localStorage' in window)
    || !('addEventListener' in window)
    || !('matchMedia' in window)) {
    if (window.location.href.indexOf("#!") > 0) {
        window.location.href = window.location.href.replace("#!", ef);
    } else if (window.location.href.indexOf(ef) < 0) {
        window.location.href = window.location.href + ef;
    }
} else {
    // a modern browser landing on a core URL is sent back to the SPA
    if (window.location.href.indexOf(ef) >= 0) {
        window.location.href = window.location.href.replace(ef, "#!");
    }
}
The rest does involve modifying the way the server renders the markup based on the presence or absence of _escaped_fragment_. The cool thing is, if a browser is out of date you can still provide it a viable version, just without the cool modern features.

Flaky Stats and Monitoring

Quinn's next complaint involves tracking view changes with analytics packages. He does mention Google Analytics with a Backbone add-on, but obviously was not happy. What I found more interesting was the following statement:

"Most analytics tools require error-prone, manual integration to use the HTML5 history API (pushState) for navigation. This is because they’re unable to automatically detect when your app navigates to a new page using pushState."

All modern browsers support, and have supported, the History API for a few years. This API extends the classic back button experience and gives API access to a visitor's history. One of its advantages is you can store data using the History pushState function. Based on Quinn's comments and my personal experience I think he is right about the History API not providing the appropriate interface for good analytics. In fact I will take my objection further by saying the History API is completely worthless. I must have spent a few hundred hours trying to make the History API something worthwhile, and in light of other modern browser features like localStorage and IndexedDB the History API is a waste of time. Instead, I think if Quinn had abandoned the History API early and relied on the Hash Change event he would have found traffic much easier to track.
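A hash-based approach needs only one listener to catch every view change. This sketch is my own illustration, not the library's actual code; the `hashToPath` helper is an assumed name:

```javascript
// Convert a hash-bang fragment like "#!/movies/42" into a
// trackable path like "/movies/42".
function hashToPath(hash) {
    return hash.replace(/^#!?/, "") || "/";
}

// In the browser, one hashchange listener observes every view change.
if (typeof window !== "undefined") {
    window.addEventListener("hashchange", function () {
        var path = hashToPath(window.location.hash);
        // hand the path to whatever analytics queue is on the page
        if (typeof _gaq !== "undefined") {
            _gaq.push(['_trackPageview', path]);
        }
    }, false);
}
```

Because the event fires for every fragment change, back/forward navigation is tracked for free, with no manual pushState instrumentation.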

Here is the function built into my SPA library to push a view change to Google Analytics:

pushGA: function (path) {
    // if Google Analytics is available, push the path
    if (typeof _gaq !== "undefined") {
        _gaq.push(['_trackPageview', path]);
    }
}
The point here is that Google, and most any analytics package worth its salt, has an API developers can integrate into their AJAX-heavy applications. You can track anything you like. Did they push a button? Track it. Did they scroll the content down? Track it, and so on.
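For those interaction events the classic ga.js queue offers `_trackEvent` alongside `_trackPageview`. A small guarded wrapper, sketched here with made-up category and action names, keeps the calls safe even when the analytics script has not loaded:

```javascript
// Push a custom event to the classic Google Analytics queue, if present.
// Returns true when the event was queued, false when GA is unavailable.
function trackEvent(category, action, label) {
    if (typeof _gaq !== "undefined") {
        _gaq.push(['_trackEvent', category, action, label]);
        return true;
    }
    return false;
}

// Example wiring (illustrative element id):
// document.getElementById("buy").addEventListener("click", function () {
//     trackEvent("cart", "buy-click", "movie-detail");
// });
```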

Slow, Complex Build Tools

Maybe Quinn's background and mine are different; I cannot speak for him. My experience for about 14 years now is with ASP.NET, a compiled platform. Yes, it takes a few seconds to minutes depending on the solution's complexity. I guess I am just used to large enterprise line-of-business applications and actually compiling code. My guess is Quinn's background is with an interpreted language like PHP, Node or even Classic ASP. These languages are not compiled, but instead read and executed on demand by the platform. I guess you could say compiling these languages is the equivalent of saving a file, which is very fast.

Modern web applications are client heavy; this means markup, CSS, JavaScript, images, fonts and possibly other resources need to be 'compiled'. I am a fan of GruntJS as my compiler. Grunt is really a task runner, but that is what any build system is. Grunt allows me to do necessary tasks very easily. It uses a command line interface (CLI), which means I can add Grunt to my normal Visual Studio build process. I use it to, at a minimum, bundle and minify scripts and CSS, run unit tests and perform static analysis (linting) over JavaScript and CSS. Gulp is the newcomer on the block and I honestly have not had any time to try it out, so I cannot speak to Gulp's abilities and setup process. My understanding is it is like Grunt, but faster. If so, cool. My typical Grunt process takes about 5-20 seconds on my Surface Pro, just saying, I am not sweating it. But then again I am known to have 50-100 view SPAs with < 125kb of JavaScript.
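A Gruntfile covering those tasks is short. This is only a sketch of the shape such a build takes; the file paths and task ordering are my assumptions, not the build from the book:

```javascript
// Gruntfile.js - lint, test, bundle and minify in one pass
module.exports = function (grunt) {
    grunt.initConfig({
        jshint: { all: ['src/js/**/*.js'] },            // static analysis
        uglify: {                                        // bundle + minify JS
            app: { files: { 'dist/app.min.js': ['src/js/**/*.js'] } }
        },
        cssmin: {                                        // minify CSS
            app: { files: { 'dist/app.min.css': ['src/css/**/*.css'] } }
        },
        qunit: { all: ['test/**/*.html'] }               // unit tests
    });

    grunt.loadNpmTasks('grunt-contrib-jshint');
    grunt.loadNpmTasks('grunt-contrib-uglify');
    grunt.loadNpmTasks('grunt-contrib-cssmin');
    grunt.loadNpmTasks('grunt-contrib-qunit');

    grunt.registerTask('default', ['jshint', 'qunit', 'uglify', 'cssmin']);
};
```

Running `grunt` from the command line, or from a Visual Studio post-build step, executes the whole chain.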

Slow, Flaky Tests

This is where my expertise gets a little thin, and I suspect most developers are as bad or worse than I am at writing good unit tests. I agree with Quinn that writing unit tests for the front-end is difficult. It is impossible to write an assertion with the name AssertIsFastAndFluid or AssertPleasesStakeHolder. These are obviously subjective tests and must be run by a person.

How do you mock the modern browser API? It is no small task. When I wrote deeptissue last winter I gave up pretty quickly on mocking and stubbing the touch, pointer and mouse APIs. Instead I spent most of my time physically trying out the code while writing thousands of lines to the console to track the activity. There are obviously things you can test, mostly things that you control in your application's code. One rule in unit testing is don't unit test a third party library; trust it does what it advertises, at least until you can prove the library has a bug.

I think trying to unit test whether the application is rendering what you are expecting, the way he describes, is not the best approach. Instead, if you can prove your core functions work, isolate them and test them, not the entire system. Sure, integration testing is good, but when it comes to user experience you have to use real eyes. Unit test the analytical stuff, eyeball the rest and trust your support libraries.
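To make "isolate the core function" concrete, here is a trivial example. The function and its expected values are invented for illustration; the point is that a pure function is assertable while the template and DOM layer is not:

```javascript
// A pure "analytical" function pulled out of the view layer,
// so it can be asserted on without touching the DOM.
function formatRuntime(minutes) {
    var h = Math.floor(minutes / 60),
        m = minutes % 60;
    return h + "h " + m + "m";
}

// A QUnit-style test exercises just this unit; the rendered
// markup that uses its output gets eyeballed instead.
// test("formatRuntime", function () {
//     equal(formatRuntime(142), "2h 22m");
// });
```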

Slowness is Swept Under the Rug, Not Addressed

I agree with the statement; however, again I have to disagree with the execution he implies. You can design web applications to run fast. That is why I titled my book High Performance Single Page Web Applications, to emphasize speed. We know 80-95% of a web application's performance issues are attributed to the way the client is architected, not the back-end (data queries and business logic). The best practice for consuming server-produced data is to send as small a JSON package as possible and cache it as long as you can in the browser. Construct the markup in the browser using templates and append the generated markup to the DOM on demand. Seriously, that is the secret sauce to solve the problems described in Quinn's article.
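The cache-then-fetch half of that pattern can be sketched in a few lines. The storage key shape, TTL parameter and helper names here are my assumptions, not code from the book:

```javascript
// Decide whether a cached entry is still usable.
function isFresh(entry, ttlMs, now) {
    return !!entry && (now - entry.stored) < ttlMs;
}

// Serve a small JSON payload from localStorage when fresh,
// otherwise fetch it, cache it with a timestamp, and hand it on.
// The rendered markup is built from this data with templates
// and appended to the DOM on demand.
function getData(key, ttlMs, fetchFn, done) {
    var entry = JSON.parse(localStorage.getItem(key) || "null");
    if (isFresh(entry, ttlMs, Date.now())) {
        return done(entry.data);                 // cache hit: no network
    }
    fetchFn(function (data) {                    // small JSON from the server
        localStorage.setItem(key,
            JSON.stringify({ stored: Date.now(), data: data }));
        done(data);
    });
}
```

With the data layer shaped like this, repeat views cost no network round trip at all, which is where most of the perceived speed comes from.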

There is a broader issue, and that is web sites are getting slower and slower because of poor architecture and sloppy development. Most issues can be easily fixed, but the general attitude is the time required is not worth it. Study after study proves slow sites have lower success rates. Google and Bing factor page load times into search rankings. Survey after survey proves everyone wants fast loading and responsive client experiences. Building modern web applications using classic web development techniques will not meet these demands. We as a web development community need to step up our game to correct the current performance trajectory we are following.


Quinn's frustrations are not unique; I hear these and many other issues plaguing developers as they fumble with Single Page Applications and modern web development. One thing I learned the hard way several years ago is classic web development techniques were not going to work in the modern, mobile, touch-first environment. I had to change the way I do things.

The process started by shedding classic concepts and forgetting everything that works in the server-based web. You must make the effort to understand how the browser works and build applications that take advantage of their features and limitations. You have to assume the user wants to touch the data, uses an on-screen keyboard and is on a Sprint 3G connection. This forced me to adopt the SPA architecture I use and develop the libraries I did, because no library or framework was available that met these needs efficiently. Quinn's experience is based on AngularJS and I agree there are many problems. I have been studying Angular and other new god frameworks. I personally see many faults in these libraries that are going to hinder developers from building the rich modern web applications needed for the next generation.

You can visit a live demo of the movie application built in my book and review the source code. Of course I encourage you to buy my Single Page Web Application Development book too; it is 20 chapters and nearly 400 pages of content based on over 4 years of research, development and a lot of wrong turns. It is only $9.99 USD. Maybe Quinn and others can read the book, evaluate the code and rethink their stance on single page web applications. From the context of the article's points I think he and Sourcegraph made a lot of common mistakes I see developers making as they enter the realm of modern web application development. I made them too, I learned from them and was determined to make modern single page web applications work. I know you can too.
