What Is Isomorphic JavaScript and When Should It Be Used?
Lately I have been reading more and more recommendations for isomorphic JavaScript in single page applications (SPAs). My last blog post was a response to some folks in my galaxy of influence advocating server-side rendering tactics. The combination of client and server rendering is known as isomorphic JavaScript, one of those fancy-sounding computer science terms. Isomorphic means the application uses the same rendering engine on the client and the server, which makes it easier for developers to maintain markup templates.
There are many popular articles available to describe Isomorphic web architecture:
- Node.js and the new web front-end
- Isomorphic JavaScript with LazoJS
- Isomorphic JavaScript: The Future of Web Apps
- Scaling Isomorphic JavaScript
- The future of web apps is — ready? — isomorphic JavaScript
When I began my journey to modern SPAs I too felt like you needed to retain the server-side processing of classic web applications. It was familiar to me and seemed to make sense. I quickly realized this was a losing strategy. There are two common arguments isomorphic advocates make: speed to render and search engine optimization. As I stated last week, the speed argument is flawed. The SEO argument is more misunderstood than wrong. In fact, SEO and legacy browser support are the only reasons to retain server-side rendering. However, the execution needs to be done differently than most recommend.
Using the same Rendering Engine on the Client and Server
Modern web application development is about building rich user experiences utilizing the new APIs and performance offered by today's modern browsers. Traditional web sites are rendered on the server, binding data with markup to produce a finished product. The finished product is shipped to the browser, which then renders the page as it loads CSS, images and any other asset designated in the markup and CSS.
As JavaScript and AJAX came of age about 5 years ago (give or take), some of us (me included) started pushing the envelope beyond simple dialog boxes to SPAlets and eventually full single page web applications. This has led to a shift in responsibility from the server to the client. Many responsibilities that used to live on the server now reside in the browser. MVC, routing engines and view engines all run in the browser so we can create the rich, instant experience modern end users expect.
I like to visualize the modern web stack as an hourglass. The top bulb represents the client-side responsibilities. The bottom bulb represents the server-side business and data logic layers. The small aperture connecting the two bulbs is the modern web server layer. The web server is a thin relic of its old self, stripped of many duties it reliably carried out for us in the past. Now it has two primary duties: serve basic markup and host an API.
Today my typical web application has a single controller with a single endpoint. This endpoint serves the markup to drive the entire application in the client. From there JavaScript drives the experience, making AJAX calls and caching data and markup to create an instant-response experience for customers. The experience beats the average native application hands down. Sure, there is an initial wait of 1-2 seconds to render that first view, but after that the only latency is retrieving the data to hydrate views as they are requested. If the data is already cached, rendering takes less than one or two screen refresh cycles, even on mobile devices.
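To make that shape concrete, here is a minimal sketch of the single-endpoint idea written with Express (the Node stack I cover below) rather than ASP.NET MVC; the routes and file names are purely illustrative, not code from a real project.

```javascript
// A minimal sketch of the single-endpoint pattern: one route returns the
// application shell, everything else is a JSON API consumed over AJAX.
var express = require("express");
var app = express();

// The lone "controller": serve the shell markup that boots the SPA.
app.get("/", function (req, res) {
    res.sendFile(__dirname + "/views/shell.html");
});

// Data endpoints the client calls to hydrate views after that first load.
app.get("/api/products/:id", function (req, res) {
    res.json({ id: req.params.id, name: "Example product" });
});

app.listen(3000);
```

The ASP.NET MVC version is conceptually the same: a single controller action returns the shell markup, and separate API endpoints return JSON.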
To accomplish this in ASP.NET MVC I had to create a special SPAHelper library to support certain features. It extends the Razor view engine and adds helper functions for the things I found necessary to marry the client and server experiences, in particular the core site concept I will talk about later. There is a pair of chapters in my SPA book detailing the dance between the client and the server required to properly serve a single page application.
Node.js Impact on Web Architecture
When Node.js was introduced, some developers saw an opportunity to utilize the JavaScript skills they had honed in the browser on the server. Soon after Node.js hit the Internet, developers added web server modules, with Express.js being the most popular. Over the course of the past year I have begun building web applications using Node and Express and have become quite fond of the combination. As a developer who has become more comfortable writing JavaScript than C#, it just makes sense.
Like ASP.NET and other traditional web platforms, there is the concept of a view engine. Jade is Express' default engine, but Handlebars may be more popular because it works on both the client and the server. Handlebars is derived from Mustache, my preferred templating engine, and its syntax is very straightforward. Mustache is what computer science types refer to as a push model, where data is pushed into a template to produce markup.
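As a quick illustration of the push model, here is a small sketch using mustache.js; the template and data are made up, but the same render call produces the same markup whether it runs in Node or in the browser.

```javascript
// The push model: data is pushed into a template to produce markup.
var Mustache = require("mustache"); // in the browser, mustache.js exposes a global Mustache object

var template = "<li><a href='#!products/{{id}}'>{{name}}</a></li>";
var product = { id: 42, name: "Example Product" };

// The output is identical on either side of the wire.
var markup = Mustache.render(template, product);
// <li><a href='#!products/42'>Example Product</a></li>
```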
These 'push templating' libraries give developers the luxury of template reuse on both the server and the client. This is a very tempting scenario because the potential fragmentation of client and server is solved and maintenance is simpler. It does, however, rely on deep linking to the server being viable. As I pointed out last week, that is not always the case, because in a SPA the deep-link route is never passed to the server.
Before I dive into the answer, I want to examine the MVVM approach.
The MVVM Problem
Binding is another popular rendering technique. Angular and Knockout use this model. Instead of MVC you utilize the Model View View Model (MVVM) approach. This works on the client and is easy for some developers to pick up. It does have some performance issues, so choosing a binding library should be done with care. My favorite is [Rivets](http://rivetsjs.com/) because it has a simple syntax and lacks the memory leaks that plague Knockout and Angular.
MVVM's problem with an isomorphic design is the difficulty of rendering the markup on the server. Binding libraries rely on browser plumbing to operate, and this plumbing is not available on the server. Well, it is, but you need to run all requests through PhantomJS, a very slow process. Some have advocated pre-rendering all possible requests on the server using a build tool like Grunt or Gulp. While possible for small applications, most dynamic applications would require far too much work to make this viable. When I think of isomorphic web architecture I rule out binding libraries as a viable approach.
SPA Deep Linking and the Core Site Approach
Once you get past the initial load scenario the next problem is how to deal with search engines. You want to provide a deep link to content, yet preserve the integrity of your SPA. A proper SPA utilizes a URL where the domain is the primary link and the deep route is defined after the #. This means the route never reaches the server, so the server cannot render the requested content within the markup.
http://www.domain.com#!spa/route/with/variables
A traditional web site's route would look like:
http://www.domain.com/route/with/variables
To utilize an isomorphic web architecture you would need to pass the second address to the server. This causes the web server to render the page using the classic rendering techniques we have been using since the early 1990s. From there your SPA would utilize the hash fragment to drive the application, right? While it could, you would not want to do this because it causes a negative user experience.
http://www.domain.com/route/with/variables#!route2/with/variables
When the previous link is used to access the application, the initial view is based on route, not route2. This means the initial content sent to the browser is not the desired content, and the user experiences a flash or an unintended animation while waiting for the desired content to be delivered and rendered. In short, this is a deep linking nightmare, and it is where isomorphic web architecture falls apart.
Instead, you should never utilize traditional URLs in a modern SPA. If you are upgrading a legacy web site, provide proper 301 redirects to the new hash fragment driven URLs, like the following:
http://www.domain.com#!route2/with/variables
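As a sketch of what those redirects could look like on an Express server (the route shape here is hypothetical), each legacy route simply answers with a permanent redirect to its hash fragment equivalent:

```javascript
// Hypothetical legacy route: permanently redirect to the hash fragment
// version of the same address so the SPA router takes over from there.
var express = require("express");
var app = express();

app.get("/route/:first/:second", function (req, res) {
    res.redirect(301, "/#!route/" + req.params.first + "/" + req.params.second);
});

app.listen(3000);
```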
A properly architected SPA does not experience any significant initial delay in rendering compared to the traditional server-side rendering model. Any time you have dynamically driven content, which is most web sites today, there is a built-in delay as the server retrieves the data and renders it into the markup templates. This big package is then delivered to the client to be rendered.
The way I have learned to architect my SPAs leverages an initial markup load of a 'master' layout and initial views, plus core CSS and just enough JavaScript to drive the application. The majority of the time my applications render in under 1 second on broadband and within 4 seconds on most cellular networks. The vast majority of classic server rendered web sites cannot match those speeds due to their weight and poor architecture.
The Place for Isomorphic Support: SEO and Legacy Browsers
After making the case against isomorphic JavaScript, I will concede there is a place for the practice. Search engine spiders and legacy browsers cannot execute the client-side application, thus neutering the modern SPA architecture. Instead of a client driven application, these scenarios need content rendered on the server using "old fashioned" techniques. Instead of the rich client experience you deliver a 'core site', or minimal page.
A core site is, at its essence, the base of a progressive enhancement story. Progressive enhancement is where a basic experience is delivered and then additional 'layers' are added to drive a richer user experience. My core site strategy is to deliver a base of server-side rendered markup and a minimal set of CSS. I tend to avoid any JavaScript, and the CSS is just enough to enable a basic layout in older browsers. I avoid spending too much time dressing up the 'page' and just focus on the basics. Is the content readable? Can I post a form back to the server? Do the hyperlinks reference other pages?
The reason I invest very minimal effort is that legacy browsers are being phased out. Microsoft will not support Internet Explorer 8 and 9 this time next year. That leaves old Android devices as the bulk of the problem. This is an issue in emerging markets, but not so much here in the US. I find old Android users tend not to use the browser or apps, but instead focus on phone calls and texting. Capital expenditure to support those platforms is not money well spent.
I did not create the concept of a core site; I learned the term from the Guardian, the UK newspaper. Andy Hume's Smashing Conference 'Cutting the Mustard' presentation explains the concept and their experience. To execute the cutting the mustard strategy I use the following script to do some basic feature detection. If the browser falls short, I execute a redirect, passing the SPA route to the server via a QueryString parameter.
<script>
    var ef = "?_escaped_fragment_=";
    if (!('querySelector' in document) || !('localStorage' in window) ||
        !('addEventListener' in window) || !('matchMedia' in window)) {
        if (window.location.href.indexOf("#!") > 0) {
            window.location.href = window.location.href.replace("#!", ef);
        } else {
            if (window.location.href.indexOf(ef) < 0) {
                window.location.href = window.location.href + ef;
            }
        }
    } else {
        if (window.location.href.indexOf(ef) >= 0) {
            window.location.href = window.location.href.replace(ef, "#!");
        }
    }
</script>
The "?escapedfragment_=" QueryString parameter is not something I made up, it is part of the SEO specification Google has defined for modern applications. That means the core site strategy serves two purposes, providing a usable server rendered version to legacy browsers and search engine spiders.
This does mean developers must potentially manage two sets of views, one for server-side and one for client-side rendering. Libraries like Mustache give us a chance to share them, as there are Mustache view engines for Node and ASP.NET. Razor, on the other hand, only works server-side because it is too C# focused to run on the client. Binding libraries are difficult to execute on the server.
If your application is just that, an application behind a login, and SEO is not important, then you can most likely avoid server-side rendering altogether. This will be even more true over the next year as Internet Explorer 8 and 9 are deprecated. In essence, developers are free to eliminate just about all server-side rendering for modern single page web applications.
Summary
The concept of isomorphic JavaScript has its place, but it is often inappropriately applied. Server-side rendering should be limited to the scope of search engine optimization and legacy browser support. If these are not important, then time should not be wasted developing a rich server rendering story. If server rendering is needed, then push template libraries like Mustache should be considered because they work well on both the client and the server. Again, I have a pair of chapters in my High Performance Single Page Web Applications book going into more detail about the client and server relationship in a single page application.