Bypassing Server-Side Rendering Altogether for a Better Web User Experience

I almost hate to write this article, but at the same time I have experienced a special freedom lately. We build high-performance single page applications, which means our web sites render data in the browser rather than on the server. Over the past year customers challenged us to live without ASP.NET, node and other server-side rendering services. After several of these experiences and some changes to our workflow, we found we don’t need them, and neither do you.

[Diagram: the classic server-rendered web site architecture]

Why No Server-Side Rendering?

You may have read articles and heard statements to the effect that server-side rendering is faster than client-side rendering. I have disputed this theory and even run some simple tests comparing the two techniques. Part of the argument is that browsers and the DOM are not fast. This is untrue. They are not fast when you do things wrong, which most fast food frameworks do. Those frameworks are designed around server-side architecture and techniques, not good client-side browser techniques.

Server-side rendering is performed by an application engine like ASP.NET, Ruby, PHP, Java or node. Often this involves making a call to some sort of data store and possibly evaluating authentication credentials. This can, and often does, lead to a slower time to first byte (TTFB).

Good server-side rendering engines offer a caching mechanism like ASP.NET’s Output Caching. Here the server renders content once and caches the result in memory. Output caching lets you declare caching parameters, so you can designate the cache time to live (TTL) and vary the cache by query string, language and other factors. It is a powerful technique.
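To make the idea concrete, here is a minimal sketch of the same output-caching pattern in a node HTTP handler. This is an illustration, not a production engine; the renderPage function is a hypothetical stand-in for whatever produces your markup.

    // Output-cache sketch: keep rendered HTML in memory, keyed by the
    // request URL (path plus query string), with a time to live (TTL).
    var http = require("http");

    var cache = new Map();   // key -> { html, expires }
    var TTL_MS = 60 * 1000;  // keep rendered output for one minute

    // Hypothetical stand-in for a real rendering engine.
    function renderPage(requestUrl) {
      return "<html><body><h1>" + requestUrl + "</h1></body></html>";
    }

    http.createServer(function (req, res) {
      var key = req.url; // vary the cache by path and query string
      var hit = cache.get(key);

      if (hit && hit.expires > Date.now()) {
        res.writeHead(200, { "Content-Type": "text/html" });
        return res.end(hit.html); // served from cache, no re-render
      }

      var html = renderPage(req.url);
      cache.set(key, { html: html, expires: Date.now() + TTL_MS });
      res.writeHead(200, { "Content-Type": "text/html" });
      res.end(html);
    }).listen(8080);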

In essence Output Caching produces a static HTML file. Because data becomes stale over time, this cache needs to be purged and rendered again. All data has a life cycle, or personality as I call it, that determines the right way to cache the rendered markup.

Modern single page web applications by their nature move this server-side exercise to the client. This means a SPA is responsible for managing markup and rendering as needed, and for data caching as well. These responsibilities are managed by Love2Spa, our web platform, as fundamental design features.

This does not mean there is no dance between the web server and the client that needs to be managed. Single page applications require a different approach, one where the server plays a dumb, static role. The application instead relies on a robust API to provide on-demand data, preferably in JSON format. A single page application needs a fast, static web server that harkens back to the web’s early days, while the API provides the dynamic aspects of the application: data and authorized content.
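For example, the client might request view data from the API with a simple fetch call. The endpoint and the renderProductList function below are hypothetical names for illustration:

    // Request on-demand JSON from the application's API and hand the
    // result to a client-side render function.
    fetch("https://api.example.com/products")
      .then(function (response) {
        if (!response.ok) {
          throw new Error("API request failed: " + response.status);
        }
        return response.json();
      })
      .then(function (products) {
        renderProductList(products); // client-side rendering step
      })
      .catch(function (err) {
        console.error(err);
      });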

This architecture is good because static hosting and CDN services like Azure and AWS’ S3 and CloudFront are cheap, globally distributed and fast. APIs can be built and hosted on the same cloud platform, using services like Azure App Services, Blob Storage, AWS Elastic Beanstalk or S3. Again, API platforms are cheap and highly scalable. This scenario still involves management tasks, but they replace many of the tasks previously assigned to the web server. In my experience there is even less administration required.
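As a rough sketch of the deployment side, pushing a pre-compressed file to S3 with the AWS SDK for node looks something like this. The bucket name and cache lifetime are assumptions for illustration:

    // Upload a gzip-compressed build artifact to S3 (AWS SDK for node).
    var AWS = require("aws-sdk");
    var fs = require("fs");

    var s3 = new AWS.S3();

    s3.putObject({
      Bucket: "my-spa-bucket",             // hypothetical bucket name
      Key: "index.html",
      Body: fs.readFileSync("dist/index.html.gz"),
      ContentType: "text/html",
      ContentEncoding: "gzip",             // browsers decompress transparently
      CacheControl: "max-age=300"          // hypothetical CDN cache lifetime
    }, function (err) {
      if (err) throw err;
      console.log("deployed index.html");
    });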

So how does the modern single page application architecture look? Let’s look at two diagrams, one with a single web server and another using the distributed cloud-based services previously described.

[Diagram: the single page application on a classic web server]

Love2Spa Approach

I will spare you the technical details and instead provide a high-level review of how Love2Dev builds fast, scalable and maintainable static single page web applications. The first step is to render the core application markup once, as part of the build process. If you are not a developer, do not worry: we, the developer and devops manager, create automated tools that take all the different parts of the application and compile them into a ‘production’ version.

The products of this process include a single index.html file and a single JavaScript file. CSS is injected inline in the document’s HEAD. The compiled files are then gzip-compressed and deployed to cloud storage. Once in cloud storage, our cloud CDN distributes the files around the globe, closer to potential readers and customers. By placing the assets closer to visitors, the content loads faster, making customers happy.

The index.html file contains all the markup needed to render the entire application. It also contains all the CSS code. The CSS is injected in the document’s HEAD to reduce rendering time. This technique improves performance by eliminating an extra file request, plus the CSS is parsed and available to the browser before it starts rendering the markup. This one file is a powerful single package that defines the application’s content structure and how it renders.
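A stripped-down version of that build step, using only node’s built-in modules, might look like the sketch below. The file paths and the placeholder comment in the HTML template are assumptions, not our actual tooling:

    // Build sketch: inline the compiled CSS into index.html's HEAD,
    // then gzip the result for deployment to static cloud storage.
    var fs = require("fs");
    var zlib = require("zlib");

    var css = fs.readFileSync("src/app.css", "utf8");
    var template = fs.readFileSync("src/index.html", "utf8");

    // The template's HEAD contains a hypothetical placeholder token.
    var html = template.replace("<!-- inline-css -->",
      "<style>" + css + "</style>");

    fs.writeFileSync("dist/index.html", html);
    fs.writeFileSync("dist/index.html.gz", zlib.gzipSync(html));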

[Diagram: static HTML for the SPA produced via the build process]

The other file is the application JavaScript. It contains the code that drives the application: modules that parse the markup, manage view swapping and handle other in-application experiences. This file is a combined, uglified version of all the source JavaScript files. If you don’t know what uglifying means, it is a way to eliminate white space characters, like spaces and return characters. Additionally, a JavaScript uglifier will modify the code, replacing variable and function names with single characters, which reduces the overall file size without breaking the code. The trade-off is the code is less human readable, but that causes no issues for the JavaScript engine.
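Using the uglify-js module directly, the step is roughly the following. Treat it as a sketch under assumed file names, not our exact build configuration:

    // Combine the source files and minify them with uglify-js.
    var fs = require("fs");
    var UglifyJS = require("uglify-js");

    // Hypothetical source file list.
    var source = ["src/router.js", "src/views.js", "src/app.js"]
      .map(function (f) { return fs.readFileSync(f, "utf8"); })
      .join("\n");

    // mangle shortens variable and function names; compress removes
    // dead code and whitespace. The output behaves exactly the same.
    var result = UglifyJS.minify(source, { mangle: true, compress: true });
    if (result.error) throw result.error;

    fs.writeFileSync("dist/app.min.js", result.code);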

These two files combine to drive the application experience. Since they are the entire application, there is no need for a server-side rendering engine, which eliminates the previously mentioned platforms. With the server-side rendering layer gone, the server returns the files faster.

Larger applications may require additional files, but the average web site doesn’t. You may consider deferring some of the payload in extremely large applications, or when you have code reserved for authenticated users. In the latter case you may need web platform software to secure access to those assets. Again, this depends on the nature of your application. In general you want to protect the data first, which requires web platform software to host the application’s API. Love2Dev’s SPA architecture manages these deferred loading scenarios seamlessly, so customers have a fluid user experience.

At run time, when the end user interacts with the HTML5 application, code manages the markup, displaying and removing views (a SPA’s corollary to a web page) as the user moves through the application. This involves retrieving data from and posting data to the application’s API (outside the scope of this article) and rendering content so the user can read and interact as needed. The server used to handle this part; now the server is under less stress because the rendering process moved to the client, which means you can scale your application more easily.
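A bare-bones version of that view-swapping logic, driven by the URL hash, could look like this. The data-view attribute and the default view name are hypothetical conventions:

    // Minimal hash-based view swap: each view is a section already in
    // the pre-rendered index.html; only the current one is displayed.
    function showView() {
      var name = location.hash.replace("#", "") || "home";
      var views = document.querySelectorAll("section[data-view]");

      for (var i = 0; i < views.length; i++) {
        views[i].style.display =
          views[i].getAttribute("data-view") === name ? "block" : "none";
      }
      // A fuller router would also fetch the view's data from the API here.
    }

    window.addEventListener("hashchange", showView);
    window.addEventListener("DOMContentLoaded", showView);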

I hope you can see how traditional server-side roles have moved to the browser. Now the API can manage the remaining server-side responsibilities. We often associate an API with returning or accepting structured data like JSON or XML, but there is no rule saying the data must come in a limited set of formats. An API can return anything, binary files for example. When working with a SPA, we are interested in strings, markup strings to be specific.

An API platform like ASP.NET Web API or node can return strings very efficiently and provide some of the server-side features we used to trust to layers like ASP.NET and PHP. Remember, services like ASP.NET Web API retain rich server-side abilities, like talking to the database and checking security; they simply skip the markup rendering step. You could create a data formatter or engine, or even leverage Razor or another templating engine, but after many years focusing on single page applications I have found this to be unnecessary, as efficient rendering takes place in the browser. So use your application’s API to send and receive data from the client, but be prepared to potentially use it to serve static content like markup and protected resources (CSS, images and JavaScript).
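For instance, a node endpoint built with Express can return a markup fragment as easily as JSON. The route and the fragment below are hypothetical:

    // An API endpoint that returns a markup string instead of JSON.
    var express = require("express");
    var app = express();

    app.get("/api/fragments/welcome", function (req, res) {
      // A real service might check credentials and query a data store here.
      res.type("text/html");
      res.send("<div class=\"welcome\"><h2>Welcome back!</h2></div>");
    });

    app.listen(3000);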

What About Legacy Browsers and Search Engines?

The last reason I clung to the server was legacy support. In January 2016, Internet Explorer 10 and below effectively became unsupported by Microsoft. This should force enterprises worldwide to upgrade to Internet Explorer 11, eliminating most of the legacy browser support issue. As time goes by, IE 11 will become more and more obsolete too.

To handle legacy browsers, you can perform feature detection and load a polyfill as needed. For example, Love2Dev tests for Promise support; if it is not available, a Promise polyfill is loaded before the main application code runs.
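The detection itself is only a few lines. In this sketch the polyfill path and the boot entry point are hypothetical:

    // Feature-detect Promise support; load a polyfill only when needed.
    function boot() {
      // start the main application code (hypothetical entry point)
    }

    if (window.Promise) {
      boot();
    } else {
      var script = document.createElement("script");
      script.src = "/js/promise-polyfill.js"; // hypothetical polyfill path
      script.onload = boot;
      document.head.appendChild(script);
    }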

SEO is the next concern; it is the life blood of many businesses. Search engine spiders have become capable of crawling client-side applications, which means server-rendered content is no longer really a requirement. In fact, Google performs several client-side evaluations when determining search engine placement, so it can process a modern single page application the way a browser runs it.

What About Google’s AJAX Guidelines?

In late 2015 Google officially deprecated its AJAX crawling guidelines, posting updated rules on the Webmaster Central Blog for how sites are crawled and indexed for search engine placement.

“In 2009, we made a proposal to make AJAX pages crawlable. Back then, our systems were not able to render and understand pages that use JavaScript to present content to users… Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.
To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site’s CSS or JS files. Since the assumptions for our 2009 proposal are no longer valid, we recommend following the principles of progressive enhancement.”

This significant change in Google’s indexing practices meant a server-rendered core site is no longer needed for search engine optimization.

Summary

I remember back in the early 90s when CGI gained traction. I was even an early adopter of Microsoft’s Active Server Pages (ASP). I wrote my first server-rendered content, pulling data from a data store, sometime in 1994 (I think, because I was using the Windows 95 beta). Technology constantly changes, and today we have the ability to build rich, fast, highly interactive web experiences on just about every platform (excluding legacy browsers and old, cheap Androids). So why waste time on the server anymore?

Of course, this means we need good client-side architecture and tools. Love2Dev has over six years of mobile-friendly, single page application development experience. We have built hundreds of SPAs and provided rich user experiences that scale well while remaining easy to manage. You can contact us to learn more about our single page application services. We would love to talk with you about your site or application and how you could upgrade it to modern standards.
