Fixing the Microsoft Store's Web Performance
Have you visited Microsoft's online store? I visit at least two or three times a month. This weekend I wanted to see if there was a KidSpark class for my step-daughter. I could not believe how slow the experience was, and of course I decided to investigate.
First, what are my technical specs? I use a Surface Pro 3 with an i7 and 8GB of RAM, a very powerful machine, on a home FiOS 50Mbps connection, more than enough bandwidth.
To collect performance data I used the always free WebPageTest. You can read Rick Viscomi and Andy Davies' WebPageTest book for more details about using the tool. I ran one test Saturday night and a second this morning.
Because I was looking up events at my local store, I audited the King of Prussia mall's page. Most companies tend to optimize their home page, but the reality is that interior pages account for the majority of user traffic. Often these are the entry points to our sites and applications and provide the first impression.
Time To First Byte
Most sites follow the web performance golden ratio: somewhere between 80/20 and 95/5, meaning the vast majority of a page's load time is spent in the front-end architecture, not the server. The Microsoft Store suffers from very poor time to first byte (TTFB). The page's markup took 11.829 seconds to load, more than 11 seconds slower than acceptable.
I do not know what web platform the Microsoft Store uses, but they need to replace it as fast as they can. To me a good TTFB is 250ms or less. Over a cellular network you can add an extra 500ms to account for GPRS delays. Redline13.com did an interesting evaluation of the top web sites using HTTPArchive's data. You can see many top sites suffer from poor TTFB, but nothing like the Microsoft Store.
If the Microsoft Store back-end takes 11 seconds or more to return a response, they need to change it now. There is no way 11+ seconds is acceptable.
Looking at the response headers you can see the site uses IIS 8.0 and ASP.NET, not a good sign for Microsoft's web platform. IIS and ASP.NET can be configured to return pages like this almost instantly.
This page, and most of the pages on the Microsoft Store, is static or should be. There is no reason to invoke a complex back-end rendering process every time the page is requested. There are two strategies the store could employ: output caching or static rendering.
Output Caching
ASP.NET has supported output caching since it was first introduced. Output caching allows you to cache rendered markup so a request does not pass through the entire rendering cycle over and over. This eliminates much of the overhead associated with server-side rendering engines like ASP.NET or PHP. Subsequent requests return the previously rendered markup much faster.
Personally I have seen the power of output caching on many sites; I was able to take a server from 100% CPU down to 2-4% CPU instantly. IIS is a very fast web server and can return static assets quickly. This is what you need: no server-side latency.
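To illustrate the idea in the Node stack I use today, here is a minimal sketch of output caching written as an Express route; ASP.NET's built-in output caching does the same thing natively. The route, render function, and one-hour lifetime are assumptions for the example, not the store's actual code.

```javascript
// Minimal output caching sketch: cache the rendered markup so repeat
// requests skip the expensive render. Route and render logic are placeholders.
const express = require('express');
const app = express();

const cache = new Map();               // rendered markup keyed by URL
const CACHE_LIFETIME = 60 * 60 * 1000; // one hour, an example value

function renderStorePage(location) {
  // Stand-in for the expensive back-end rendering cycle
  return '<html><body><h1>Store - ' + location + '</h1></body></html>';
}

app.get('/store/:location', function (req, res) {
  const cached = cache.get(req.originalUrl);

  // Serve previously rendered markup without touching the back-end
  if (cached && Date.now() - cached.time < CACHE_LIFETIME) {
    return res.send(cached.html);
  }

  const html = renderStorePage(req.params.location);
  cache.set(req.originalUrl, { html: html, time: Date.now() });
  res.send(html);
});

app.listen(3000);
```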
Static Site Generation
Static site generation is a hot topic today. I recently migrated my Love2Spa architecture from an ASP.NET-rendered infrastructure to a family of node modules that generate the application markup once. This means there is a single process to render the markup. Rendering is done as a Grunt task, part of the build process; a rough sketch of the idea follows.
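Here is a minimal sketch of such a task, assuming a simple template and JSON data file; my actual plugins do more, and the file names here are placeholders.

```javascript
// Gruntfile.js - minimal sketch of rendering the markup once at build time.
// The template, data file, and output path are placeholders for this example.
module.exports = function (grunt) {
  grunt.registerTask('render', 'Render the application markup once', function () {
    var template = grunt.file.read('src/templates/index.html');
    var data = grunt.file.readJSON('src/data/store.json');

    // A naive token replacement stands in for a real template engine
    var markup = template.replace(/{{\s*title\s*}}/g, data.title);

    grunt.file.write('dist/index.html', markup);
    grunt.log.ok('Rendered dist/index.html');
  });

  grunt.registerTask('default', ['render']);
};
```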
The Grunt plugins generate the application index.html and a deferred payload if needed. I can choose to deploy to a traditional web server, run from a local file, or deploy to a storage solution like Azure Blobs or Amazon S3. The node modules do not stop with rendering; I can deploy to any web server. Most of my clients over the past two years have not used IIS, for example. I have had no issues since migrating away from ASP.NET.
Static pages are served instantly by any web server because there is little to no server-side processing needed. Static generation works for traditional server-rendered web pages too. The majority of web pages render the same static content and rarely change. I recommend investigating a static site option for your web sites; the Microsoft Store could benefit from this strategy. Remember, rendering one time, off the web server, is far more efficient than rendering every request on demand.
What Causes Slow TTFB?
The web rendering layer, ASP.NET for example, is normally not the culprit; the business and data layers are. The rendering layer has to wait on the back-end processes to return before a response can be sent to the client. This is one reason why modern web applications retrieve data via an API rather than as part of the rendering cycle.
Using an API decouples these expensive operations from the front-end, improving the perceived performance.
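As a minimal sketch of the decoupling, imagine a hypothetical API route that returns the class list as JSON; the static page shell renders immediately and the data is fetched separately. The endpoint, route, and data here are placeholders, not the store's actual API.

```javascript
// Minimal sketch: the data lives behind an API route, so the page render
// never waits on the data layer. The endpoint and data are placeholders.
const express = require('express');
const app = express();

// Pre-assembled class list keyed by store location, refreshed out of band
const classesByStore = {
  'king-of-prussia': [
    { title: 'KidSpark', time: 'Saturday 10:00 AM' }
  ]
};

app.get('/api/stores/:slug/classes', function (req, res) {
  res.json(classesByStore[req.params.slug] || []);
});

app.listen(3000);
```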
Decoupling is not enough; APIs should return quickly as well. This is why the world moved away from binding expensive back-end operations to anything related directly to the front-end. Instead, fast, scalable properties have moved to NoSQL data stores.
NoSQL data stores act as a structured data cache, as opposed to a relational database where data must be assembled based on a where clause. NoSQL databases are fine-tuned to perform better at reads or writes, as the case may be. The point is a site like the Microsoft Store can benefit from a NoSQL infrastructure because the nature of the site's content is very static.
For example, the list of classes per location does not change constantly; it updates about once a week. But even if the data updated quickly, NoSQL data stores offer better front-end data solutions. Modern social networks only work at their massive scale because they leverage NoSQL; they would be unbearable experiences if tied to a traditional back-end.
I won't go into the details of how to implement NoSQL today; it is too broad a topic to cover. If your organization is not using a NoSQL data store, you should consider researching the option and plan to implement one this year.
When you can eliminate server-side processing from the web server, you reap the rewards.
Eliminate Excess JavaScript
I enjoy writing and using JavaScript, but its use across the Internet has reached unacceptable levels. Images get most of the blame for bloated page size, but the rise of JavaScript is a bigger cause for concern. Not only do scripts need to be downloaded, they must be evaluated. Evaluation is a blocking process, which means the browser stops rendering while scripts are evaluated.
Let me reiterate: the vast majority of the web is static content, with very little real interaction requiring JavaScript. Evaluating the Microsoft Store's waterfall you see three jQuery libraries being loaded. Nowhere on the page do I see anything requiring any of these libraries:
- jQuery
- jQuery Mobile
- jQuery UI
I looked over the page and found two UI components that need a minimal amount of JavaScript: the dropdown menu and a dropdown card in the header. Both could be implemented with about 20-30 lines of JavaScript and a minimal amount of CSS, as the sketch below shows. Certainly a company with Microsoft's depth of developer talent could see this and get basic vanilla JavaScript solutions implemented. I have done this for a few clients in just a few hours.
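Here is a minimal sketch of the dropdown menu in vanilla JavaScript. The class names and markup are placeholders, not the store's actual markup; a little CSS that toggles visibility on the open class completes it.

```javascript
// Minimal vanilla JavaScript dropdown; class names are placeholders.
// Assumes markup like:
//   <div class="dropdown">
//     <button class="dropdown-trigger">Shop</button>
//     <ul class="dropdown-panel">...</ul>
//   </div>
// CSS hides .dropdown-panel until .dropdown gains the "open" class.
var triggers = document.querySelectorAll('.dropdown-trigger');

Array.prototype.forEach.call(triggers, function (trigger) {
  trigger.addEventListener('click', function (e) {
    e.stopPropagation();
    trigger.parentNode.classList.toggle('open');
  });
});

// Clicking anywhere else on the page closes any open dropdown
document.addEventListener('click', function () {
  var open = document.querySelector('.dropdown.open');

  if (open) {
    open.classList.remove('open');
  }
});
```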
Eliminating these libraries would eliminate 3 HTTP requests and almost 200kb of data transfer. My personal belief is this would improve the overall rendering time by around 1 second or more, though real tests would need to be performed. Don't forget the reduced hosting and download costs; the page weighs in at almost 3MB of data, so every bit matters.
There is also a special library for IE 8 being loaded. Microsoft stopped supporting IE 8 last month; please remove this script.
There is a reference to Microsoft's Virtual Earth library, yet the page has no map. This is a large library and should never be loaded unless it is needed.
Diving into the page's waterfall I see several AJAX calls being made for data. Again, this could all be part of the server-rendered content and cached appropriately. For example, they make an AJAX call to get the shopping cart item count. You can keep this as an API call; the AJAX call and rendering take about 20 lines of code, as sketched below, so we are still rather light on the code requirements.
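A minimal sketch of that cart count call; the endpoint, response shape, and badge element are assumptions for the example, not the store's actual API.

```javascript
// Minimal sketch: fetch the cart item count and render it into a badge.
// The endpoint, response shape, and element id are placeholders.
function updateCartCount() {
  var xhr = new XMLHttpRequest();

  xhr.open('GET', '/api/cart/count', true);

  xhr.onload = function () {
    if (xhr.status !== 200) {
      return; // leave the badge alone if the call fails
    }

    var data = JSON.parse(xhr.responseText); // e.g. { "count": 3 }
    var badge = document.getElementById('cart-count');

    if (badge) {
      badge.textContent = data.count;
    }
  };

  xhr.send();
}

updateCartCount();
```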
Often developers leave references to libraries like jQuery from a default project template and never remove them. My advice is to always remove the default scripts and CSS from your Visual Studio project to avoid this sort of unintended pollution.
User Tracking Services
Additional JavaScript is requested by different analytics packages. Ensighten is the first one I see listed. I cannot explain enough how much I loathe the Ensighten package. This library is poorly designed; in fact its code is about a decade out of date. I implore everyone to stop using this package.
The store injects more than one analytics package. This has become an epidemic that is killing the web. Third-party scripts are often poorly written (again, Ensighten is terrible), and this poor code quality is dragging down the web. Just this week I found the article "If you care about web performance, 3rd-party scripts can be the bane of your existence...". I love the tool that demonstrates how third, fourth and fifth party scripts are being loaded in their example. This is not uncommon. On this page I see at least:
- Ensighten
- Optimizely
Pick a single, less intrusive analytics service. It is a good idea to collect real user metrics, but there is no need to do the things services like Ensighten do, or to use three or more services. The nature of these libraries is to inject themselves into the lower levels of the code, overwriting native functions. This is a very bad practice and only causes bugs and a degraded user experience. Tomorrow I have another example of this practice.
Optimize Images
Images account for 56% of the page weight and 49 of the 100 HTTP requests. I picked 13 of the larger images and ran them through Kraken.io to get optimized versions. I was able to reduce the size of the 13 selected images by 823kb, over 60% of their original size. Combine the smaller images with my script reductions and the page now weighs over 1MB less than it started, without any degradation in user experience.
Any web CMS platform, which is what a site like the Microsoft Store is built on, should have the ability to optimize images as part of the workflow. You can employ Grunt or Gulp tasks as part of the build process if you need to; a sketch follows. But in general you have content creators uploading images, and I can only assume they know nothing about image optimization tools.
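For example, here is a minimal sketch of a build-time optimization task using the grunt-contrib-imagemin plugin; the folder paths are placeholders.

```javascript
// Gruntfile.js - minimal sketch of build-time image optimization using
// grunt-contrib-imagemin; the folder paths are placeholders.
module.exports = function (grunt) {
  grunt.initConfig({
    imagemin: {
      store: {
        files: [{
          expand: true,                 // process every matching file
          cwd: 'src/images/',           // raw uploads
          src: ['**/*.{png,jpg,gif}'],
          dest: 'dist/images/'          // optimized copies
        }]
      }
    }
  });

  grunt.loadNpmTasks('grunt-contrib-imagemin');
  grunt.registerTask('default', ['imagemin']);
};
```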
As a developer, I believe any administrative interface for a site like this should have an image optimization back-end process. Kraken.io offers an API and has node modules to automatically deploy to Azure Blob or Amazon S3 storage. I am sure you can configure a simple module to automatically deploy to whatever CDN you use.
Server Rendered Images
Using the back-end rendering pipeline to generate and serve images on demand looks sexy to developers, but it is one of the worst decisions you can make. Image rendering is an expensive operation. When you need to render an image on the fly for many simultaneous requests, you are killing your CPU resources.
This is exactly what the Microsoft Store appears to be doing with several images. You can see from the following URL that a version QueryString parameter is appended to the image:
https://c.s-microsoft.com/en-us/CMSImages/Kids_395x220.jpg?version=aec50408-2d98-ca57-a546-50706c825103
There are two reasons developers use this technique: generating images on demand and cache busting.
This image caught my attention on the second run because it took over 47 seconds to respond. This caused the Visually Complete value to be over 63 seconds. I tried to run the image through Kraken to optimize it and the request timed out, three times. The server was too tied up to return the image on the fly.
Instead, use a build task to render and optimize all your images once. You can create the different sizes one time, on a build server, not your web server; a rough sketch follows. Let your web server serve assets, not generate them.
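As a rough sketch, a small Node build script using the sharp library can generate each size once; the file names and dimensions below are placeholders based on the image above.

```javascript
// build-images.js - minimal sketch of generating image sizes at build time
// with the sharp library; file names and dimensions are placeholders.
const sharp = require('sharp');

// Each entry is one rendition of the source image
const sizes = [
  { width: 395, height: 220, out: 'dist/images/Kids_395x220.jpg' },
  { width: 790, height: 440, out: 'dist/images/Kids_790x440.jpg' }
];

sizes.forEach(function (size) {
  sharp('src/images/Kids.jpg')
    .resize(size.width, size.height)
    .toFile(size.out)
    .then(function () {
      console.log('Rendered ' + size.out);
    })
    .catch(function (err) {
      console.error(err);
    });
});
```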
If you properly set Expires headers, you should not need a QueryString parameter to return the proper image or asset version.
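For what it is worth, here is how that looks in the Node stack I use; IIS can be configured to send the same caching headers. The one-year lifetime is just an example.

```javascript
// Minimal sketch: serve static assets with a far-future Cache-Control header
// so browsers keep them without needing a version QueryString.
const express = require('express');
const app = express();

// Cache static assets for one year (an example lifetime, not a rule)
app.use(express.static('dist', { maxAge: '365d' }));

app.listen(3000);
```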
Summary
I do performance audits of sites from time to time here on the blog. This one is personal to me because I order products from the Microsoft Store. The experience is quickly degrading and they need to reverse course to stay viable.
This is a Microsoft property where they are selling their flagship products. They need a web experience that screams user-first quality, not "we don't know what we are doing."
The Microsoft Online Store should also be a showcase of Microsoft's web technology. Instead it is an embarrassment of poor development practices. When you apply the rule of thumb of 7% lost sales per 1 second of latency, you can quickly see the Microsoft Store is killing itself with lost sales. Can you imagine how many sales Microsoft lost over the holidays? I bet they would sell out of their flagship products more often if the site were responsive. They would also have more visitors to their physical stores, because an inviting, easy online experience makes it simple to see what classes and workshops are being offered in the local store.
As a Microsoft fanboy and MVP I encourage the Microsoft Store folks to fix their site. An investment of a few days of a good web developer's time would yield high returns in sales and customer satisfaction.