How Accurate are HTML5 Score Keeping Sites?
I have been slowly trying to take in many of the recent BUILD sessions. One I enjoyed was a panel discussion by members of the Internet Explorer team. Most of the questions they received are common when it comes to Internet Explorer: what about extensions, what about slow enterprises, are you going to increase cadence, and so on. One question I often hear came up as well: "Why don't you support all of HTML5?" After the question had been asked, and not really answered, another audience member suggested the team visit HTML5Test.com as a reference for missing features. But is that a good resource? Are any of the similar scorecard sites good references?
As a disclaimer, I am fortunate to be a part of the Internet Explorer User Agents program and know most of the panelists. I and other User Agents have asked these same questions and been able to probe deeper with the Internet Explorer engineers on the topic of HTML5 support. So what I am writing here is influenced by those conversations, but it is also based on my personal research and opinions.
Internet Explorer has good coverage of HTML5 standards. Drafts and proposals are another thing. Chrome and Firefox are pretty good at throwing implementations of concepts out to the world. Chrome and Firefox can afford to because they do not have enterprise customers relying on their platforms the way the Internet Explorer team does.
Chrome and Firefox have introduced access to 'nightly' builds, like Chrome Canary, where bleeding-edge developers can try new concepts. These are a 'safe' way to get new APIs and features out into the real world to see how they might work. Both browsers have been burned by supporting specifications as they proceed through the recommendation process, from early draft through recommendation or deprecation. Internet Explorer, on the other hand, gets burned by being the last to implement. There is a sweet spot in there somewhere, and I think all the platforms are struggling to find that sweet spot for each specification.
After I wrote the original draft of this article the Internet Explorer team announced the IE Developer Channel, an early preview of new feature implementations. The Internet Explorer Developer Channel is similar to Chrome Canary and Firefox's Nightly and Aurora builds.
Each specification is unique and has its own characteristics. For some it is pretty obvious there will be little to no change from early draft to recommendation. Others raise red flags, like security concerns. WebGL, for example, had some early security concerns, which kept the IE team from implementing the specification. It is also a complex specification, and implementing it takes resources away from supporting other features that may have a broader impact in the real world.
I think it is safe to say the most important aspect of implementing a specification in Internet Explorer is security. WebRTC is the specification the team currently seems to receive the most grief over (the new WebGL, if you will), but there are some real security considerations around the protocol. As I follow the debate from a distance, my impression is that WebRTC is heading down a deprecation path in favor of ORTC, a more secure protocol. I doubt Internet Explorer will ever implement WebRTC, but I would not be surprised to see ORTC support within the next year or two. It's tough to say because I, unlike the browser vendors, am not a part of the actual specification discussions. I don't know what the reality of implementation is for them, and I don't know what the mood in the room is when these folks get together. Yes, they do get together for face-to-face discussions several times a year.
What About the Scorecard Sites?
Sites like HTML5Test.com and haz.io can be useful to see which browsers support which features. In particular they run test code to see how your browser fares. HTML5Test.com also arbitrarily assigns points to each specification. As I am writing this post Internet Explorer 11 scores 376 of 555 points, which sounds terrible compared to Chrome, Firefox and Opera. But is it really? Let's look at the specifications the site tests, how scores are applied and how references are provided.
The current HTML5Test.com interface is very nice, with the ability to dig deeper into each specification and see how many points are assigned to a specific feature or API. It also provides links to most of the specifications, from the W3C, the WHATWG and Mozilla. You can see if a specification is a recommendation, a draft, etc., which is important.
What is interesting is the arbitrary score values assigned to each feature. For example WebRTC receives a whopping 15 points. It is an early draft proposal and, as I said, most likely heading toward deprecation soon. So why 15 points? I can't say, but I would give it 0 because it is an early proposal. While the concept of WebRTC is awesome, the implementation looks weak. If I controlled HTML5Test.com I would also list ORTC and give it a value of 0 as well. Again, it is an early draft. It is nice to know if a proposal is supported, but since it has yet to become a recommendation I personally cannot have faith in the API and therefore would shy away from implementing it in a production application. Hence a 0 value.
There are some features that have reached recommendation status but are not implemented by Internet Explorer, in particular the INPUT date types. While Chrome, Firefox and other browsers go beyond the specification by creating an associated, unstylable user experience, Internet Explorer falls back to the default text type. The value the specification offers with the date types is possibly driving an onscreen keyboard and native validation. The date types also add potential value for localized data validation.
Personally I am back and forth about the date types. I have had client requirements all over the place when it comes to a date value. Often it does not come down to 4/24/14 or April 24, 2014, etc. Sometimes it was 'Next Summer' or 'Next Week'. Then you need to factor in localization and other vernacular oddities. The INPUT type=date is a narrow data type and may not be as useful as you think at first glance. Interestingly, the specification says nothing about a user experience, i.e. a calendar popup control. Yet Chrome and other browsers implemented this functionality, which sounds great. But what stakeholder have you ever had that was happy with a fixed control like this out of the box? Rarely if ever, I would wager. These implementations are not stylable and therefore would not be usable in just about any project where I might try to use them. Instead it is a good idea to use a UI library for an experience like this, giving you more control. This is where the future of web components looks promising.
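Because Internet Explorer falls back to a plain text box for unsupported types, you can detect support yourself before deciding whether to reach for a UI library. Here is a minimal sketch of the classic detection trick; it takes the document as a parameter so the helper itself is engine-neutral (in a browser you would pass the real `document`):

```javascript
// Detect whether a browser implements a given INPUT type.
// Per the HTML spec, unrecognized type attributes fall back to 'text',
// so the reflected `type` property reveals support.
function supportsInputType(doc, type) {
  const input = doc.createElement('input');
  input.setAttribute('type', type);
  return input.type === type;
}

// In a browser: supportsInputType(document, 'date')
// → false in Internet Explorer 11, true in Chrome at the time of writing.
```

If the type is unsupported, you would then attach your library's date picker to the field instead of relying on the native control.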
Because of the ambiguous nature of the date types I think I would give them a value of 1, not 3. In comparison, I use the search type all the time, but HTML5Test.com gives it a value of 2. I would give it a 3. I would also reduce color to 1.
By changing the value weights for WebRTC and the input types, Internet Explorer now scores 378 of 530 points. Not quite as bad, but still only a 71% grade. Adjusting the value weights to 0 for rejected proposals and 1 for early proposals brings the top score down to 492 points, bringing IE up to 76%.
So what about Chrome? It scores a whopping 505 of 555 possible points. Does it really? Again, about 40 of those points are awarded because HTML5Test.com assigns value to proposals from the WHATWG, which I consider Chrome's personal standards body. Those, while sometimes very nice, are not part of the W3C specifications, which I hold as the actual standard. Removing those points brings Chrome's score to 465. Reduce the date input values and the score is down to 453. Take away WebRTC, 438, and I could go on. The point being, the modern version of each browser is pretty close to the others in actual HTML5 specification support.
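The reweighting argument above boils down to simple arithmetic, which can be sketched as a tiny script. The feature list, statuses and point values here are illustrative assumptions for one hypothetical browser, not HTML5Test.com's actual data:

```javascript
// Illustrative feature data (assumed, not HTML5Test.com's real values).
const features = [
  { name: 'WebRTC',        status: 'early-draft',    points: 15, supported: false },
  { name: 'input[date]',   status: 'recommendation', points: 3,  supported: false },
  { name: 'input[search]', status: 'recommendation', points: 2,  supported: true  },
];

// Total a score given a weighting function that maps a feature to its points.
function score(list, weightFor) {
  let earned = 0, possible = 0;
  for (const f of list) {
    const w = weightFor(f);
    possible += w;
    if (f.supported) earned += w;
  }
  return { earned, possible, pct: possible ? Math.round((earned / possible) * 100) : 0 };
}

// Site-style scoring: every feature counts at its published point value.
const asPublished = score(features, f => f.points);

// Adjusted scoring: early drafts are worth 0 until they near recommendation.
const adjusted = score(features, f => (f.status === 'early-draft' ? 0 : f.points));
```

The interesting part is that the browser's percentage changes dramatically under the two weightings even though its actual support is identical, which is exactly why the published scores deserve skepticism.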
Another site I like is CanIUse.com. The site lists various specifications, which you can select or search to see a support grid and access resources defining the specification and documentation. If a browser partially supports a specification it will display an 'off' color to visually indicate the partial state. Below the grid is a group of tabs with descriptive comments, known issues, links to resources and the ability to provide feedback.
Status.Modern.ie & Chrome's Status Site
Going back to BUILD, the Internet Explorer team launched the Status.modern.ie site, an online tool that shows which specifications are supported, and since when, not only in Internet Explorer but also in competing browsers. The Chrome team has a similar site; it is just not as eye appealing. The cool thing about status.modern.ie is you can filter the list in many different ways: by standard type, browser and version. You can also see if the Internet Explorer team is working on adding support for new specifications.
Unit Test - An Authoritative Test
Each HTML5 specification is complicated to understand. I would wager 99% of web developers have never taken the time to even read a specification. They are often very dry documents that look more appealing to a lawyer than to the average developer. Yet these specifications define a contract between web developers and browser vendors. They state what a developer should expect a browser to support if it claims to support the feature. Ultimately, these resources just make it easy for you to find a standard and see how the support landscape looks.
Unit tests are the only true way to determine if a specification is supported or not. Last year the IE team published a set of roughly 9,000 HTML5 unit tests. They have since shut the site down in favor of a new unit test resource maintained by the W3C. That site is very, very raw and more of a rough reference, but there are several thousand unit tests and the number is still growing. You can load any of the tests and immediately exercise your browser to see if it implements a specification or not.
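To make the idea concrete, here is a minimal sketch of the shape of a conformance test. The `test` and `assert_equals` helpers below are assumptions modeled loosely on the pattern the W3C harness uses; they are not the real testharness.js. The point is that a unit test exercises the behavior the spec requires, rather than merely checking that an API name exists:

```javascript
// Toy assertion helper: throws when actual behavior diverges from the spec.
function assert_equals(actual, expected) {
  if (actual !== expected) throw new Error(`expected ${expected}, got ${actual}`);
}

// Toy test runner: records a pass/fail result instead of crashing the page.
function test(name, body) {
  try { body(); return { name, pass: true }; }
  catch (err) { return { name, pass: false, message: String(err.message || err) }; }
}

// Example conformance check that runs in any JS engine.
const result = test('String.prototype.trim strips leading and trailing whitespace', () => {
  assert_equals('  hello  '.trim(), 'hello');
});
```

A scorecard site can only tell you a feature's object exists; a few hundred assertions like this one tell you whether the feature actually behaves as the contract says it should.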
Test The Web Forward
Test the Web Forward is an extension of the W3C unit tests, I believe. Similar to the Web Platform Docs site, it is an open opportunity for developers to add unit tests to the official body of tests. If the open source community works the way it should, this will be a way to rapidly expand the body of available tests for verifying browser support of feature specifications.
Jonathan Sampson's Test Site
My friend Jonathan Sampson has compiled real data by scanning through the window and document objects in Chrome, Firefox, Internet Explorer and Opera. This is another way to visualize feature support. Because he lists all the properties based on actual interrogation of the objects, it is quite accurate. It is also unbiased and assigns no point values to a feature. It is not as sexy as CanIUse.com because it is more raw data and is unsearchable, well, short of doing a text search.
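The interrogation approach itself is simple to reproduce. Here is a sketch that collects every property name reachable from an object, the way you might walk `window` in each browser, then diffs two snapshots. The two toy objects below stand in for different browsers' window objects:

```javascript
// Collect all own property names along an object's prototype chain.
function propertyNames(obj) {
  const names = new Set();
  for (let o = obj; o !== null; o = Object.getPrototypeOf(o)) {
    for (const name of Object.getOwnPropertyNames(o)) names.add(name);
  }
  return names;
}

// Names present on `a` but missing from `b`, sorted for stable output.
function onlyIn(a, b) {
  const bNames = propertyNames(b);
  return [...propertyNames(a)].filter(name => !bNames.has(name)).sort();
}

// Toy stand-ins for two browsers' global objects:
const browserA = { fetch() {}, requestAnimationFrame() {} };
const browserB = { fetch() {} };
// In a real browser you would diff `window` against a snapshot
// captured in another browser: onlyIn(window, otherBrowserSnapshot).
```

Because it simply reports what the objects expose, there is no room for arbitrary point values, which is what makes this style of data trustworthy.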
Knowing exactly what each browser supports and does not support is important when deciding how your application will behave. Assigning a score or value to each feature is a personal thing. Someone else's favorite feature may not have any value in your application context, so their point scale may not matter. For the most part all the browsers offer solid support for HTML5. Each supports some features the others do not, as well as creature comforts like Internet Explorer's Enterprise Mode. My tolerance of arbitrary scorecard sites like HTML5Test.com has reached an all-time low, and I give them a score of 1 out of a possible 5. My advice is to use unbiased tools like CanIUse.com, status.modern.ie or Chrome's Status site.