Chrome to Block HTTPS/HTTP Mixed Content by February 2020

Starting with Chrome 79 (December 2019), Google will start moving toward blocking all mixed, or non-secure, content loaded by a secure site. The December change is the first in a series of steps to phase in this behavior, and it will affect many sites that may not know they are serving mixed content.

Today most sites serve their content via HTTPS, providing a safe, encrypted experience for their visitors. Google reports that over 90% of browsing time is spent on HTTPS sites, and most search results are HTTPS.

If you have not upgraded to HTTPS, you should do that today. If your hosting or CDN provider does not offer free TLS certificates, you should invest in migrating to one that does.

Over the migration period, which runs from Chrome 79 in December 2019 to Chrome 81 in February 2020, the browser will auto-upgrade non-secure content to HTTPS where possible. The second phase, Chrome 80, will upgrade audio and video resources. The final step, Chrome 81, will do the same for images.

If your site is using mixed content, the browser will not display the padlock indicating a secure site, even if the document (HTML) is served via HTTPS. Instead, your visitors will see the 'Not Secure' chip in the browser's omnibox.

Chrome Not Secure Chip

Currently, if you have mixed content you don't get the secure padlock, but you don't get the 'Not Secure' chip either. You get something in between, which can be confusing even to someone like me.

What is Mixed Content?

Web pages are composed of different resources. This starts with the actual document, which is HTML. The document references additional resources by URL to compose the page. These resources are typically stylesheets (CSS), JavaScript, images, fonts, media, and other file types.

HTTPS Mixed With Not Secure Content

When the main document is served via HTTPS but some of its resources are loaded over HTTP, the page has mixed content.
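
To make that concrete, here is a minimal sketch in Python that flags insecure references in a sample page. Everything in it is hypothetical, including the example.com URLs; the point is that the document itself can be served over HTTPS while a src or stylesheet href still points at http://.

    from html.parser import HTMLParser

    # A hypothetical page served over HTTPS that still references a
    # stylesheet and an image over plain HTTP: that is mixed content.
    PAGE = """
    <html>
      <head>
        <link rel="stylesheet" href="http://cdn.example.com/site.css">
      </head>
      <body>
        <img src="http://images.example.com/hero.jpg">
        <script src="https://cdn.example.com/app.js"></script>
      </body>
    </html>
    """

    class MixedContentFinder(HTMLParser):
        """Collects subresource references that use plain HTTP."""

        def __init__(self):
            super().__init__()
            self.insecure = []

        def handle_starttag(self, tag, attrs):
            for name, value in attrs:
                # src covers scripts, images, and media; href only counts for
                # <link> tags (an <a href> is navigation, not a subresource).
                subresource = name == "src" or (tag == "link" and name == "href")
                if subresource and value and value.startswith("http://"):
                    self.insecure.append((tag, value))

    finder = MixedContentFinder()
    finder.feed(PAGE)
    for tag, url in finder.insecure:
        print(f"insecure <{tag}> reference: {url}")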

It is a problem because, even though the main document is fetched using HTTPS, the sub-resources are not, and they can be tampered with, altering the actual content rendered in the browser.

Think about it: if your page shares information, but a man in the middle can alter a script before it reaches the client, they can inject code into the script to alter the information you are sharing. They could also inject additional third-party scripts and more.

If you want to know more about how HTTPS works, I have a more detailed article that explains how the TLS encryption layer works and how it protects you from man-in-the-middle attacks.

HTTPS Secure

Even though the main document was loaded over a secure connection, the page can be compromised by a dependency loaded over a non-secure request. A single insecure request undermines the integrity of all the resources loaded by the page.

How Do Sites Have Mixed Content?

You may be thinking: I have a TLS certificate installed and HTTPS enabled, so aren't all my site's resources secure?

Maybe, but often not, unless you have taken the time to audit your content. Newer sites most likely won't have an issue, but older sites often will.

If you ever served your site using HTTP and then upgraded, you most likely have mixed content that needs to be corrected.

Third-party resources are another common source of insecure content. If you add scripts for services from outside parties, you should make sure not only that the script you are loading uses HTTPS, but that all additional URLs loaded by that script are secure as well.

You would be surprised how far your page's requests sprawl because of a third-party provider. I will show you how to audit for these leaks next.

If you find a third party that cannot provide an HTTPS reference, then I advise you to drop them: find a new provider or just build the feature yourself.

How Many Sites Suffer From Mixed Content Issues?

It's really hard to say exactly, but it looks like about 6% of the current web might suffer from mixed content issues.

Pulno, a website crawling and auditing service, audited its customers' websites to see how Chrome warning about and blocking mixed content might affect them.

They audited over 37 million web pages across more than 600,000 domains. They found over 2.2 million pages with mixed content, which affected almost 31,000 sites.

That is right at 6%. But I think it might be a bigger problem.

Sites that use a service like Pulno tend to be higher quality. This means the actual percentage of sites affected by mixed content is probably higher, more like 10%.

So let's expand that concept. If your site has mixed content, that is a clear signal to search engines not to rank your site. You will also suffer visitor abandonment when people see the warning, or worse, when your site breaks because a script fails to load.

Let your competition keep their mixed content; fix yours and you can start earning a slice of their traffic. It is a simple fix, and a service like Pulno can help.

How to Audit Your Site for Mixed Content

To find your insecure resources you need to audit every page on your site, which sounds like a tedious task. Fortunately, there are tools that will help.

The first place to look is your browser's developer tools (in Chrome and the new Edge): run the Lighthouse audit, and it will highlight insecure references.

But this is a page-by-page tool. You need something that will audit your entire site.
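
If you want a quick first pass before reaching for a dedicated crawler, a short script can do a rough version of the job. This sketch uses only Python's standard library and assumes a small site you control; START_URL is a placeholder, and a real audit would add robots.txt handling, politeness delays, and better error handling.

    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    START_URL = "https://www.example.com/"  # hypothetical; replace with your site
    MAX_PAGES = 50                          # keep the sketch bounded

    class RefCollector(HTMLParser):
        """Separates same-origin page links from insecure subresource URLs."""

        def __init__(self, base):
            super().__init__()
            self.base = base
            self.links = set()     # same-origin pages to crawl next
            self.insecure = set()  # http:// subresource references

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            for name in ("src", "href"):
                url = attrs.get(name)
                if not url:
                    continue
                absolute = urljoin(self.base, url)
                if tag == "a" and name == "href":
                    # Queue same-origin pages; a link itself is not a subresource.
                    if urlparse(absolute).netloc == urlparse(self.base).netloc:
                        self.links.add(absolute.split("#")[0])
                elif absolute.startswith("http://"):
                    self.insecure.add((self.base, tag, absolute))

    seen, queue, findings = set(), [START_URL], set()
    while queue and len(seen) < MAX_PAGES:
        page = queue.pop(0)
        if page in seen:
            continue
        seen.add(page)
        try:
            html = urlopen(page).read().decode("utf-8", errors="replace")
        except OSError:
            continue
        collector = RefCollector(page)
        collector.feed(html)
        findings |= collector.insecure
        queue.extend(collector.links - seen)

    for page, tag, url in sorted(findings):
        print(f"{page}: insecure <{tag}> -> {url}")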

Screaming Frog is a popular search engine optimization auditing tool. It has a view that allows you to filter requests by protocol. You should let the tool crawl your site, collecting all the references made by all your pages.

Then open the 'Protocol' tab. Just below the tab level there is a filter drop-down; select 'HTTP' to see all the resources loaded via HTTP.

Screaming Frog Filter by Protocol

Take this knowledge and update all the pages with insecure references. If you are lucky, you will only need to update a common shell that references scripts or stylesheets. Older sites will often have many older articles with image references that need updating.
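
If there are too many pages to fix by hand, you can script most of the cleanup. Here is a cautious sketch that rewrites http:// references to https:// in local HTML files, but only when the HTTPS variant of the URL actually responds. The site directory is an assumption; review the changes, and keep a backup, before publishing.

    import pathlib
    import re
    from urllib.request import Request, urlopen

    CONTENT_DIR = pathlib.Path("site")  # hypothetical folder of HTML files
    URL_PATTERN = re.compile(r'(?:src|href)="(http://[^"]+)"')

    def https_responds(url):
        """Return True if the https:// variant of url answers a HEAD request."""
        secure = "https://" + url[len("http://"):]
        try:
            with urlopen(Request(secure, method="HEAD"), timeout=10):
                return True
        except OSError:
            return False

    for path in CONTENT_DIR.rglob("*.html"):
        text = path.read_text(encoding="utf-8")
        changed = text
        for insecure in set(URL_PATTERN.findall(text)):
            if https_responds(insecure):
                secure = insecure.replace("http://", "https://", 1)
                changed = changed.replace(insecure, secure)
            else:
                print(f"{path}: no HTTPS version found for {insecure}")
        if changed != text:
            path.write_text(changed, encoding="utf-8")
            print(f"updated {path}")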

Another tool you should look into is WebPageTest. Again, this is a page-by-page process, but it is great for visualizing how third parties are affecting your site.

You will get a full audit of your page's speed and rendering profile, but there are also links to additional visualizations. One of those is Simon Hearne's third-party visualization tool. It renders a map of all the requests made by a page across different domains and shows which resource initiated each request.

Sort of hidden in the top right corner, just below the scorecard, you will see a link to 'Request Map'.

WebPageTest Request Map Link

Go ahead, try it out; you may be scared to learn what is actually happening and just how much control of your user experience you have outsourced.

Example WPT Request Map

In this example I tested a site that made 777 requests! It took almost 40 seconds to render over a high-speed connection and, as you can see, it has a scary request map.

You can drill into the visualization to see more detail. Each circle represents a domain, and the lines indicate which resource initiated the requests for that domain's resources.

So you are not 100% scared, this is what the request map for one of my pages looks like. It is more in line with what you want to see.

Simple Website Request Map

Another WebPageTest feature is running the Lighthouse audit as part of the testing process. You just need to enable it before running the test.

WebPageTest Capture Lighthouse

How to Grow Your SEO Success With HTTPS

We know HTTPS is a ranking factor. In fact, I rarely see any sites still using HTTP listed on the first page for any search query.

With about 10-15% of the web still not using HTTPS, there is an opportunity to steal, or pick up, valuable links from those sites. With Chrome indicating that a site with mixed content is not secure, you have an additional hook when reaching out to sites for backlinks.

Even if a site is using HTTPS, its mixed content might be enough of an opening for you to pick up a link.

If you know a site you want to outrank, or possibly a site you want a link from, you can perform the same audit on their site that you perform on your own.

First, how to steal links from a competing mixed-content site.

The easiest way to identify them is to search for a keyword phrase you want to rank better for. I am going to use a SERP extractor tool provided by Chris Ainsworth to make the job easier.

I have my Google search results set to 100, the maximum, to simplify the process even further.

I randomly decided to search for 'Texas Fly Fishing Guides', a very long-tail term that should return a nice list of results. Sure enough, you can quickly spot insecure sites starting around position 27.

Texas Fly Fishing HTTP Serps
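
If you export the ranking URLs to a plain text file, one per line (serp_urls.txt is a hypothetical name), a few lines of Python will surface the plain-HTTP competitors and their positions:

    from urllib.parse import urlparse

    # serp_urls.txt: one ranking URL per line, in SERP order (hypothetical file).
    with open("serp_urls.txt", encoding="utf-8") as handle:
        urls = [line.strip() for line in handle if line.strip()]

    insecure = [u for u in urls if urlparse(u).scheme == "http"]
    print(f"{len(insecure)} of {len(urls)} results are still plain HTTP")
    for position, url in enumerate(urls, start=1):
        if urlparse(url).scheme == "http":
            print(f"position {position}: {url}")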

Now you need to know who links to these sites, which can be tricky unless you have a tool like Ahrefs handy.

Plugging the first site into Ahrefs and analyzing the backlinks shows there are over 80 do-follow links, many from some nice sites and pages. From here you can reach out to those linking sites to see if they would be willing to link to you, a secure website, and maybe even replace the competing site altogether.

Example Ahrefs Tool For Backlinks to Steal

You need to have matching content, which you may need to create. But if you are a Texas fly fishing guide it may be more than worth your time to create matching content and earn more traffic to grow your business.

If you spot competition that has mixed content, which will take a little more effort than just scraping search results, you can apply the same technique to earn more links to your secure site.

Summary

Google Chrome is not the only browser gradually increasing the pressure on non-secure sites. Edge, Firefox, Opera, Samsung Internet, and Safari also display secure and not-secure chips and messages in their address bars. Each browser has its own criteria and warning user interface, but in the end they all serve the same purpose.

Firefox Green Padlock Certificate Information

The web should be a safe, secure place. HTTPS is the baseline all sites should be using today to make their content secure by default.

Remember, you can't be a progressive web application, or even access many modern web platform features, without using HTTPS. For example, geolocation and service workers require HTTPS.

Use tools like Screaming Frog, WebPageTest, and others to help you identify where your site may have non-secure references. Then take the time to update those references to use HTTPS.

If you don't have a TLS certificate installed on your site feel free to reach out and maybe we can help you out. All sites we develop and host are HTTPS by default.
