Why Brian Dean is Wrong About Page Speed & SEO

Does Page Speed Affect SEO?
All marketers want engaging websites. Our customers want good sites.

Research shows we want websites that render fast, as in under 3 seconds fast.

Google and Bing both score pages on how fast they render and use that as a ranking factor.

But how does page speed relate to earning higher search results?

Brian Dean posted a video of a small, anecdotal study he performed to determine how page speed affects page rank.

This sort of study always catches my attention because page speed is a passion and specialty of mine.

If you don't know, Brian runs a very popular SEO site and YouTube channel where he often shares in-depth research.

This research combined with his ability to deliver content is why his site and videos are so popular.

But sometimes research and studies are flawed.

This can lead followers to make bad decisions affecting their online success.

Brian's recent study is flawed.

His conclusions will lead many down a path of destruction. (play ominous audio clip here)

Brian had noticed many high ranking pages were slow and wondered how they could rank if page speed matters.

In recent years Google and Microsoft have both stated page speed is used to rank pages. Since slow pages are ranking, Brian wanted to see if it mattered.

His research focused on a small, one page case study. He tinkered with a page on his site.

He made the page 'load faster' to see how it affected the page's ranking.

He chose this page because it ranks on Google's first page for 'SEO Tips'.

This is a rather low volume search term, but has super high competition. It is also of interest to me since I have a page with over 20 SEO tips I want to show in the results.

The research method missed key metrics and shows a lack of knowledge about how page speed is measured. I will cover the main points in this article and show how his research is flawed.

I also made a fast version of the page to compare.

Unfortunately most search marketers misunderstand how page speed is measured. Many developers miss these metrics too. Honestly, developers seem to ignore anything related to user experience, but that is just my personal experience working with them.

So pay attention, because I will demonstrate many common mistakes in this article.

I also took the time to create a better optimized version of his page. I measured its page speed with both the Google PageSpeed Insights tool and WebPageTest.

You should make page speed a primary requirement for your pages and target 3 seconds or less on mobile.

But what does a good page speed score or time mean?

How can you measure your page speed?

How can you fix your site?

And does page speed matter?

I review the answers to these questions and more. I will also share fixes for Brian's page. In the process you will see the common mistakes holding most websites back.

I also demonstrate why his results are inconclusive at best.
If you want to see my optimized page just load it for yourself.

The source code is on GitHub.

Examining Brian's Test Page for a Baseline

Brian picked the SEO Tips page because it scores poorly in both Google PageSpeed Insights and WebPageTest.

First, let me applaud his choice of tools. The Google PageSpeed Insights test is pretty good and its results are easy to read.

Backlinko Google Page Speed Insights

WebPageTest is the ultimate page speed test tool because it is so thorough. It of course is my favorite page speed tool.

But the top level scores these tools report are not where you want to focus.

You must look at the low level details and think like a browser.

Backlinko WebPageTest Metrics

How Brian Improved Page Speed

First let's review some of the key things Brian said were 'improved'.

Backlinko Page Speed Improvements

The first thing he shared was that the page uses lots of large images. And there are many of them. He is using responsive images, so they are not too large dimensionally.

He is using the wrong image format for many and I will touch on that later.

When I analyzed the page it made over 600 requests, most of them images!

I am going to cut to the chase and share a spoiler: images are not Backlinko's problem.

He pointed out the page is long and longer pages take longer to load.

While there is some truth to this, it is not why the Backlinko page 'loads slowly'.

They trimmed some of the frivolous JavaScript, which does help the site render faster.

They also reduced the CSS payload, again always a good idea. And after cleaning the page's CSS I agree. I removed nearly 1MB of CSS alone.

They also did some things that actually slowed the page down, like removing the favicon. I have written about favicons and page speed.

In the end the page scored 100 in the Google Page Speed Test, which is awesome.

But after a few weeks the effort did not move the needle in the search ranking, the target goal.

Why?

Hang on I will get to the why soon.

First, let's look at some numbers.

Backlinko's Page Speed Test Score and Recommendations

I don't know exactly what the page's scores were before the effort. But Brian said they reverted everything back after the test to have a 'slow' page.

So taking him at his word I am able to run the same test to get some data to figure out how to correct the flaws.

First, I get a Page Speed Test score of 15, which is close to what Brian shared. Any test tool will experience slight fluctuations. You should always run multiple tests, from different locations, client devices and browsers.

The first number that stands out to me is the first contentful paint. This is the time it takes before the first meaningful pixels are rendered.

3.2 seconds is slow, but not the end of the world.

The other times are off the charts bad on the surface. But you need to know what is being measured. And yes they are bad because of the images.

Next there is a whole slew of warnings, which I will cover a little more when I get to what WebPageTest tells me.

Backlinko Page Speed Warnings

Each of these warnings is due to misconfigured image URLs, a simple fix.

One key recommendation the test does reveal is to 'defer offscreen images'. This is the point that will really help Brian's pages.

If Backlinko lazy loaded images and other assets, the dozens of images would not delay the page. It would not matter how many there are or how large the files are.

Defer Images
The images would not load until the visitor scrolls them into view. That is how I configure all my sites.
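A minimal sketch of that pattern looks something like this. The data-src convention and the 200px preload margin are illustrative choices, not the exact code running on my sites:

```javascript
// Minimal lazy loading sketch: images ship with data-src instead of src,
// e.g. <img data-src="/img/seo-tip-01.png" alt="..." width="600" height="400">
const lazyImages = document.querySelectorAll('img[data-src]');

const imageObserver = new IntersectionObserver((entries, observer) => {
  entries.forEach(entry => {
    if (!entry.isIntersecting) return;
    const img = entry.target;
    img.src = img.dataset.src;      // start the real download
    img.removeAttribute('data-src');
    observer.unobserve(img);        // each image only needs to load once
  });
}, { rootMargin: '200px' });        // begin fetching a little before it is visible

lazyImages.forEach(img => imageObserver.observe(img));
```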

I also defer any YouTube embedding. I have a small script that injects the YouTube scripts when the user wants to play the video. This keeps about 500kb of YouTube JavaScript from loading until it is needed.

Instead I load the video thumbnail as a placeholder. When clicked, it swaps in the actual YouTube code. Since the majority of readers do not watch the video, this keeps them from paying the fat JavaScript tax.
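Here is a rough sketch of that idea. The facade markup, class name and data attribute are placeholders, not my exact script:

```javascript
// Click-to-load YouTube sketch: a thumbnail placeholder stands in for the
// embed, so the player JavaScript never downloads unless the reader asks for it.
// Assumed markup: <div class="yt-facade" data-video-id="VIDEO_ID"><img ...></div>
document.querySelectorAll('.yt-facade').forEach(facade => {
  facade.addEventListener('click', () => {
    const iframe = document.createElement('iframe');
    iframe.src = `https://www.youtube.com/embed/${facade.dataset.videoId}?autoplay=1`;
    iframe.width = '560';
    iframe.height = '315';
    iframe.allow = 'autoplay; encrypted-media';
    iframe.allowFullscreen = true;
    facade.replaceChildren(iframe);   // swap the thumbnail for the real player
  }, { once: true });
});
```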

But there are more things we can learn from the details WebPageTest tells us.

WebPageTest Results

I ran a test using the primary location in Washington DC. The test uses a standard Cable Internet connection and Chrome.

So desktop.

You can run a WebPageTest from many locations around the world. You can also use different browser combinations and more. I kept it simple because I got the data I need from this test.

At first glance I see bad things man, bad things.

Before I dive into the details it is important to clarify how to measure page speed.

Most think page speed is the time it takes the browser to 'load' a page from the server.

That is time to first byte. It is not a common issue today, although server-side bottlenecks can kill you before you begin.

A good time to first byte is 250-500 milliseconds.

Backlinko is fine in this department.

Instead page speed is about your content rendering and becoming interactive. This is why WebPageTest is the best tool to measure page speed. You get a deep set of details to see what a real user experiences.

First, 624 requests! Really?

That should be illegal.

Lazy loading images and assets will fix a majority of that problem.

For the record I count 3 images that would be loaded up front if the page lazy loaded images.

Backlinko Above the Fold Images

If I look at the requests from my own browser I count 413 images, and only 588 total requests.

Backlinko Image Requests

The total request discrepancy is due to an ad block extension.

Lazy loading images should reduce the initial impact by 410 requests!

Score!

You should always try to limit the number of HTTP requests needed to load a page. HTTP connections are expensive to create, which is why HTTP/2 is important. It manages connections efficiently.

A side note. Images are not the real problem with page speed, JavaScript is.

What is Blocking vs Non-Blocking?

Images are not a blocking operation. They are loaded on a separate thread.

JavaScript is loaded on the primary UI thread. It blocks all rendering while the script is loaded, evaluated and executed. This is the primary reason why you need to avoid excess JavaScript.
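One simple way to keep non-critical scripts off the critical path is to inject them after the page has loaded. This is just a sketch of the general technique, with hypothetical file names:

```javascript
// Injected scripts never block the HTML parser, and waiting for the window
// load event keeps them out of the initial rendering work entirely.
// (For scripts that must stay in the markup, defer and async are the
// declarative equivalents.)
function loadScriptLater(src) {
  const script = document.createElement('script');
  script.src = src;                  // dynamically added scripts load asynchronously
  document.body.appendChild(script);
}

window.addEventListener('load', () => {
  // hypothetical file names; substitute the real analytics or marketing tags
  ['/js/analytics.js', '/js/newsletter.js'].forEach(loadScriptLater);
});
```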

Brian's page suffers from excessive JavaScript.

Most pages suffer from this today.

Personal comment here: Most developers are more interested in writing more code than making a page load fast. This is why average JavaScript payloads are growing past 5MB when 20-50kb will do.

Backlinko JavaScript Requests

29 JavaScript requests, not the worst I have seen. But it loads a whopping 3.3MB of JavaScript. That is like loading 500MB of images, at least that is what I will relate it to.

Many of the scripts were duplicates. Wasted requests, bandwidth and extra processing with 0 gain.

Just to brag a little. I reduced the JavaScript payload to 4kb (there are some caveats I cover later).

An article like Backlinko's SEO Tips does not need much JavaScript. A few dozen kilobytes should do.

I know, I create pages like this all the time.

The Third Party Effect

If we look at the page's request map it is not terrible, compared to many I have reviewed.

Backlinko Request Map

There are 2 distinct script solar systems that could be completely eliminated.

One, the teal green group on the left, is the YouTube script ganglia. Applying my lazy load approach removes them completely.

Next, are the orange scripts, all related to OptinMonster. This is how Brian collects emails for his newsletter.

I hate these third party scripts. Brian can streamline the lead magnet form by writing simple HTML, a few lines of JavaScript and CSS.

I changed it to use a few lines based on IntersectionObserver.

As I cleaned up this page for my test I found blocks and blocks of injected scripts, CSS and HTML from OptinMonster.

Dozens of the blocks were duplicated, a bad practice. This is an easy test of code quality, and OptinMonster's quality is really bad.

I did some simple triage on the OptinMonster's inline styles and was able to remove over 200kb of payload.

A quick scan through their CSS was a lesson in how not to write CSS. So many mistakes made it obvious they do not care about their product quality.

Multiple class repeats, inline styles, duplicate rules and heavily over-specified selectors. I could write a 5000 word article on how not to write CSS using their code.

So many code smells.

While CSS is not as heavy as JavaScript it is still a drag on page rendering.

My pages go through a CSS optimization step that extracts just the CSS rules used by the page. It then inlines the styles in the document HEAD. Since most pages need 1-5kb of CSS this can have a noticeable impact on page speed.

Plus it only needs to be performed when the page content is updated, not on each request.
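The exact tooling matters less than the idea. Here is a rough build-time sketch using PurgeCSS, one tool that can do the extraction; the file paths are hypothetical:

```javascript
// Build-time sketch: keep only the CSS rules the page uses and inline them in
// the HEAD. PurgeCSS is one tool that can do the extraction; paths are made up.
const fs = require('fs');
const { PurgeCSS } = require('purgecss');

async function inlineCriticalCss(htmlPath, cssPath) {
  const [result] = await new PurgeCSS().purge({
    content: [htmlPath],   // the rendered page markup
    css: [cssPath]         // the full site stylesheet
  });

  const html = fs.readFileSync(htmlPath, 'utf8');
  const inlined = html.replace('</head>', `<style>${result.css}</style></head>`);
  fs.writeFileSync(htmlPath, inlined);
}

inlineCriticalCss('dist/seo-tips.html', 'src/site.css').catch(console.error);
```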

As I cleaned up the page I found about 1MB of CSS. I reduced this to about 35kb and inlined it in the document HEAD.

I will note, I did a quick clean up. To make it as small as possible I would start from scratch, even the HTML. In all honesty I could reduce the combined HTML & CSS payload to under 100kb.

Still there were dozens if not hundreds of inline styles declared in elements. This is a horrible practice and should be avoided for many reasons.

As a note. When you use a third party you are outsourcing your success to their quality.

In Brian's case he takes a heavy hit with this third party. Ultimately, he pays for it by either working harder to build his brand or simply bleeding additional revenue.

This is common when you subscribe to a third party service. They typically provide low quality code. This reduces your ability to convert visitors to customers more than they help.

Writing vanilla code would drop a few dozen requests and external dependencies. It also reduces the size of the markup and CSS measurably.

You can still collect emails. If OptinMonster's back-end is worth it, you should be able to supply the information to their API. Their front-end is not worth using.

I added a few lines of JavaScript to show his lead form and post the email back to an API.

Yes, for a real, production page I would put in a little more effort to the script for the popup. But it would add a handful of JavaScript lines, not megabytes.
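To give a feel for the scale, a stripped down version of that handler looks something like this. The form id, field name and endpoint are placeholders, not OptinMonster's real API:

```javascript
// Bare-bones lead form handler: grab the email and post it to an API endpoint.
// The #lead-form id, the email field name and /api/subscribe are placeholders.
document.querySelector('#lead-form').addEventListener('submit', async event => {
  event.preventDefault();
  const email = event.target.elements.email.value;

  const response = await fetch('/api/subscribe', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ email })
  });

  if (response.ok) {
    event.target.innerHTML = '<p>Thanks! Check your inbox to confirm.</p>';
  }
});
```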

Remember I am trying to work within the confines of an existing page.

Back to lazy loading content.

Comments and Gravatar Images

The article itself does not use 413 images; that would be incredible.

Most of the images are small mug shots of commenters. Each time a comment is added the Backlinko comment system fetches the commenter's Gravatar photo.

A nice, personalization touch!

But it adds weight, and most visitors won't bother with the comments.

Instead the entire comment section should be lazy loaded.

Sure, the search engine spider will not read those comments. But I am in the camp that comments don't add SEO value. I know some studies claim they do, but I doubt it.

I believe articles with large comment chains rank better because they already have good ranking signals. The comments are a natural by-product.

I could go on an off-topic rant here, so I will spare you and stay focused on making your pages better.

If you want a good example of lazy loading comments look no further than YouTube. Comments are not loaded by default, you have to scroll down to trigger their load. Follow that lead.

I set this up in my version of the page.

Since about 95% of the images in the page are Gravatar images, all those requests go away!

Clean HTML

Before I move on I want to say something about clean HTML. Even after I cleaned up the page the HTML was 241kb. This is very large, even for a 10,000 word article.

This is due to over-nested HTML and inline styles. I did not change the HTML structure for this test.

I looked it over to see if there was some quick triage to perform. The nesting scared me off since this is a quick fix sort of effort.

Out of the areas I worry about, HTML depth is the last thing. HTML is processed very quickly by browsers. Yes, Backlinko could remove about 50% of the markup, but it would take a monumental effort without just starting fresh.

If you want an order to fix issues follow this list:

  • JavaScript
  • Reduce the Number of HTTP Requests
  • Lazy Loading Assets
  • CSS
  • Images
  • HTML

Image Optimization

The final thing I will point out is to optimize images.

I said earlier that images were not the main problem. I will stand by that statement.

When you see images managed as poorly as Backlinko's it is difficult not to work on them.

Oversized Images

Based on my WebPageTest run, about 25MB of the 35MB payload is images.

My version starts with 157kb and completes with 8.7MB, not counting the Gravatar photos.

One caveat.

There is an animated gif file used on the page that weighs nearly 10MB. I did not include that in my total.

I did not modify that file yet. I may in the future, but I worried it would break the animation.

However I would recommend changing this to a video, which will be smaller.

The animation does not add value to the message. Changing it to a jpg showing part of the infographic would be even better.

From what I can tell Brian may not know the jpg image format exists. He uses nothing but PNG files.

Portable Network Graphics (PNG) is an image format best suited for line art and illustrations. It may or may not be more efficient than GIF, but it is terrible for photographs.

Many of the images the target page loads are photos, or would at least be better served as jpg. The PNG format makes them very large.

I will demonstrate with a single image:

Example Optimized Image

As a PNG the file's size is 408kb. Optimizing the PNG reduces the file size to 280kb.

But changing the format to jpg reduces the file size to 72kb. That is a 336kb savings!

Now, let's scale that to all the images. Assuming similar savings across the 413 images, Brian could cut the roughly 25MB image payload by 20MB or more!

Of course some images should stay PNG or GIF. I may play around and see if I can get an accurate number.

I wrote a simple node script to compare the two versions locally. It updates the HTML to reference the smaller file size version.
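The script is nothing fancy. Here is a sketch of the idea, assuming a pre-generated .jpg sits next to each .png:

```javascript
// Sketch of that comparison script: for each PNG with a pre-generated JPG
// sibling, reference whichever file is smaller in the page markup.
// Folder layout and file names are assumptions.
const fs = require('fs');
const path = require('path');

const htmlFile = 'dist/seo-tips.html';
let html = fs.readFileSync(htmlFile, 'utf8');

for (const file of fs.readdirSync('dist/img')) {
  if (path.extname(file) !== '.png') continue;

  const png = path.join('dist/img', file);
  const jpg = png.replace(/\.png$/, '.jpg');
  if (!fs.existsSync(jpg)) continue;            // no JPG version generated

  if (fs.statSync(jpg).size < fs.statSync(png).size) {
    // point the markup at the smaller file
    html = html.split(path.basename(png)).join(path.basename(jpg));
  }
}

fs.writeFileSync(htmlFile, html);
```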

These are all actionable items most sites should apply. Eyeballing the page's waterfall, these changes would keep the page under the 3 second threshold.

But there is one more thing that stands out.

Properly Referencing Page Dependencies

Notice all the yellow in the waterfall?

301 Image References

Those are typically network requests without a Cache-Control header.

Not in this case!

Backlinko has image requests that generate a 301 redirect response.

Why?

The site references the images using the CDN origin instead of the images on backlinko.com. The CDN domain has an auto redirect (301 status code) to backlinko.com.

This means an extra, expensive network request and a hop back to the site for each one. I count about 85. Those references need to change to use the backlinko.com domain, not the CDN alias.
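The fix is little more than a find and replace on the markup, something like this sketch with a placeholder CDN hostname:

```javascript
// Essentially a find and replace across the markup. The CDN hostname below is
// a placeholder; the real alias shows up in the WebPageTest waterfall.
const fs = require('fs');

const file = 'dist/seo-tips.html';
const html = fs.readFileSync(file, 'utf8');

const fixed = html.replaceAll(
  'https://cdn.example-alias.com/',   // CDN origin that 301s back to the site
  'https://backlinko.com/'            // canonical domain the redirect points to
);

fs.writeFileSync(file, fixed);
```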

You want to reference everything on your domain, not an external domain when you can. HTTP connections are expensive to create.

Brian is using HTTP/2, which everyone should be utilizing. This optimizes the connection to the origin.

It's one of the many reasons why all sites should use a content delivery network service. They all support HTTP/2, a more optimized delivery protocol.

The final optimizations I made were to use IntersectionObserver to lazy load the comments. It uses a simple fetch call that sets the comment markup as the innerHTML of the comment wrapper.
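The whole thing is only a few lines. A sketch of that comment loader, with an assumed endpoint and wrapper id:

```javascript
// Comment section lazy load: watch the empty comment wrapper and only fetch
// the comment markup when the reader scrolls near it. The endpoint and the
// #comments id are assumptions.
const commentWrapper = document.querySelector('#comments');

const commentObserver = new IntersectionObserver(async (entries, observer) => {
  if (!entries[0].isIntersecting) return;
  observer.disconnect();                               // only load once

  const response = await fetch('/api/comments?post=seo-tips');
  commentWrapper.innerHTML = await response.text();    // inject the rendered comments
}, { rootMargin: '400px' });

commentObserver.observe(commentWrapper);
```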

I also display one of his pop-ups as the user scrolls the second H2 (sub-heading) into view. It is visible for 10 seconds.

For a real, production version my pop-up code would be a little more sophisticated. But not much more code is required.
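The trigger is just another observer plus a timer. A sketch, with an assumed pop-up element and CSS class:

```javascript
// Pop-up trigger sketch: show the lead form once the second sub-heading scrolls
// into view, then hide it after 10 seconds. The #lead-popup element and the
// 'visible' class are assumptions.
const secondHeading = document.querySelectorAll('h2')[1];
const popup = document.querySelector('#lead-popup');

const popupObserver = new IntersectionObserver((entries, observer) => {
  if (!entries[0].isIntersecting) return;
  observer.disconnect();                 // only trigger once per page view

  popup.classList.add('visible');
  setTimeout(() => popup.classList.remove('visible'), 10000);
});

popupObserver.observe(secondHeading);
```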

After I performed my triage on the page my version was fully loaded in 1.06 seconds. The Speed Index is 513, or 0.513 seconds.

Optimized Backlinko WebPageTest Scores

The initial payload is less than 147kb before unzipping the text assets, and 423kb after unzipping those files.

There are 11 initial requests.

Optimized Backlinko WebPageTest Waterfall
I also did not include the Google Analytics code or tracking pixels. On my site the analytics code is the biggest drag on page loading. However, the script is loaded asynchronously.

As for tracking pixels I use just the pixel, not the slow scripts most sites use.

There are dozens of other optimizations Brian could make on his site. I did the big ones staring me in the face.

Optimized Backlinko Page Speed Insights 100

My version scored 100 in the Google page speed test.

Of course my version is not ranked, nor am I looking to have it rank. So I can't test that part. But I can speculate.

The Baseline - Does Page Speed Affect SEO Rank?

The simple answer is, it's complicated.

But, what Brian did to speed up his page did not really speed it up! At least not in the way actual page speed is measured.

You see Brian was looking at total time for all assets to load. Not time to render or first interaction.

It is the time to first interaction that really matters.

Why?

As I mentioned earlier JavaScript blocks or locks the UI. Images don't.

Backlinko loads a lot of images on its pages. They add real value to the content. At least the images in the article.

But they do not slow the rendering process.

If you look at the Speed Index, a key metric for actual page speed, it is not bad. Speed Index measures how much of the visible page has rendered over time.

It focuses on when above the fold pixels are rendered using some basic calculus.
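For the curious, the 'basic calculus' is just the area above the visual progress curve. WebPageTest computes it roughly as:

Speed Index = ∫ (1 − VC(t) / 100) dt, integrated from the start of navigation to the end of loading

where VC(t) is the percentage of the above the fold view that is visually complete at time t. The sooner the viewport fills in, the smaller that area and the lower (better) the score.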

In my test it was 3 seconds, not bad.

All the optimizations Brian made affect time after this point. Not what search engines are looking for.

If the user can scroll the page, it is interactive. If the page is locked while scripts are being evaluated that is bad.

But an image loading 10 'page down taps' away from the initial view is not 'disrupting' the page's usability.

This is why Brian's effort had a minimal impact.

Now if Brian's updates had moved the time to interaction needle closer to 1 second there might have been a tangible movement in ranking.

I think there is more we can learn.

Evaluating the Competition

This leads me to a final chapter in this saga.

How does the competition fare?

I mean Brian did this study because he routinely sees slow pages ranking high.

Why?

Let's look at the rest of the top 10 'seo tips' results.

  • HubSpot - 4.6s speed index
  • GoinesWriter - 9.252s speed index
  • AHrefs - 2.954s speed index
  • Neil Patel - 1.211s speed index
  • OptinMonster (ironic, I know) - 1.59s speed index
  • Search Engine Journal - 2.851s speed index
  • Entrepreneur - 2.963s speed index
  • Databox - 4.319s speed index

I will note the AHrefs article has over 75 tips, the most of any article on page one. Yet the page is the smallest, 650kb!

They also do not support user comments, just an observation.

Why?

They lazy load content, a primary recommendation I made in this article!

By the way, I can see a few low hanging fruits to improve the AHrefs page even more 😉.

For the record I have a rather long article on SEO Tips myself. My page has more tips than Brian's and might even be longer. It also loads way faster.

It sports a 1.3s speed index score and is fully loaded in 2.15 seconds.

The initial payload size is 196kb, even better than AHrefs 😋.

It is not even listed in the top 100 for 'seo tips' in the Google results!

Why?

Most likely because I don't have any competitive backlinks coming to the article.

This is a very competitive keyword. The SERPs are owned by some big names throwing domain authority around. These articles also tend to have very strong backlink profiles.

At the same time, traffic is limited. Maybe 3000 global searches per month. But it has business value so effort is put into those first page rankings.

This means it is difficult to move those listings.

I mean the first result, as I write this article, is the GoinesWriter article. It has a 9+ second speed index, the worst of the bunch!

But I think the site has other signals that overpower everything else. This could be backlinks, direct traffic, social signals (do those really matter?), etc.

In other words, there are over 200 known ranking factors (according to Backlinko). The GoinesWriter article has other things going for it.

Brian does need to improve the user experience by improving page speed. He could cut the actual page render time by a second or more.

This might, after a month or so, move him up one or two positions.

In the end applying these optimizations will save visitors bandwidth. It will also reduce Backlinko's server load and monthly bandwidth charges.

Getting more links would move the needle more.

I have a theory. If you want more backlinks make a faster page!

You see, page speed can be measured and it is a single signal.

But it does not stand alone. Page Speed affects most of the remaining signals because it changes visitor behavior.

Visitor behavior is expensive to measure.

If you give them a good user experience they will behave more favorably. This lifts your site and page profile in the eyes of search engines.

Brian gave it 2 weeks. This was not enough to change Google's page profile. My gut tells me they keep a running average profile for your page. The page has some decent age on it and thus an established history.

A two week anomaly may not be enough to move the needle.

Also consider the page gets 150 weekly page views. I bet a decent amount are direct due to Brian's brand engagement. This also reduces some signals to the search engine.

150 page views is not many. Not enough to generate enough real data to move a needle over 2 weeks.

If there were 1500 a week then you would be at a level that might matter in a 2 week span.

I know having fast pages is a key reason why my site ranks well for thousands of keywords. But I am not naïve enough to think it is the only reason.

For now, backlinks are still king. Google and Microsoft guidelines are key to creating quality content and user experience.

As I complete this article Brian's page has moved to #2 on Bing and #6 on Google. I suspect this is due to increased links and page views due to his page speed study.

Happy customers translate to better search rankings, which of course bring more visitors. It is a wonderful cycle.

Summary

Every site can and should improve page speed. Even me!

But sometimes it is not the problem.

In Brian's small case study there are many factors at play. The level of competition for this low volume keyword makes it tough to move the needle.

His page's time to first interaction is not in a terrible range. Compared to the other top ten results the Backlinko site is within tolerance.

The test page has around 200 backlinks. I did not do a formal backlink profile compared to the competition. It might be weak compared to the top results.

Unfortunately, Brian has shared that page speed does not seem to be a ranking factor.

This is wrong.

Brian did not measure his existing time to first interaction. Instead he focused on the wrong metric, final load time, and fixed the wrong page issues.

Or more accurately, did not correct issues the right way.

It pays to understand how a browser renders content. You also need to know what search engines mean by page speed.

Once you understand things like the critical rendering path and how different assets are processed you can design your pages to load super fast.

Fast pages make visitors happy. Happy visitors convert to happy customers, at least eventually. And that is what we really want.

Search engines know what people want and they reward you for matching those signals.

Page speed is the foundation of good user experience and collectively that impacts your page authority.
My optimized version can be studied here.
Note: As I go to publish Brian has updated the article header. My numbers are based on the target page the day I first watched his video.

Googles Ads Facebook Pixel Bing Pixel LinkedIn Pixel