2014-05-28

Last week, Google published a post entitled Understanding web pages better on its Webmaster Central blog. The post discussed how the web has evolved from static HTML pages to richer content with lots of JavaScript and CSS, and what this evolution means for Google’s web crawlers. But, as we will show here, Google’s post also has some interesting implications for the future of web performance and SEO.

Technologies such as single page apps, client-side template engines, and offline capabilities have moved the construction and generation of content to the client. Instead of serving an HTML page with content to index, some applications serve a simple HTML shell and use JavaScript to fetch, format, and place content into the DOM. This creates a problem: how do you index this dynamic content? Google’s solution is to make its crawlers work more like web browsers, by downloading external resources and by executing client-side code. From the post:
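To make that pattern concrete, here is a minimal sketch of such a client-rendered page. The /api/articles endpoint and the #app container are hypothetical, but the shape is typical: a crawler that does not execute JavaScript sees only the empty shell, not the content this script produces.

```ts
// Hypothetical single-page-app shell: the HTML served over the wire is nearly
// empty, and the indexable content only exists after this script runs.
async function renderArticle(): Promise<void> {
  // Assumed API route; a real application would have its own.
  const response = await fetch("/api/articles/42");
  const article: { title: string; body: string } = await response.json();

  // The content a crawler would need to index is created here, client-side.
  const container = document.getElementById("app");
  if (container) {
    container.innerHTML = `<h1>${article.title}</h1><p>${article.body}</p>`;
  }
}

renderArticle();
```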

In order to solve this problem, we decided to try to understand pages by executing JavaScript. It’s hard to do that at the scale of the current web, but we decided that it’s worth it. We have been gradually improving how we do this for some time. In the past few months, our indexing system has been rendering a substantial number of web pages more like an average user’s browser with JavaScript turned on.

Google enhancing the capabilities of their crawler to be more like a browser certainly helps their indexing goal. However, I believe it will have an interesting fringe benefit as well: improving the performance metrics used in search engine rankings.

Performance’s relationship with SEO

While Google has announced many times that web page “performance” was a factor in its ranking algorithms, no one outside of Google knew what aspects of web performance Google used, and how much impact those metrics had on ranking. So, in August 2013, we worked closely with SEO researchers at Moz to try to determine what web performance metrics are used in search rankings, and what their effects were.

To answer these questions, Zoompf and Moz captured the top 50 search results for 2000 diverse search queries. We analyzed those 100,000 pages for over 40 different performance metrics using a large cluster of WebPageTest instances. The full details of our approach and the results are beyond the scope of this post, so I highly encourage you to review our research findings.

Our data showed no relationship between conventional metrics for web performance, like page load time or document render time, and search ranking. In fact, the only metric with a correlation to ranking was Time To First Byte (TTFB). This metric captures how long it takes your browser to receive the first byte of a response from a web server when you request a particular URL. In other words, TTFB encompasses the network latency of sending your request to the web server, the amount of time the web server spent processing and generating a response, and the amount of time it took to send the first byte of that response back from the server to your browser.
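As an illustration of what the metric covers (not of how Google measures it), here is a minimal sketch that reads TTFB for the current page using the browser’s Navigation Timing API:

```ts
// Rough sketch: responseStart - requestStart spans the request's trip to the
// server, the server's processing time, and the arrival of the first byte.
const [nav] = performance.getEntriesByType(
  "navigation"
) as PerformanceNavigationTiming[];

if (nav) {
  const ttfbMs = nav.responseStart - nav.requestStart;
  console.log(`Time To First Byte: ${ttfbMs.toFixed(1)} ms`);
}
```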

In our conclusions, we wrote (emphasis added):

[Our] data shows there is a correlation between lower time-to-first-byte metrics and higher search engine rankings. This means that, despite conventional wisdom, it is back-end website performance and not front-end website performance that directly impacts a website’s search engine ranking. The question is, why?

TTFB is likely the quickest and easiest metric for Google to capture. Google’s various crawlers will all be able to take this measurement. Collecting document complete or fully rendered times requires a full browser. Additionally, document complete and fully rendered times depend almost as much on the capabilities of the browser loading the page as they do on the design, structure, and content of the website. Using TTFB to determine “performance” or “speed” could perhaps be explained by the increased time and effort required to capture such data from the Google crawler. We suspect over time, though, that page rendering time will also factor into rankings due to the high indication of the importance of user experience.

Now that sounds familiar! Our conclusion was that more sophisticated performance metrics weren’t used because they would be difficult to capture at scale. In fact, Google’s post last week said as much about emulating a browser and executing JavaScript: “It’s hard to do that at the scale of the current web, but we decided that it’s worth it.”

Google’s effort to enhance its crawler to behave more like a browser means it can also collect richer performance metrics like window.onload, document complete, and fully rendered times. This paves the way for Google to use more meaningful performance metrics than TTFB in its ranking algorithm.
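Those richer, browser-only values are the kind of numbers the Navigation Timing API exposes once a page has fully loaded. The sketch below is purely illustrative; Google has not said which, if any, of these values it collects.

```ts
// Sketch of load metrics only a full browser can observe.
// Navigation timing values are relative to the start of navigation.
window.addEventListener("load", () => {
  // loadEventEnd is only populated after the load handler finishes,
  // so defer reading it until the next tick.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];
    if (!nav) return;

    console.log(`First byte received at: ${nav.responseStart.toFixed(1)} ms`);
    console.log(`DOMContentLoaded at:    ${nav.domContentLoadedEventEnd.toFixed(1)} ms`);
    console.log(`window.onload at:       ${nav.loadEventEnd.toFixed(1)} ms`);
  }, 0);
});
```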

The future of performance and SEO

As crawlers become capable of capturing richer performance metrics, we expect to see performance take on a larger role in search engine rankings. We already know that a fast website improves user experience, revenue, and conversion rates. Stronger ties between performance and search ranking would truly mean that “speed is a feature,” and not a secondary and optional objective.

If you are interested in how performance impacts SEO, you will love Zoompf. Zoompf analyzes your website for over 400 performance issues and tells you how to be faster. You can get a free performance scan of your website now and take a look at our full-featured performance product today!
