2014-05-16



Depending on who you speak to, conducting SEO audits, in the traditional sense, is either the most gratifying or the most mind-numbing aspect of life as a search marketing consultant. The very name “auditing” implies an exhaustive dismantling of a website’s approach to information architecture, coding, content, and metadata, all crescendoing into the delivery of some dizzyingly verbose Excel spreadsheet or fancified PDF document rife with opportunities and ways to game the system.

That’s the definition of auditing in the traditional sense. The audit has a timeline, a beginning, an end, and a hard deliverable. Many SEO consultants, including this one, have used the auditing period as an important stepping stone in building credibility with clients. It is, after all, the most tangible contractual deliverable we can provide. It’s part of every client onboarding process, and we do it this way because that’s the way it’s always been done.

But is it possible the industry has outgrown its traditional approach to SEO auditing without stopping to look at what’s actually working anymore? After all, SEOs often speak about how dynamic our marketplace has become, and yet many are still auditing websites the way they did in 2005. But instead of dismissing auditing as an unimportant or antiquated process, SEOs should embrace it and take a 2014 perspective on the things we believe to be critical components of on-page success and those that may be past their heyday. So, without further ado, let’s pick apart some of the core tenets of on-page optimization and give them a fresh perspective.

SEO Tenet

Duplicate content is bad.

2014 Reality

This assumption, in itself, isn’t necessarily wrong, but such a rigid statement betrays a complete detachment from the actual intricacies of the modern web and how a search engine crawler works. In fact, the presence of duplicate content is now a very natural reality for most web properties. There are certainly instances of egregious and manipulative duplicate content that should be axed entirely, but the real value you can glean from your SEO consultant now lies in how to appropriately handle the various manifestations of duplicate content that regularly occur.

As Google indicated late last year (when revealing that nearly 30% of the web is actually duplicated content!), their crawlers have evolved in the ways they index and surface instances of duplicate content. While they reserve the right to downgrade the performance of sites engaging in malicious content duplication, duplicate content, in and of itself, is simply not considered web spam. In fact, duplicate content can present a positive and streamlined user experience when properly declared.

Between canonicalization, no-indexing, declarations of alternate versions of your site, and the URL-parameter crawl controls now available in Google’s Webmaster Tools console, there’s a multitude of ways to deal with most duplicate content issues.
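As a quick sketch of what these on-page declarations look like (using hypothetical example.com URLs; parameter handling, by contrast, is configured in the Webmaster Tools console rather than in markup):

<!-- rel="canonical": declares the master URL when the same content
     resolves at multiple addresses (session IDs, sort orders, etc.) -->
<link rel="canonical" href="http://www.example.com/widgets/" />

<!-- noindex: keeps a duplicate (e.g., a printer-friendly version)
     out of the index while still letting its links be crawled -->
<meta name="robots" content="noindex, follow" />

<!-- rel="alternate": declares legitimate alternate versions of a page,
     such as a dedicated mobile URL or a translated equivalent -->
<link rel="alternate" media="only screen and (max-width: 640px)" href="http://m.example.com/widgets/" />
<link rel="alternate" hreflang="es" href="http://www.example.com/es/widgets/" />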

When it comes to duplicate content off-page, webmasters used to be wary of how it might impact their site’s rankings, particularly in instances where syndicated or scraped content is out of the content creator’s control. While having the canonical source of a piece of content outranked by scrapers or syndicators is certainly not ideal, even here there exists the ability to alert Google to the problem.
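And for syndication arrangements you do control, Google supports a cross-domain rel="canonical" that lets the republisher point back at the original. A minimal sketch, with hypothetical domains:

<!-- placed in the <head> of the syndicated copy on partner-site.com,
     crediting the original article as the canonical source -->
<link rel="canonical" href="http://www.original-publisher.com/my-article/" />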

What to Watch For

The antiquated belief that all instances of duplicate content were dangerous led webmasters to seek out scalable solutions for creating unique value (an oxymoronic pursuit), resulting in the “thin” content phenomenon that led to Google’s Panda algorithm. This is a major area of concern today, as Google is actively seeking out crawling efficiencies in their ever-growing index. If your site has automated the creation of “unique” content to any extent (e.g., dedicated pages and metadata for blue widgets, red widgets, round widgets, etc. vs. one good page on all widgets), you’re playing a dangerous game.

SEO Tenet

I need all high-volume keyword phrases represented in my landing page title tag.

2014 Reality

Note: This tenet is based on the inference that Google still weighs title tag text quite heavily as a ranking signal. I’m not here to dispute that by any means. I firmly believe it to be true, if only because it’s so closely tied to user experience that those who attempt to over-optimize typically fail to convert and eventually play within the best-practice guidelines. That said, there are those who are still using a 2005 approach to title tag creation and conventions. This should stop.

There’s a school of thought that you should create title tags based on AdWords data or, back when it was available to us, Google Analytics keyword referral data. This tends to create fairly clunky user experiences that populate critical title tag real estate with redundant and awkward phrasings. We all know what this looks like. It was actually effective at one time, but like many things, Google’s interpretation of your title tag has evolved, and the most frustrated SEOs are those who haven’t even bothered to see what’s working anymore.

Title tags should account for two major things:

1.) Topics. Relevance signals for a number of synonymous phrases can be satisfied by the presence of a single representative phrase or topic. How do we know? Because Google tells you when they’re roughly equating one phrase to another by way of highlighting in the SERPs. In a search for “antique clothing,” for example, Google chose not to highlight that phrase in the first result, but rather “Vintage Clothing,” indicating that the presence of both “antique” and “vintage” in the title tag of the ranking page is superfluous.

Certainly one could debate the valuable subtleties between “vintage” and “antique” and defend the usage of both, but this instance illustrates Google’s more elegant understanding of phrases and topic umbrellas in 2014. More than ever, particularly with the recent changes in SERP layout further shrinking the visual real estate available, it’s important to consolidate your phrases and topics so that they read cleanly and consistently.

2.) Brands. Frustrated site owners and webmasters will often complain about Google not displaying the title tag they declare in their source code metadata. It’s a rude reminder that you’re essentially at big G’s mercy if they don’t feel the title tag you’re providing best describes the page. However, if we had to isolate a single variable for which Google chooses to re-write a title tag, it would be failing to include the brand name in some capacity. In fact, Google’s Matt Cutts recently noted that their testing showed users prefer to know, somewhere in the title tag, what site or brand a link is pointing at.

We see this playing out in the SERPs more than ever, with Google making a conscious effort to strip the title tag down to its core topic and serve the brand. As you can see below, even the folks at Major League Baseball are failing to have their keyword-stuffed title tag displayed for a Google search for “red sox jerseys.”

[Screenshot: the Google SERP result for “red sox jerseys,” showing the shortened title Google displays for the MLB.com Shop page]

You can see the difference between what is displayed and what they actually try to declare in the source code…

<title>Boston Red Sox Jerseys – Buy Red Sox Baseball Jersey, Red Sox Uniforms at MLB.com Shop</title>
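For contrast, a consolidated, brand-forward tag (a hypothetical rewrite on our part, not anything MLB actually uses) might look something like this:

<title>Red Sox Jerseys &amp; Uniforms | MLB.com Shop</title>

One clean topic, one brand, and nothing for Google to feel compelled to strip out.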

What to Watch For

If Google is re-writing your title tag, keep an eye on what they’re actually providing to users. Just because your tag is being rewritten doesn’t necessarily mean it’s manipulative or unhelpful. In fact, Google seems to alter the title tag for a given page depending on the query. Don’t feel handcuffed to 55 characters if the phrases you’re including each have unique value and serve different varieties of search intent.

SEO Tenet

Use nofollow links when linking outside your domain to preserve the link value of your pages.

2014 Reality

This methodology for trapping link “juice” is long past its life cycle as a practical on-page tactic. While some consideration should certainly be given to the number of links you feature on a page, both for usability and practicality purposes, Google has effectively eliminated your ability to sculpt PageRank through the use of nofollow links. Even so, I still encounter clients who are wary of linking out from their own websites.

Why should site owners discontinue this practice? The first reason is that it doesn’t actually work anymore. A few years back, Google unveiled a new way of handling nofollow links that divides a page’s available PageRank by the total number of links on the page, whether they are followed or not. By doing so, they stifled overzealous nofollowers’ ability to sculpt certain links on a page to flow more PageRank than others.
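A simplified illustration of that change, using hypothetical URLs and round numbers:

<!-- A page with 10 outbound links, 5 of them nofollowed.
     Old model: distributable PageRank was split among the 5 followed
     links only, so each received 1/5 of it.
     Current model: PageRank is split among all 10 links; each followed
     link receives 1/10, and the 5 nofollowed shares simply evaporate. -->
<a href="http://example.com/resource">Followed link: passes 1/10, not 1/5</a>
<a href="http://example.com/untrusted-comment" rel="nofollow">Nofollowed link: its 1/10 passes nowhere</a>

Nofollowing, in other words, no longer concentrates PageRank in the links you leave followed; it simply throws that share away.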

Another reason to reconsider this practice is that linking out to other authoritative websites can benefit your own site’s authority. By all reasonable calculations of how modern link analysis and the PageRank algorithm are actually constructed today, Google values websites that link to other authoritative websites. While we may not know the precise weight of these authoritative outbound links within Google’s assessment of a domain as a whole, it’s telling that nearly all PageRank proxies developed by third-party tool providers (including Majestic SEO’s Trust Flow and Moz’s Domain Authority, which model their scores as predictive ranking metrics) include evaluation of a site’s outbound links.

What To Watch For

It would be an oversimplified inference to assume that all links on a page pass the same type and amount of PageRank, despite how Google treats nofollow links. In fact, Google’s link analysis is much more elegant in its ability to assign different levels of value to certain links over others. For instance, the antiquated practice of stuffing keyword-rich anchor text links into the footer or site navigation is no longer effective for passing anchor text value or interlinking within a particular site.

Instead, careful attention should be paid to links occurring within the body content of a page and wrapped in context, which are much more valuable from a semantic relevance perspective. Unfortunately, like all valuable SEO practices, over-optimization ran amok and turned perfectly good web pages into link vomit, devaluing the page content in the process.

So how does Google sort through the noise on this? A Google patent dissected by the great Bill Slawski shows us that Google is more than capable of determining the relevance of phrases within proximity of each other, and can apply that to links on a page. Beyond the anchor text of the link itself, what do the phrases in proximity of the link tell us about what that link is pointing to?

The example below is a traditional, over-optimized approach to interlinking within a given paragraph:

ShoeStore.com is the best place to buy shoes online. I’ve had trouble finding stiletto high heels in a color that matches my dresses in the past, but their color-match guarantee solves this problem. Beyond just dress shoes, they’ve got a great selection of Rainbow flip flops for the beach and some of the best prices on running shoes I’ve found.

Because it attempts to pass anchor text value for a number of different phrases, the example above reads awkwardly due to its excessive interlinking. The good news is that, thanks to Google’s ability to interpret the context of your interlinks, you can still inherit proximity-based relevance while limiting your interlinking to a single phrase. Below is a more practical and user-friendly model of how Google can assign value to links within context, based on the phrases in their proximity.

[Example: the same paragraph, with only the branded phrase “ShoeStore.com” linked and the surrounding descriptive phrases left unlinked]

The branded link in the example above inherits and passes relevance from all the surrounding phrases, even though those phrases are unlinked. The resulting paragraph reads more cleanly and limits the link clutter a search engine crawler has to decipher.
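In markup, the contrast might look something like this (ShoeStore.com being this post’s hypothetical retailer):

<!-- Over-optimized: every phrase fights to pass its own anchor text -->
<p><a href="http://www.shoestore.com/">ShoeStore.com</a> is the best place to
<a href="http://www.shoestore.com/">buy shoes online</a>. I’ve had trouble finding
<a href="http://www.shoestore.com/heels">stiletto high heels</a> in a color that
matches my dresses, but their color-match guarantee solves this problem.</p>

<!-- Context-based: a single branded link surrounded by unlinked but
     semantically related phrases, which lend it proximity relevance -->
<p><a href="http://www.shoestore.com/">ShoeStore.com</a> is the best place to buy
shoes online. I’ve had trouble finding stiletto high heels in a color that
matches my dresses, but their color-match guarantee solves this problem.</p>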

Some Closing Thoughts

There’s a belief in football that, upon careful examination, you can locate a penalty on nearly every single play. And yet, it seems only the most egregious or obvious get called. In most cases these minor infractions have little impact on the outcome of the game. Like any good football referee, Google is getting better every day at sorting through relatively minor diversions from traditional SEO best practices and rewarding compelling websites with great rankings.

If there’s one conclusion we can arrive at, it’s that while auditing is a critical process for any effective search marketing campaign, like SEO in general, it no longer has quite the beginning and end it once did. It is an ongoing, iterative process of evaluating what you’re doing against what Google appears to be rewarding. Don’t be afraid to adapt to what you’re seeing vs. what you’ve been told.
