2015-08-31

Want to discover something about your website that will absolutely shock you? All you have to do is conduct an SEO audit on your website.

An “SEO audit,” in spite of its dry-and-boring-sounding name, is one of the most jaw-dropping, eye-popping, mind-blowing experiences that could happen in the life of your business.

Why? Because an SEO audit uncovers dozens of unexpected, under-the-hood, hidden, or nefarious things that could be going on in your website. Without an SEO audit, you’ll never discover these things. With an SEO audit, you’ll find out exactly what’s making your website work…or not work, as the case may be.

Here at Volume Nine, we’ve conducted hundreds of SEO audits. These audits are intense, thorough, detailed, and complex. We leave no stone unturned, no metadata unexamined, and no title tag uncrawled. Basically, we take a website, turn it upside down, shake it, open it up, take it apart, and find out precisely what is affecting the site’s performance. We look at the site from every conceivable angle.

Needless to say, we’ve learned a thing or two. In this article, I’d like to tell you what exactly we’ve learned.

Also, we’ve put together a free, comprehensive SEO Audit checklist – scroll to the bottom of this post and fill in the form to receive your complimentary copy!

What You’re About to Read

The information in this article may shock or disturb you. Regardless, I recommend that every business with a website get an SEO audit.

Based on our experience of conducting hundreds of SEO audits, we’ve assembled a list of what most websites do right and what most websites do wrong. In this article, I’m going to explain, first, how we do SEO audits so you understand the landscape for the remainder of the article. Next, I’ll start with the good news: what most websites do right. The majority of this article, however, is a detailed examination of what most websites do wrong.

I’m not a glass-half-empty kind of guy, but I think that dwelling on the negatives is, in this case, the best way to make positive changes. I’ve focused this article on the six major shortcomings so you can analyze your site with cold objectivity and make the hard changes that will drive real improvement.

Many of the websites we’ve audited have learned their SEO the hard way. Hopefully, this article will allow you to learn the easy way — through the example of hundreds of websites that we’ve already audited.

How We Do SEO Audits

The team here at Volume Nine has developed an intensive process for conducting an SEO Audit. Every audit covers all of the SEO Best Practices for a website.

What kind of websites? Every kind. Our audit clients include some of the world’s largest brands, huge ecommerce sites, tiny ecommerce sites, SaaS sites, mom & pops, B2Bs, B2Cs, blogs, and just about every legitimate industry and niche on the web.

Major Areas of Investigation

Here is a quick review of some of the things that we include in our SEO Best Practices Review. The major areas include technical SEO, on-site content, and off-site link and brand signals.




Scoring Protocol

In the technical review, we look at over 50 different SEO elements and mark the website as Passing (Green), Passing with Caution (Yellow), or Failing (Red) for each element.



Major Issues

Although we look at literally thousands of individual items in a website, we usually surface three or four major issues for each one.

For example, a client may think that they have received a Panda-, Penguin-, or even Pigeon-based penalty, so we focus our attention on that issue. In other instances, the site may have recently launched a new CMS or redesigned their website and suffered a traffic loss. In each of these cases, we dive into the root issue behind such symptoms to accurately diagnose the problem and propose a solution. We insist on examining every facet of a website to make sure that nothing critical or easy-to-fix is overlooked.

Technical SEO . . . and beyond

Unlike other SEO Audits, our analyses do not stop with technical items. We also evaluate things like content quality, brand mentions, link profile, and social signals. Content quality and human readability are soft algorithmic factors that affect a site’s ability to rank. Brand mentions are part of the rising trend of co-citation and co-occurrence that influence search rank. A link profile comprises the largest set of offsite ranking factors. Social signals, and their impact on brand perception and presence, are also a critical feature of searchability and non-search-engine findability. The more poking we do, the more we discover, and the better our audit will be.

Tools We Use

The tools we use during an audit include proprietary software that we have developed in-house.

We also make use of publicly available SaaS products and tools such as Screaming Frog, Google Webmaster Tools, Google Page Speed, and Google Analytics Algorithm Overlay.



What Websites Get Right

Clients that come to us for an SEO Audit have usually done a lot of the SEO basics right. They come to us because they’re looking for more specialized help.

We see our highest rate of passing marks for on-site optimization. Here are the percentages of websites that earn passing scores in each of these areas:

Keyword Representation – 92% Passing

Meta Description – 87% Passing

H1 Tags – 81% Passing

Page Titles – 78% Passing

Image Alt Tags – 75% Passing

In most of our SEO Audits, we find that good target keywords are represented throughout the body text in an appropriate way, without keyword stuffing or spammy usage. We also find that most webmasters use a single H1 tag to support a keyword. Most sites also include image alt tags where appropriate.

Some Caveats

In spite of these successes, there are a few common mistakes we see. One of the big ones is an overuse of H1 tags. We really want to see only one H1 tag per page. In addition, we look for the appropriate use of H2 and maybe H3 tags for really long content. Short headlines in the body copy with a <strong> or <b> around them often work just as well as an H3.

When we see a lot of H2 and H3 tags, it’s a signal to us to look for over-optimization of image alt tags and footer copy. We also start examining other SEO spam techniques like thin-content landing pages and exact-match anchors.
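If you want to spot-check heading usage on your own templates, a small script can count the tags for you. Here’s a minimal sketch in Python, assuming the requests library is installed and using a placeholder URL:

from html.parser import HTMLParser
import requests

class HeadingCounter(HTMLParser):
    """Counts h1/h2/h3 tags so pages with more than one H1 stand out."""
    def __init__(self):
        super().__init__()
        self.counts = {"h1": 0, "h2": 0, "h3": 0}

    def handle_starttag(self, tag, attrs):
        if tag in self.counts:
            self.counts[tag] += 1

url = "http://www.yourdomain.com/some-page/"  # placeholder: swap in a real page
parser = HeadingCounter()
parser.feed(requests.get(url, timeout=30).text)

print(parser.counts)
if parser.counts["h1"] != 1:
    print("-> Expected exactly one H1 on this page, found", parser.counts["h1"])

Running it against a handful of representative templates (home page, category page, blog post) is usually enough, since heading problems tend to be baked into the template rather than individual pages.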

The Importance of Page Titles

The single most important onsite SEO element is the page title. Page Title Optimization is so important to us that we have built an entire tool for it into our SEO Dashboard. Even though a website might receive an overall passing grade for Page Title Best Practices, additional page title work is almost guaranteed to help the site’s overall SEO.

Our SEO Dashboard includes a scoring snapshot of page title performance, and every Volume Nine client has full access to this information.

As stated earlier, most audit clients use page titles successfully — 78% to be exact. But if a webmaster gets copy, meta descriptions, and H1 tags right, why do only 78% of Page Titles get a passing grade?

The explanation is simple. In the page title, webmasters and SEOs are determined to use every possible character afforded them to get a little more lift in their rankings. It is much easier to throw a few more keywords into an existing page title than to produce a new, engaging, and different page of content around that additional keyword.

And why not? As Positionly.com stated in their Top Google Ranking Factors, “The title meta tag is one of the strongest relevancy signals for a search engine.” Both beginners and advanced SEOs know that managing page titles is high on their list for managing SEO campaigns.

We include keyword mapping, Page Title reviews, and Page Title Optimization in every Website SEO Audit that we perform, for a specific number of pages. We also review how we approached the process for the website so that the webmaster can decide whether she wants to complete the job or have us do more page title reviews and optimization.
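If you want a rough, do-it-yourself pass before a full review, a short script can flag the most common page title problems: missing titles, duplicate titles, and titles long enough to suggest keyword stuffing or SERP truncation. This is only a sketch in Python using the requests library; the URLs are placeholders, and the 60-character threshold is an illustrative assumption, not an official limit:

import re
import requests

# Placeholder URLs: swap in a representative sample of your own pages.
PAGES = [
    "http://www.yourdomain.com/",
    "http://www.yourdomain.com/category/product-name.php",
]

# Roughly 60 characters is a commonly cited SERP display limit; treat this
# threshold as an illustrative assumption rather than an official rule.
MAX_TITLE_LENGTH = 60

seen = {}
for url in PAGES:
    html = requests.get(url, timeout=30).text
    match = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    title = match.group(1).strip() if match else ""

    if not title:
        print(url, "-> missing page title")
    elif len(title) > MAX_TITLE_LENGTH:
        print(url, "-> %d-character title; it may be stuffed or truncated" % len(title))

    if title and title in seen:
        print(url, "-> duplicate title, also used on", seen[title])
    else:
        seen[title] = url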

What Most Websites Get Wrong

Now that we’ve covered the positives, it’s time to dive into some of the not-so-positives. What follows is the list of areas where most websites fail miserably.

Websites and clients still struggle with technical SEO and how to configure their websites. We see our highest rate of failure in this category.

Let me show you exactly what kind of mistakes come up all the time.

1. Page Speed – Failed in 61% of SEO Audits

If you could pick between Fast, Faster, and Fastest for page speed, I’m sure you would pick Fastest. Google agrees. They want to Make the Web Faster.

While we cannot measure actual download times for everyone’s browser, connection, and device, we can measure how long it takes to load the necessary HTML to begin rendering the page from a specific URL. There may be variance from one run to the next, but the differences should not be too large. In fact, highly variable server response time may indicate an underlying performance issue.
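You can get a rough feel for this yourself by timing the HTML response over several runs. Here’s a minimal sketch in Python using the requests library; the URL is a placeholder, and the library’s elapsed timer is only an approximation of server response time:

import statistics
import requests

URL = "http://www.yourdomain.com/"  # placeholder: swap in the page you want to test
RUNS = 5

# requests' elapsed timer covers the span from sending the request until the
# response headers arrive, which is a rough proxy for server response time.
timings = []
for _ in range(RUNS):
    response = requests.get(URL, timeout=30)
    timings.append(response.elapsed.total_seconds())

print("runs:   ", ["%.3fs" % t for t in timings])
print("average: %.3fs" % statistics.mean(timings))
print("spread:  %.3fs (standard deviation)" % statistics.stdev(timings))
# A large spread between runs can point to an underlying server performance issue.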

To help you make your website faster, Google has built a tool called Page Speed Insights.

Insights provides actionable information that will allow your pages to load faster on all devices. Google will even host your website and help you make it faster with their Page Speed Service. (Note: As of January 2014, Google’s Page Speed Service was still in a limited field trial phase and not accepting new signups.)

When you run any of the popular page speed tools online, you will get a list of recommendations on how to improve your page speed. No matter how fast or slow your web server and pages perform, there are many ways to improve the actual page speed. Two of the more frequent recommendations are 1) setting browser caching to cache things like images, and 2) using GZIP compression to send smaller files to the visitor’s browser.
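You can check both of those recommendations yourself. The sketch below, again in Python with the requests library and a placeholder URL, simply inspects the response headers; in a real audit you would also check static assets like images, CSS, and JavaScript the same way:

import requests

URL = "http://www.yourdomain.com/"  # placeholder: swap in your own page

# requests asks for gzip by default, so the Content-Encoding header shows
# whether the server actually compressed the response.
response = requests.get(URL, timeout=30)

encoding = response.headers.get("Content-Encoding", "none")
cache_control = response.headers.get("Cache-Control", "not set")
expires = response.headers.get("Expires", "not set")

print("Content-Encoding:", encoding)       # look for "gzip"
print("Cache-Control:   ", cache_control)  # look for a max-age directive
print("Expires:         ", expires)

if "gzip" not in encoding:
    print("-> GZIP compression does not appear to be enabled for this URL.")
if cache_control == "not set" and expires == "not set":
    print("-> No browser caching headers were found for this URL.")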

Why all this fuss over page speed? This is a great question with two simple answers. First, Google’s algorithm favors sites with faster speed. Second, users favor sites with faster speed. From an algorithmic perspective, the faster your site, the higher you rank. From a user perspective, lower bounce rates and higher dwell times also translate into higher rank. Plus, faster load times are positively correlated with higher conversion rates.

I would recommend checking out this blog post by KISSmetrics on How Loading Time Affects Your Bottom Line! A quick review of the article will remind you that visitors tend to care more about speed than about all the bells and whistles we add to our websites.

For such a critical issue, it’s almost tragic that so few sites get it right. But, hey, that’s why we do audits, right?

2. Blog Location – Failed in 62% of SEO Audits

Standard SEO best practice is to have a blog located in a subdirectory on the domain instead of installing it on a subdomain.

Put simply, here’s the wrong way and the right way:

Wrong:  blog.example.com
Right:  www.example.com/blog

Placing the blog on the main domain instead of nestling it into a subdomain helps the blog benefit from the strength and authority of the domain, plus it allows the blog to build links for the primary domain.

Of course, it is a whole lot easier for developers to just install the blog on a subdomain. There is a long-lived myth that a subdomain will work just as well as a subdirectory. Our evidence shows the disastrous results of believing this myth. We strongly recommend that you work to put your blog into a subdirectory of your main domain.

Rand Fishkin discusses the choice between a subdomain and a subdirectory, saying, “I would still strongly urge folks to keep all content on a single subdomain. We recently were able to test this using a subdomain on Moz itself (when moving our beginner’s guide to SEO from guides.moz.com to the current URL http://moz.com/beginners-guide-to-seo). The results were astounding – rankings rose dramatically across the board for every keyword we tracked to the pages.”

If you can only install the blog on a subdomain for technical reasons, talk to your web team about setting up a Reverse Proxy to make it appear in a subdirectory for users and search engines.
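A quick way to see where a blog actually resolves is to request both versions and watch the redirects. Here’s a minimal sketch, assuming the Python requests library and a placeholder domain:

import requests

DOMAIN = "yourdomain.com"  # placeholder: swap in your own domain
candidates = ["http://blog." + DOMAIN + "/", "http://" + DOMAIN + "/blog/"]

for url in candidates:
    try:
        response = requests.get(url, timeout=30)
    except requests.RequestException:
        print(url, "-> not reachable")
        continue
    # response.url is the final URL after any redirects, so this shows whether
    # the blog really lives on a subdomain, in a subdirectory, or both.
    print(url, "->", response.status_code, "final URL:", response.url)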

3. Root Domain Configuration – Failed in 62% of SEO Audits

Standard SEO best practice for the root domain ensures that only one version of the root domain exists. You’d think that this would be an obvious quick fix.

Apparently not! I’m shocked at how many websites fail to ensure the following:

Use https://www.yourdomain.com or http://www.yourdomain.com but not both

Ensure that there aren’t extra versions of the home page competing with the root

A lot of websites are implementing SSL after Google mentioned that it will strengthen HTTPS as a ranking signal in the future. Of course we agree with this move, and think that marketers and developers should be working to secure their sites. But what’s often missed in this discussion is the critical URL change that impacts SEO. Switching from HTTP to HTTPS is technically a URL change, and it needs to be managed to preserve the website’s indexation and domain authority.

Once you have SSL figured out, you need to be sure that you have clearly decided whether to use www or non-www for your home page and all of your website’s URLs. Allowing both versions to exist can create duplicate content problems and unnecessarily divide link value across the two versions. Basically, you’re shooting your SEO in the foot.

In many cases, we also find other subdomains like wwww.yourdomain.com or dev.yourdomain.com or even bob.yourdomain.com. SEO truth is stranger than fiction.

It’s time to admit that you don’t need dev.yourdomain.com or bob.yourdomain.com anymore. They need to be redirected. Of course, there are reasons to have things like mail.yourdomain.com or help.yourdomain.com for particular business needs, but these subdomains do not serve the same home page as yourdomain.com.

A fun exercise for us during an SEO Audit is to see how many duplicate home pages we can find. Here are the top files that we look for:

yourdomain.com/index.html

yourdomain.com/index.htm

yourdomain.com/index.shtml

yourdomain.com/index.php

yourdomain.com/index.php5

yourdomain.com/index.php4

yourdomain.com/index.cgi

yourdomain.com/default.html

yourdomain.com/default.htm

yourdomain.com/home.html

yourdomain.com/home.htm

yourdomain.com/Index.html

yourdomain.com/Index.htm

yourdomain.com/Index.shtml

yourdomain.com/Index.php

yourdomain.com/Index.cgi

yourdomain.com/Default.html

yourdomain.com/Default.htm

yourdomain.com/Home.html

yourdomain.com/Home.htm

yourdomain.com/placeholder.html

yourdomain.com/test.html

You can do this test on your website. Simply replace “yourdomain.com” with your root domain name, and see what you come up with.
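If you’d rather automate it, a short script can request each variant and report any that return a page of their own instead of redirecting to the real home page. This is a sketch only, in Python with the requests library, a placeholder domain, and a subset of the file names above:

import requests

DOMAIN = "http://www.yourdomain.com"  # placeholder: swap in your root domain

# A subset of the file names listed above; the same approach works for checking
# www/non-www and HTTP/HTTPS variants of the home page.
SUFFIXES = ["/index.html", "/index.htm", "/index.php", "/default.html",
            "/default.htm", "/home.html", "/home.htm", "/Index.html",
            "/Default.html", "/Home.html"]

canonical_home = requests.get(DOMAIN, timeout=30).url  # where the root finally lands

for suffix in SUFFIXES:
    try:
        response = requests.get(DOMAIN + suffix, timeout=30)
    except requests.RequestException:
        continue
    # A duplicate exists when the variant answers 200 without redirecting
    # back to the canonical home page.
    if response.status_code == 200 and response.url != canonical_home:
        print("Possible duplicate home page:", DOMAIN + suffix)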

4. Canonical Tags – Failed in 65% of SEO Audits

Canonical tags are often misunderstood. There are two misguided views on canonical tags. First, some think that the rel=canonical tag should be indiscriminately placed everywhere to protect against possible duplicate content. Wrong. Second, some think that the rel=canonical is totally unnecessary, and they don’t bother with it at all. Wrong again.

The truth on rel=canonical, as with so many other issues, lies between the two extremes. The rel=canonical link tag signals to search engines that there is a preferred version of the given page that they should index, thus streamlining the indexation process, strengthening the original version, and preventing duplicate content errors.

The rel=canonical tag looks like this:

<link rel="canonical" href="http://www.example.com/url-a.html" />

That little snippet of code basically tells Google:  “Hi. The content you see on this page is actually the same thing as you’ll find over at this other page. Here’s the URL. Just sayin’! Thanks!”
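A quick way to see what a page is actually declaring is to pull that tag out and compare it to the URL you requested. The sketch below uses Python with the requests library and a simple regular expression; it assumes the href appears after rel="canonical" within the tag, so treat it as a rough check rather than a crawler:

import re
import requests

url = "http://www.example.com/url-a.html"  # placeholder: swap in a page to check
html = requests.get(url, timeout=30).text

# Rough pattern only; it assumes href appears after rel="canonical" in the tag.
match = re.search(
    r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
    html, re.IGNORECASE)

if match is None:
    print("No canonical tag found on", url)
elif match.group(1) == url:
    print("Canonical tag is self-referencing:", match.group(1))
else:
    print("Canonical tag points elsewhere:", match.group(1))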

That’s it. But from this simple canonical practice has issued forth an assemblage of errors:

Using the canonical tag for pagination

Adding a canonical tag to the original version

Creating canonicals with incorrect links

Using the canonical tag in the body tag

Avoiding the canonical tag cross domain

Using the canonical tag as a substitute for 301 redirects

Preemptively adding a canonical tag to an entire website

This bullet list is probably enough to give you canonical nightmares. I’m not going to pretend that the world of rel=canonical is simple and easy. It’s not. But it’s easier than some webmasters have conceived of it.

Which is probably why we’re seeing that bone-chilling 65% violation rate.

5. URL Length – Failed in 66% of SEO Audits

Our SEO Best Practice around URL structure focuses on the overall length of the URL. We find that unstructured URLs use complicated (often dynamic) query strings or try to stuff in a few keywords. This is a bad idea. Well-structured URLs, by contrast, tend to be short and focused.

Shorter URLs are easier to share, remember, link to, and type in. Our evidence informs us that keeping URLs below 74 characters is the best approach. This way, the entire URL displays in the SERPs without truncation.

Here’s an example of a bad URL and a good URL. The difference is easy to tell at a glance:

Bad URL: http://www.yourdomain.com/product.php?pid=123&color=4&size=3&session=3
Good URL: http://www.yourdomain.com/category/product-name.php
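Checking the 74-character guideline across a site is easy to automate. Here’s a minimal Python sketch that assumes you’ve exported your URLs to a plain text file, one URL per line (the urls.txt filename is just a placeholder):

MAX_LENGTH = 74  # keeps the whole URL visible in the SERPs without truncation

# urls.txt is a placeholder: a plain text file with one URL per line,
# for example exported from a crawler such as Screaming Frog.
with open("urls.txt") as handle:
    for line in handle:
        url = line.strip()
        if url and len(url) > MAX_LENGTH:
            print("%d characters: %s" % (len(url), url))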

We are often asked how many directories you should use. The answer is to keep it to one category directory. For large sites with a deep hierarchy, we still recommend keeping the directory depth to one, or maybe two at the very most.

If you want to go deeper than 2 directories, read more about How Google Might Classify Pages Using Hierarchical Categories.

You should not rely on URLs to create site hierarchy. Use silos within your actual content to organize the website content into grouped themes. The first article I read on this topic was by Bruce Clay, and it’s still worth a careful review. Here is a more recent blog post on Site Architecture for SEO that I found on Moz.

Another tip straight from Google is to consider using punctuation in your URLs, and using hyphens instead of underscores.

Although a lot of sites have a screwed-up URL structure, it’s very dangerous to indiscriminately change URLs on a site. Why? Switching URLs is a risky maneuver. Once you change a URL, that URL, along with all the links and references pointing to it, is gone. Maybe you’ve created a shiny and perfectly optimized new URL, but you’ve just compromised the very page that you were trying to fix!

The solution, of course, is to implement a perfectly executed 301 redirect strategy, right? Yes, but disclaimers are needed. Even if you have a perfect 301 strategy, massive URL changes can wreak crawling havoc, de-index entire sections of the site (sometimes temporarily), and crush your traffic.

Before you tweak a single URL, hire an expert SEO to provide careful guidance and review. The best solution is prevention. From the start, work hard to make all your URLs as close to perfect as possible.

6. 3rd Party Codes – Failed in 69% of SEO Audits

Third party code installations, usually JavaScript, can negatively impact site performance by increasing page load times. Remember point 1 above? Sluggish load times? This is what it comes down to.

What kind of 3rd party code could be crippling your website’s performance? It’s more common than you might think: analytics tracking, Twitter widgets, or even a Facebook Like button. A sampling of over 295,000 unique hosts found that 90% retrieved a remote resource.

There’s no need to excise every single piece of 3rd party code from your website. Some of it is necessary for user experience, performance, and tracking. If you look carefully, however, you’ll probably find a few third party tracking codes that you don’t need. These should be evaluated and possibly removed if they are reducing site performance. Understanding JavaScript code risk and controlling 3rd party website content is an important piece of managing your website optimization.

For any 3rd party code that you have on your website, make sure you move your JavaScript near the footer of the site, rather than putting it at the top of the page. Google states that “deferring loading of JavaScript functions that are not called at startup reduces the initial download size, allowing other resources to be downloaded in parallel, and speeding up execution and rendering time” and that “in order to load a page, the browser must parse the contents of all script tags, which adds additional time to the page load.”

More JavaScript at the top of the page prevents the remainder of the page from loading, unless you are employing asynchronous scripts, as recommended by Google. Asynchronous scripts come with their own risks. Don’t assume that you can load up a page with async scripts and expect page load time to stay low. Why? Because async scripts still block the page’s onload event. The solution is to both use asynchronous scripts and place them at the bottom of your pages. Here are some of the most common 3rd party scripts, several of which load asynchronously by default (a quick check for blocking scripts is sketched after the list):

Disqus (disqus.com/count.js, disqus.com/embed.js) – async by default

Facebook (connect.facebook.net/…/all.js) – async by default

Google AdSense (pagead2.googlesyndication.com/pagead/show_ads.js)

Google Analytics (google-analytics.com/ga.js) – async by default

Google DFP GPT (www.googletagservices.com/tag/js/gpt.js)

Google Plus (apis.google.com/js/plusone.js)

Pinterest (assets.pinterest.com/js/pinit.js)

Shareaholic – async by default

ShareThis (w.sharethis.com/button/buttons.js)
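Here’s the quick check promised above: a rough Python sketch, using the requests library and a placeholder URL, that lists external scripts in the <head> carrying neither async nor defer. A regular expression is no substitute for a real HTML parser or your browser’s developer tools, but it will surface the obvious offenders:

import re
import requests

url = "http://www.yourdomain.com/"  # placeholder: swap in a page to check
html = requests.get(url, timeout=30).text

# Only look inside <head>, where a blocking script hurts the most.
head = html.split("</head>", 1)[0]

# External script tags in the head that carry neither async nor defer.
for tag in re.findall(r"<script\b[^>]*>", head, re.IGNORECASE):
    lowered = tag.lower()
    if "src=" in lowered and "async" not in lowered and "defer" not in lowered:
        print("Possible blocking script in <head>:", tag)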

Conclusion:  Lessons Learned from 100+ SEO Audits

If you’ve made it to the end of this article, you’ve tasted an SEO audit in all its bitter glory.

What should you do now? Throw up your hands in despair? Work 90-hour weeks? Fire your developer? Go play a round of golf?

All of those are appropriate, depending on the circumstance. Before you reach for your clubs, let’s add a bit of rationality. If our statistics, collected over the course of hundreds of audits, are any indication of a trend, then there’s a good chance that your website is in violation of some SEO best practices.

By making the necessary improvements, you can successfully improve your traffic, boost conversions, and, yes, increase revenue.

Source: https://www.v9seo.com/blog/2015/02/04/6-biggest-website-fails-100-seo-audits/
