2015-11-27



Google Search Console (formerly Webmaster Tools) is Google’s suite of tools, data & diagnostics to “help you have a healthy, Google-friendly site.”

It’s the only place to get search engine optimization information about your website directly from Google.

Side note – Bing has a separate but similar tool suite at Bing Webmaster Tools.

Google Search Console (Webmaster Tools) is free. Any website can use it. But simply setting it up will not improve your SEO or your organic traffic. It’s a toolset – which means you have to understand how to use Google Search Console effectively to make any difference on your website.

That said, it can be a bit daunting to understand. This tutorial will go through what each feature is, what you should be using it for, and some ideas on being creative with it.

Getting Started

To get started – you need a website of course. You’ll need to link your site to Search Console then take care of a few settings.

Verification



There are plenty of ways to verify your Search Console account. I prefer verifying via Google Analytics since it reduces the number of files / tags to maintain.

If you are using WordPress, the Yoast SEO plugin makes it simple. Though keep in mind that you need to keep that plugin active to maintain the verification.



When you are verifying your account, remember that Search Console treats all subdomains and protocols as different properties.

That means that any change from HTTP to HTTPS represents a different website. Any change from a WWW subdomain to no subdomain is different. Your data will be wrong if the property that you have verified with Google Search Console differs from the website that Google is serving in the search results (SERPs).

Understanding Settings

Search Console Preferences

This section is self-explanatory. Decide on what language you want your Console to be in and choose your email alert frequency.

I always choose to receive All Issues just to make sure I know of any and all issues as soon as they happen.

Site Settings

Here you can set a preferred domain – either the WWW or the non-WWW version of your site. This setting helps Google determine a single, “canonical” subdomain for your website. A “canonical” subdomain ensures that your website is not competing against itself in the SERPs.

Rather than rely on this setting, it’s better to make sure you have a permanent 301 redirect in place to your canonical subdomain. That redirect ensures that all visitors and bots are shown the correct version. I use this Search Console setting to complement the 301. It’s also helpful if you are having trouble getting a 301 redirect implemented.
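If you want to spot-check those redirects yourself, here is a minimal Python sketch. It assumes the third-party requests library and uses www.example.com as a placeholder canonical version; it simply confirms that each non-canonical variant answers with a single 301 pointing at the canonical URL.

```python
import requests

# Placeholder canonical version of the site - swap in your own domain.
CANONICAL = "https://www.example.com/"

# The other protocol/subdomain combinations that should 301 to the canonical.
VARIANTS = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
]

for url in VARIANTS:
    # Don't follow redirects, so we can inspect the first response directly.
    response = requests.get(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(no Location header)")
    ok = response.status_code == 301 and location == CANONICAL
    print(f"{'OK' if ok else 'CHECK'}: {url} -> {response.status_code} {location}")
```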

Next you can set the crawl rate for Googlebot. This determines how much of your website Googlebot crawls every day. Google’s goal is to crawl as much of your site as possible without overwhelming your server. They have sophisticated algorithms to make sure this happens. But, if you think your server is getting hit too often, you can limit Google’s crawl here…or look into upgrading your servers.

Side note – Do not think that trying to increase your crawl rate will increase your rankings. It won’t.

Change of Address

If you are moving to a new domain, subdomain, or protocol, you can use this tool to complement the redirects that you should put in place. Just have both the old and new versions verified as properties and use the tool to tell Google about the move.

Google Analytics Property

Here you can link Google Search Console to Google Analytics. You should do that, so that you can quickly access Search Console data directly in Google Analytics.

To link accounts, go to the Admin section of Google Analytics. Click on Property Settings. Scroll down to Search Console. Follow the prompts.


Users & Property Owners

Add & manage all the people that can access your Search Console. They have to be a Google Account (ie, Gmail or Google Apps).

Verification Details

Manage & maintain your verification status here.

Associates

You can associate various accounts with your Search Console. The most common is YouTube, but other apps may show up here as well. There’s nothing specifically to do except monitor.

Search Appearance

Before looking at these elements, don’t overlook Google’s handy pop-up glossary. It outlines the different SERP elements along with how to influence each one.

Structured Data

This section provides errors and help with structured data on your website. Not all websites have Structured Data. Structured Data covers things like name, address, phone number, price, product name, event name, etc. It can be implemented through various markups to help search engines parse the data. The most common markup vocabulary is Schema.org.
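As a concrete illustration, structured data is often added as a JSON-LD script block in a page’s HTML. Here is a minimal Python sketch that builds one for a hypothetical local business using the standard json module; the business details are made up for the example.

```python
import json

# Hypothetical business details - replace with your own.
structured_data = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Acme Plumbing Services",
    "telephone": "+1-404-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Atlanta",
        "addressRegion": "GA",
    },
}

# Embed the markup in your page's <head> as a JSON-LD script block.
print('<script type="application/ld+json">')
print(json.dumps(structured_data, indent=2))
print("</script>")
```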

Data Highlighter

If your website provides structured data, use the Data Highlighter when you don’t have a way to efficiently implement structured markup.

It’s straightforward to use. You load up sample pages and tell Google if its guesses are correct or not.

HTML Improvements

This section tells you about the two page elements that show up directly in search – page titles and meta descriptions. Google will tell you which ones need improvement.

You can also use this tool to identify related issues such as duplicate content. Be sure to click the links to each problem to get details, then work to fix them.

Sitelinks

For some search queries (usually brand terms), Google will show sub-pages directly in the search results. You cannot choose which pages Google shows, but you can tell it not to include certain pages.

Use this tool to demote a page (such as an admin page, low-content page, etc) from the SERPs.

Search Traffic

The Search Traffic section is the most relevant day-to-day section of Search Console. It’s where you’ll get the most useful data for optimizing for search & increasing organic traffic.

Search Analytics

Search Analytics is a recent addition to Search Console. It replaced the old (and much derided) “search queries” report. Search Analytics will tell you a lot of useful data about how your website performs in Google Search. Before we break down how to manipulate and use the data, there are a few definitions to look at – directly from Google. Google has an expanded breakdown on definitions here.

Queries – The keywords that users searched for in Google Search.

Clicks – Count of clicks from a Google search results page that landed the user on your property. Note that clicks do not equal organic sessions in Google Analytics.

Impressions – How many links to your site a user saw on Google search results, even if the link was not scrolled into view. However, if a user views only page 1 and the link is on page 2, the impression is not counted.

CTR – Click-through rate: the click count divided by the impression count. If a row of data has no impressions, the CTR will be shown as a dash (-) because CTR would be division by zero.

Position – The average position of the topmost result from your site. So, for example, if your site has three results at positions 2, 4, and 6, the position is reported as 2. If a second query returned results at positions 3, 5, and 9, your average position would be (2 + 3)/2 = 2.5. If a row of data has no impressions, the position will be shown as a dash (-), because the position doesn’t exist.
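To make the CTR and Position definitions concrete, here is a small Python sketch that reproduces the arithmetic above; the click and impression numbers are made up for the example.

```python
# Worked example of the CTR and Position definitions above (numbers are made up).

def ctr(clicks, impressions):
    """Click-through rate; undefined (shown as '-') when there are no impressions."""
    return clicks / impressions if impressions else None

# Topmost position of your site for each query, per the example in the text:
# query 1 returned results at 2, 4, 6 (topmost = 2); query 2 at 3, 5, 9 (topmost = 3).
topmost_positions = [2, 3]
average_position = sum(topmost_positions) / len(topmost_positions)

print(ctr(clicks=30, impressions=1000))   # 0.03, ie a 3% CTR
print(average_position)                   # 2.5, matching the example above
```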

To effectively use the Search Console report, you need to change the groupings to find data that you are looking for. Remember that you can change groupings after you apply a filter (ie, you can look at Queries after you have filtered for a page).

Here are my 2 favorite pieces of data to pull.

Diagnose Why A Page Is Losing Traffic

Check all metrics boxes

Filter by page, select a date range

Click to Queries, Countries, Devices looking for a culprit

Look For New / Revised Content Ideas

Check all metrics boxes

Filter by page

Click to Queries

Sort by Impressions

Look for queries that are not directly related to the page, but where the page is still ranking

Use this data to either revise the content to address that query OR create a new page targeting that query

Here’s a brief video showing how I think through the different tabs.

Links To Your Site

Google understands the Web via links. They are still the primary factor in Google’s algorithm. This section helps you understand who links to you, what content gets linked most, and what anchor text other sites use to link to you.

There are three items to remember when looking at this section.

First, Google doesn’t give you all its link data. Most professional SEOs & site owners with a budget will use a tool like Ahrefs to pull more useful data.

Second, there’s no way to know how Google uses this data for any given query. Don’t get too focused on any single link or string of anchor text. Instead, use it for big picture marketing ideas & problem diagnosis.

Third, there is more data in this section than you’d expect. The key is to just keep clicking to find out more.

Here’s what you should do with Links To Your Site.

First, use it to understand what type of content gets links. You can use that data to do more of what is working.

Second, use it to understand what types of sites link to you. Use that data to find similar sites for content promotion. You can also use it to understand just how much of the web is spam.

Third, look at your anchor text to make sure it’s telling generally the right “story.” Look for spam terms that will let you know if your site has been hacked.

Internal Links

This section lets you understand the links within your site, and how Google is crawling your site. This will differ from a crawl by Screaming Frog, since this shows exactly how the Googlebot has been crawling your site.

You should be using this report to look for mostly one thing – outliers.

Sort the list by most links & by fewest links. See if there are pages that are linked to much more than others. See if there are pages that should be linked to more often…but aren’t.

Pages getting crawled more does not equal higher rankings. But, links do pass critical information to Googlebot through both anchor text and link context.

If you have underperforming content, it might be underperforming because your internal links aren’t painting the right picture for Googlebot. This issue is common in blogs where old content receives more internal links simply because it has been around for longer (not because it is more relevant).

If you see pages that have more links than they deserve – they might just be in a stale category or tag page. Based on that, you should revise your category structure to coax Googlebot into crawling deeper, more relevant pages.

Internal links can be overdone, but they are also the simplest links to build. The Internal Links report can help you build them well.

Manual Actions

Google uses a combination of rewards, threats, announcements and manual team reviews to trigger webmaster behavior that leads to better, cleaner signals for the Googlebot.

If a Web Spam team member finds “unnatural” marketing or website behavior, they will let you know about it in this section.

International Targeting

If you are running a global operation, you might have location or language specific URLs. International SEO can get complicated. This section is where Google lets you know of any issues.

If you are rolling out a new language or country-specific section, you’ll need to reference this section.
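If you do run language- or country-specific URLs, the usual signal is an hreflang annotation on each page. Here is a minimal Python sketch that prints the alternate link tags for a hypothetical set of locale URLs; the locales and paths are assumptions for the example.

```python
# Hypothetical locale versions of the same page - adjust to your own URL structure.
alternates = {
    "en-us": "https://www.example.com/en-us/services/",
    "en-gb": "https://www.example.com/en-gb/services/",
    "de-de": "https://www.example.com/de-de/services/",
}

# Each locale version of the page should list every alternate, including itself.
for lang, url in alternates.items():
    print(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')

# A fallback for users whose language/region doesn't match any version above.
print('<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />')
```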

Mobile Usability

Google has stated that it intends to demote sites that aren’t mobile-friendly in mobile search results. In a way, it’s a side-issue to the fact that users hate websites that don’t work well on their devices.

If Googlebot finds any common usability errors, you’ll find them here. You should fix them.

And keep in mind that just because your site “has no errors” does not mean that it is mobile-friendly for users.

Google Index

Google stores copies of your website on its servers. This report will help you figure out exactly what Google has & whether it aligns with your actual website.

Index Status

For a URL to appear in Google’s search results, Google has to have a copy of that page “indexed” on Google’s servers.

If a page is not in Google’s index, then it will not get organic traffic from Google.

You can use this report to make sure that all the pages that you want Google to index are in fact indexed.

Content Keywords

This report “lists the most significant keywords and their variants Google found when crawling your site.” The Content Keywords report is also one of the more misunderstood reports in Search Console.

It has nothing to do with concepts like “keyword density” or how many times you repeat a word on a page. It doesn’t have a lot to do with any given URL’s relevance.

In fact, John Mueller said back in 2012 that content keywords “are not representative of how we view your site’s relevance in web-search, it is purely a count of words from crawling.”

So if it’s just a count of words from crawling – how should you use it?

Two ways.

First, look at the Content Keywords the way you looked at the Internal Links report. Look at it as a whole. Don’t focus on the individual words. Does the “theme” of the content keywords report reflect your website?

If you gave the report to someone who had never been to your website, would they be able to guess what your website is generally about? If yes, then you’re good. If no, then your content is likely not detailed & descriptive enough throughout your site.

For example, take a plumbing website.

Is your Homepage titled – “Home” or is it titled “Acme Plumbing Services”?

Is your Services page – just “Services” or is it “24/7 Plumbing Services in Atlanta, GA”?

Is your Contact page – just titled “Contact” or is it “Get A Plumbing Project Quote!”?

If it’s the former, you’ll just see “Contact” and “Home” and “Services” in the Content Keywords report. Instead, you should be seeing terms like plumbing, Atlanta, drainage, etc.

Second, look for keywords that should not be there. Ignore the miscellaneous terms like pronouns. Look for things that have nothing to do with your product, services or content.

If you see anything for viagra, gambling, etc – then your website has been hacked.

Blocked Resources

As of May 2014, Googlebot can execute JavaScript and load CSS files to get a full picture of your website. If you have blocked Googlebot from accessing those files, you’ll find out here.

Do whatever this report tells you to do – basically, unblock all assets.

Remove URLs

To block Googlebot from indexing a page that you do not want people to find (ie, a coupon, thank you or internal resource, etc) – you’ll need to implement a NOINDEX instruction.

But, that instruction doesn’t kick in until the next time Googlebot crawls the page. In the meantime, you can use this tool to quickly remove a URL from Google’s index.

Keep in mind that the tool will only temporarily hide the URL until you can implement a NOINDEX tag.
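A quick way to confirm that a page really does carry a NOINDEX instruction is to check its robots meta tag and its X-Robots-Tag response header. Here is a rough Python sketch, assuming the third-party requests library; the URL is a placeholder.

```python
import re
import requests

# Placeholder URL - swap in the page you want to keep out of the index.
url = "https://www.example.com/thank-you/"

response = requests.get(url, timeout=10)

# NOINDEX can be sent as an HTTP response header...
header_noindex = "noindex" in response.headers.get("X-Robots-Tag", "").lower()

# ...or as a robots meta tag in the HTML (this pattern covers the common tag order).
meta_noindex = bool(
    re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', response.text, re.I)
)

print(f"X-Robots-Tag noindex: {header_noindex}")
print(f"Meta robots noindex:  {meta_noindex}")
```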

Crawl

To get your URLs into Google’s index, Googlebot must crawl them (ie, “click” on them). This section helps you improve Googlebot’s crawl of your website.

Crawl Errors

This section shows you what errors Googlebot has encountered while crawling your site. The errors are listed by priority.

You can also click on the URL to find out where that URL was linked from. With that information, you can diagnose and correct the error.

The most common error is a 404 Not Found error. If you can replace the link, then replace it. If it’s from an external site, you can reach out to ask to have it fixed. Otherwise, you should implement a 301 redirect from that URL to the correct one, which will tell Googlebot where to find the correct URL.

There is a caveat though. This report is also known as the “go home Googlebot, you’re drunk” report. It’s normal to find URLs and broken links that do not exist. Be sure to actually check whether something is an error or not before attempting to fix it.
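One way to sanity-check the report is to re-request each flagged URL yourself and see what status code it actually returns before deciding whether to redirect, fix, or ignore it. A rough Python sketch, assuming the third-party requests library and a couple of placeholder URLs copied out of the report:

```python
import requests

# Placeholder URLs copied out of the Crawl Errors report.
reported_urls = [
    "https://www.example.com/old-page/",
    "https://www.example.com/some-typo-url/",
]

for url in reported_urls:
    response = requests.get(url, allow_redirects=True, timeout=10)
    if response.status_code == 404:
        print(f"Real 404 - redirect or fix the links pointing here: {url}")
    elif response.status_code == 200:
        print(f"Resolves fine now - likely safe to mark as fixed: {url}")
    else:
        print(f"Returned {response.status_code} - investigate: {url}")
```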

In general, I check this report & fix issues about once per month to ensure Googlebot is crawling all my content & links efficiently.

Crawl Stats

This section will give you statistics on how much Googlebot has been crawling per day. It shows 3 charts. Here’s what you’re looking for –

Look at the Pages crawled per day and Kilobytes downloaded per day together. They should roughly reflect each other.

If Googlebot is crawling a lot of pages, but is not downloading many kilobytes, then it’s likely crawling lots of duplicate or thin content.

If Googlebot is downloading a lot of kilobytes, but is not crawling many pages, then it’s likely run into large assets (big images, PDFs, etc).

Be sure to run a site:[yoursite].com search in Google to determine what Google is showing in the SERPs.

Combine all this analysis to learn a rough benchmark for your website. As you implement best practice changes (more internal links, faster website, etc), you will be able to look for anomalies & diagnose issues. If you are running a large website, you can also use this data to estimate how long it will take Googlebot to find & index site changes.

Note – if you make any major changes to your site, there is usually a “spike” in crawl rates.

The Time spent downloading a page report can help you diagnose server issues. If you are seeing regular spikes, you need to run regular speed tests on your server. If your speed data is also inconsistent, then you should look at upgrading, improving or changing your website hosting.
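If you want your own baseline for page download time, a quick sketch like the one below (again assuming the requests library and a placeholder URL) can log response times that you can compare against the Time spent downloading a page chart.

```python
import time
import requests

# Placeholder page to time - pick a representative URL on your own site.
url = "https://www.example.com/"

timings = []
for _ in range(5):
    response = requests.get(url, timeout=30)
    # requests records how long it took for the response to start arriving.
    timings.append(response.elapsed.total_seconds())
    time.sleep(1)  # small pause so we don't hammer the server

print(f"min {min(timings):.2f}s / avg {sum(timings) / len(timings):.2f}s / max {max(timings):.2f}s")
```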

Fetch As Google

Fetch As Google allows you to see a URL on your website the way Googlebot sees it.

Enter the URL and either “fetch” it (Googlebot will crawl it) or fetch and render (Googlebot will crawl it and show you how Googlebot sees the page).

After fetching, you can submit it to Google’s index. This is a powerful tool when you need Google to find page changes right now. When you submit to index, Google will typically have the updated page in its index very quickly. If you have a new website section that needs to be re-crawled immediately, you can also tell Googlebot to crawl all linked URLs.

To diagnose a technical issue, you can click on the page line to view both the visual and source rendering that Googlebot has for the page.

You’ll be able to see if you’re blocking any resources Googlebot needs to view the page. Be sure to check both the Fetching and Rendering tabs.

If Googlebot couldn’t access all the page resources, you’ll find out what they are, where they are located, and their priority below the rendered page.

Robots.txt Tester

The Robots.txt Tester tool shows any errors, warnings or unintended consequences in your robots.txt. Remember that your website’s robots.txt file is the primary means for controlling how bots access your website.

Make sure your robots.txt file is blocking access to files that you don’t want crawled – and no more. It’s a blunt, but effective tool.

Find out if your robots.txt file is blocking a specific URL with the tool below the dashboard. Note that you can switch Googlebot user agents if you are having issues with indexed video, images or news.
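You can run the same kind of check outside Search Console with Python’s standard library robots.txt parser. A minimal sketch, using example.com, Googlebot, and the listed paths as placeholders:

```python
from urllib.robotparser import RobotFileParser

# Point the parser at your live robots.txt file.
parser = RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Ask whether a specific user agent may fetch a specific URL.
for path in ["/", "/wp-admin/", "/blog/some-post/"]:
    url = f"https://www.example.com{path}"
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'ALLOWED' if allowed else 'BLOCKED'}: {url}")
```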

Sitemaps

Search engines use sitemaps to complement their crawl of your website. Think of them as a “map” of your website. As Googlebot crawls your site, it will also look at your sitemap for guidance – and vice versa.

Sitemaps must be in XML format. XML Sitemaps must have few or no errors – otherwise Googlebot will start to ignore them (though it’s generally not as ruthless as Bingbot).
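For reference, a bare-bones XML sitemap only needs a urlset element with one url/loc entry per page. Here is a minimal Python sketch that writes one with the standard library; the URLs are placeholders, and in practice the list would come from your CMS or a crawl.

```python
import xml.etree.ElementTree as ET

# Placeholder URLs - in practice this list comes from your CMS or a crawl.
pages = [
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/contact/",
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for page in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = page

# Write a UTF-8 encoded sitemap.xml with an XML declaration.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```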

Use this report to find errors that need to be fixed and to reverse engineer why certain pages are not getting indexed.

URL Parameters

The URL Parameters tool is an advanced tool for “coaching” Googlebot’s crawl on websites that use page parameters. It’s mostly used for large ecommerce websites.

You’ll need to use this tool if Googlebot is getting trapped crawling the same page because of parameters. However, I’d recommend hiring an SEO consultant who knows what they are doing.

This tool is rarely effective (in a positive way) on its own. You generally have to make technical changes for it to make a difference with your site’s crawl. On the flip side, the URL parameter tool can hurt your website when used on its own incorrectly.

If you really want to use it yourself, you need to make an index of all your site’s parameters. Learn what each does; when your website generates parameters; and how they behave.
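As a starting point for that inventory, here is a small Python sketch that tallies every query parameter found in a list of URLs (say, exported from your server logs or a crawl); the URLs shown are placeholders.

```python
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Placeholder URLs - in practice, pull these from server logs or a site crawl.
urls = [
    "https://www.example.com/shoes?color=red&sort=price",
    "https://www.example.com/shoes?color=blue&page=2",
    "https://www.example.com/shoes?sessionid=abc123",
]

parameter_counts = Counter()
for url in urls:
    query = urlparse(url).query
    for parameter in parse_qs(query):
        parameter_counts[parameter] += 1

# Review each parameter: does it change page content, narrow it, sort it, or do nothing?
for parameter, count in parameter_counts.most_common():
    print(f"{parameter}: seen in {count} URL(s)")
```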

In the URL Parameters tool, you will then tell Googlebot what each of those parameters does. Also, you’ll give Googlebot instructions on what to do with those URLs.

Security Issues

Site hacks are not good for you or your users. Unfortunately, a hack can go undetected for a while. If Googlebot finds evidence of a hack, it will let you know here.

Other Resources

Google has many other resources dedicated to specific issues. You’ll find those tools here whether you are a local business, ecommerce store or a developer.

Next Steps

No matter how small your website is – you need to have a verified Google Search Console account. Make sure all versions of your website (HTTP and HTTPS, WWW and non-WWW) are verified as properties.

After that, remember that Google Search Console is a tool. It doesn’t do anything on its own for your website. Start learning how to use it, what the data means, and implement website changes to keep improving your website & marketing.

Editor’s Note – I wrote this post for DIYers and non-professional SEOs. My goal is to simplify some of the jargon & data so that non-professionals can use it effectively. I fact-checked all statements against Google statements & respected industry tests. However, if there are any facts or phrasings that you think are inaccurate, please contact me!

