2016-10-05

What Is Google Webmaster Tool?

Google Webmaster Tool (GWT), also known as Google Search Console (GSC), is a free and very powerful tool that is essential for any webmaster, just like Google Analytics. It is hard to imagine working on your SEO or building an SEO strategy without using Google Webmaster Tool. GSC provides insight into how Google crawls, indexes, and ranks your pages in the search results (SERP), through a series of comprehensive reports and tools.

In this article, I’m going to explain how to use Google Webmaster Tool as part of your SEO work by analyzing, one by one, each of the major sections of the menu: “Search Appearance,” “Search Traffic,” “Google Index,” and “Crawl.”

The “Search Appearance” menu to optimize how your pages appear in search engines

This first menu of Google Webmaster Tool lets you see how the pages of your website appear in the search results. This is very important because the way your pages are displayed in the SERP influences users’ click-through rate. The Google search engine has evolved significantly in recent years and offers new, richer display modes: display modes that rely on semantic markup, which enriches the HTML of your pages so that search engines “understand” their content better.

Structured Data

The first sub-section of the Search Appearance menu lets you verify that your structured data markup has been integrated correctly and alerts you to possible errors:

Structured markup allows Google to show enriched results in the form of “rich snippets” with stars, prices, ratings, the number of calories (for recipes), the number of reviews, etc.:

The “Structured Data” report not only lets you check for errors in your markup, it also provides a tool to identify the source of the errors encountered.
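
To give a concrete idea of what this report checks, here is a minimal, hypothetical example of structured data markup using JSON-LD and the schema.org Recipe type (all the values are invented for illustration):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Recipe",
      "name": "Chocolate Cake",
      "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.7",
        "reviewCount": "213"
      },
      "nutrition": {
        "@type": "NutritionInformation",
        "calories": "420 calories"
      }
    }
    </script>

If a property is missing or malformed, this is the kind of error the “Structured Data” report will flag.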

Rich Cards

You can perform the same checks on your Rich Cards (if you use them). Rich Cards are a very new display mode introduced by Google and were still being rolled out at the time of writing. Here is what a “Rich Card” looks like.

With Google Webmaster Tool, you can check for errors in your Rich Cards and also spot opportunities to enrich them further:

Data Highlighter

“Data Highlighter” is an alternative to the HTML markup method. The goal is the same: to visually enhance the way your pages appear in Google. Data Highlighter is easier to implement when you do not have programming skills. The “Data Highlighter” subsection lets you set up the markup very easily (using your mouse) and check how your tagged pages are displayed in Google:

HTML Improvements

The “HTML Improvements” subsection is very interesting and very useful. It helps you optimize your meta description and title tags. More precisely, it allows you to:

Identify duplicate title tags and meta description tags. This is also an opportunity to recall how important it is to customize each of your title tags and each of your meta descriptions. Having several pages of your site share the same title tag and meta description can have a negative impact on your SEO, but also on the user experience. These tags, which both appear in Google’s search results, briefly describe the content of the page and play an important role in the click-through rate.

Identify tags that are too long or too short. Remember that the title tag should not exceed 65 characters including spaces, and the meta description tag should not exceed 160 characters including spaces.

Another plus: the “HTML Improvements” subsection can also be used to identify duplicate content on your site (pages published more than once).
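
For reference, a well-optimized page head might look like the sketch below (the page and wording are hypothetical), with a unique title under 65 characters and a unique meta description under 160 characters:

    <head>
      <title>Handmade Leather Bags - Free Shipping | ExampleShop</title>
      <meta name="description" content="Browse our collection of handmade leather bags. Free shipping on all orders, 30-day returns and a 2-year warranty on every bag.">
    </head>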

Sitelinks

The “Sitelinks” sub-section lets you control the display of sitelinks, that is to say, the links that may appear under the main result. These are generally your site’s categories:

“Control” is perhaps a bit of a strong word. In fact, it is Google, and Google alone, that decides through its algorithms whether to show sitelinks. However, you can keep a page out of the sitelinks by “demoting” it. In other words: you cannot decide which links are displayed, but you can choose which ones should not appear.

Accelerated Mobile Pages

The last sub-section of the “Search Appearance” menu, “Accelerated Mobile Pages,” is new to Google Webmaster Tool. It allows you to optimize the display of your pages on mobile devices by making them more “mobile-friendly.” Using this subsection requires programming skills. For more detail on Accelerated Mobile Pages, you can refer to Google’s search guide.
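
Without going into the full AMP markup, here is a minimal, hypothetical sketch of how a regular page and its AMP version typically reference each other so that Google can discover the AMP version (the URLs are invented):

    <!-- On the regular (canonical) page -->
    <link rel="amphtml" href="https://example.com/article.amp.html">

    <!-- On the AMP page -->
    <link rel="canonical" href="https://example.com/article.html">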

The “Search Traffic” menu to analyze your organic traffic, your backlinks, and your internal links

This section of Google Webmaster Tool allows you to learn a lot about your organic traffic, in particular the source of your visitors and the source of your backlinks, but also about your internal linking structure. This menu is probably the most important one from a purely SEO perspective.

Search Analytics

The first subsection, “Search Analytics,” provides access to many reports on traffic to your website. This is a real gold mine. Filtering options give you access to very detailed statistics and let you make interesting comparisons.

This subsection lets you see the number of clicks per keyword, the number of impressions per keyword, the CTR (clicks divided by impressions; 50 clicks on 1,000 impressions, for example, is a CTR of 5%), the keywords on which your pages rank well, etc. As mentioned above, you can also compare all these statistics over time, by country, or by type of device used.

Links to your site

This subsection allows you to learn a lot about your backlinks, that is to say, all the links on other sites pointing to your content. You can see the domains that link to you most often, the number of backlinks to each of your pages, and the keywords associated with your backlinks (based on the anchors used). These three lists can be exported to .csv or Google Docs.

You can see in reasonable detail who links to your site and which pages are the most linked to. But let us be clear: Google Webmaster Tool provides interesting information on your backlinks, but it is not sufficient for a complete analysis of them. You will need to use other tools, such as Majestic SEO or Ahrefs.

Internal Links

Google Webmaster Tool can also analyze your internal links, that is to say, the links that connect your pages to each other. The Google Search Console report gives you, for each page, the number of internal pages that link to it.

This report is useful for checking your internal linking strategy and optimizing the structure of your internal links. The more internal links a page receives, the more it is valued by Google. Internal linking is notably a good way to give value to deep pages.

Manual Actions

The “Manual Actions” subsection lets you know when you have received a manual penalty from Google because of spam or “black hat SEO” techniques. To avoid penalties, I suggest staying away from risky SEO tactics.

Good to know: manual penalties must be distinguished from automatic penalties. In the first case, it is a Google employee who penalizes you; in the second case, it is the Google algorithm. Algorithmic penalties are not listed in this subsection.

Manual penalties are much heavier than automatic penalties and can have a dramatic impact on your SEO. They may even consist of an outright de-indexing of your site. In the “Manual Actions” subsection, you can find out why you have been penalized.

Have you been hit by a manual penalty? Start by fixing the issue that caused the penalty, then click on “Request a Review” so that Google’s teams lift the penalty.

International targeting

The “International targeting” subsection lets you serve the appropriate localized version of your website according to the geographical location of the visitor, through the use of the “hreflang” tag. If your site is translated into English, you can indicate via this section that Google should automatically serve the English version of the site to users living in English-speaking countries. Google itself uses this technique: when you type “google.com” into your browser, you may be redirected to “google.co.in” or to “google.com.ca”.
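
As a minimal sketch (the URLs and language codes are hypothetical), hreflang annotations placed in the <head> of each language version might look like this:

    <link rel="alternate" hreflang="en" href="https://example.com/en/">
    <link rel="alternate" hreflang="fr" href="https://example.com/fr/">
    <link rel="alternate" hreflang="x-default" href="https://example.com/">

Each version lists all the others, and “x-default” designates the fallback page for users who match none of the listed languages.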

Mobile usability

The “Mobile usability” subsection lists all the problems related to the display and, more broadly, to the usability of your website on mobile devices.

These problems may affect the overall ranking of your site. Optimizing mobile usability will also allow you to deliver a better mobile user experience to your visitors. Remember that more than half of all searches are now made from a mobile device. The stakes are high.

The “Google Index” menu to analyze the indexing of your pages by Google

This menu lets you monitor how Google indexes the different pages of your website and identify any indexing errors. Remember that an indexed page is a page that has been added to Google’s index and can therefore be displayed in the search results. Indexing should not be confused with ranking, the latter referring to the position of your pages on target keywords. The “Google Index” section is divided into four subsections:

Index Status

Content Keywords

Blocked Resources

Remove URLs

Index Status

This subsection shows how the number of pages indexed by Google evolves over time. If the number of pages on your website increases with time (as is the case for most sites), you will see a rising curve.

If you use a robots.txt file, the graph will display a second curve corresponding to the number of blocked pages (in the “Advanced” tab). I will come back to how robots.txt works, and why it is useful, further down in this article.

Content Keywords

This subsection shows which keywords are used most often across the pages of your site, from a global point of view. The blue bar to the right of each keyword indicates the significance of that keyword.

By clicking on a keyword, Google Webmaster Tool gives you the list of the main pages where that keyword appears:

Keywords remain the keystone of SEO. This subsection allows you to optimize the presence of keywords on the different pages of your site and, possibly, to understand why some of your pages rank less well than others.

Blocked Resources

This subsection allows you to identify all the resources blocked by your robots.txt file. These may be JavaScript resources, CSS, images, jQuery, etc. The fact that some resources are blocked may have a negative impact on the indexing or rendering of the pages concerned.

Remove URLs

From this subsection, you can choose to manually de-index certain URLs / pages of your website. De-indexed pages no longer appear in the search results. This feature is useful to protect access to your private pages and the private data of your members, but also to keep pages that are of no interest to the user out of the SERP.

Note: de-indexing certain low-value pages can have a positive impact on your SEO. It prevents the low quality of these pages from dragging down the other pages and their rankings.
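
As a side note (this is a general, complementary mechanism rather than part of the “Remove URLs” tool itself), a page can also be kept out of the index over the long term with a robots meta tag placed in its <head>:

    <meta name="robots" content="noindex">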

The “Crawl” menu to see how your pages are crawled by Google

The “Crawl” menu looks at how your pages are explored (“crawled”) by Google’s bots. Crawling is the step that comes before indexing: Google starts by crawling the pages of the site before indexing them in its search results. This section has six subsections:

Crawl Errors

Crawl Statistics

Fetch as Google

robots.txt Tester

Sitemaps

URL parameters

Note: the three stages are crawling, indexing, and ranking (search traffic). There is no indexing without prior crawling, and no SEO without indexing. One may notice, and wonder why, the Google Webmaster Tool menu presents these steps in reverse order: Search Traffic (SEO), Google Index (indexing), Crawl (crawling).

Crawl Errors

This sub-section shows you all the errors encountered by Googlebot when exploring your pages: server errors, 404 (page not found) errors, redirect errors, etc. I advise you to come back to this sub-section regularly to quickly fix any crawl errors that may arise. Crawl errors have a negative impact on your SEO.

Crawl Statistics

From this sub-section, you can see, for the last 90 days, the number of pages crawled per day by Googlebot, the number of kilobytes downloaded per day, and also your page load times as seen by Googlebot. The goal is to increase the number of pages crawled and reduce the loading time.

A rising load time combined with a falling number of pages crawled reveals a performance problem on your site. To identify areas for improvement on the performance side, I recommend using the free tool PageSpeed Insights. A slow loading time has a negative impact on your SEO.


Fetch as Google

The “Fetch as Google” subsection is a tool that allows you to test how Google fetches your URLs and renders your pages. To use the testing tool, you must:

Select the URL you want to test.

Click “Fetch” to check the validity of the URL and identify possible errors (security, redirection, etc.).

By clicking “Fetch and Render,” you can see how Google displays your page.

To analyze your test results, note the possible statuses:

“Done” means that Google has crawled the page in question without any problems.

“Partial” means that some page resources are blocked, which poses a problem with the page display.

“Redirecting” means that the URL redirects to another page (a redirect that works).

“Not found” means that Google does not find the URL on the server.

“Unauthorized” means that Google was not authorized to access the URL. This is usually a page with restricted access (requiring authorization).

“Blocked” means that the page in question is blocked by your robots.txt.

“Error” means that an error has prevented Google from accessing the URL.

This tool is very useful for understanding why some of your pages are not crawled by Google and do not appear in the search results (= are not indexed). For the pages that have a problem, you must fix the problem / optimize the page and then resubmit it to Google for indexing.

robots.txt Tester

You can create a robots.txt file when you want certain pages of your website not to be indexed. You must configure your robots.txt to indicate which pages you want to block. Specifically, the robots.txt file prevents Googlebot from crawling the pages in question. The robots.txt file can be useful for blocking pages containing private data or low-value content that could negatively impact your SEO.
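
As a minimal illustration (the paths are hypothetical), a robots.txt file that blocks a private directory and internal search results for all crawlers, and that declares the sitemap, could look like this:

    User-agent: *
    Disallow: /private/
    Disallow: /search/

    Sitemap: https://example.com/sitemap.xml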

The “robots.txt Tester” sub-section allows you, as its name suggests, to test how your robots.txt behaves, to check that each blocked page is handled without error (= that it really is blocked), and, if necessary, to modify the file accordingly.

Sitemaps

To optimize your SEO, you absolutely must create a sitemap (sitemap.xml) for your website. It enables Google to better understand the architecture of your site and how the various pages fit together. The better this architecture is understood, the better your pages will be indexed. The “Sitemaps” subsection lets you submit your site’s sitemap to Google and shows you the number of pages submitted and the number of pages indexed. Google Webmaster Tool informs you of any errors in your sitemap.
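
For illustration (the URLs and dates are hypothetical), a very simple sitemap.xml follows the sitemaps.org protocol and might look like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://example.com/</loc>
        <lastmod>2016-10-01</lastmod>
      </url>
      <url>
        <loc>https://example.com/blog/</loc>
        <lastmod>2016-09-20</lastmod>
      </url>
    </urlset>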

URL parameters

This last sub-section is aimed at people with good programming knowledge. A configuration error in the URL parameters can cause big indexing problems, and therefore SEO problems as well. Better to avoid playing the sorcerer’s apprentice.

Security issues and other resources

I will pass over these two menus quickly. The “Security Issues” menu lets you identify, as the name suggests, the presence of security problems. It is in this menu that you can find out whether your site has been hacked (e.g. through a phishing operation). For each problem detected, Google Webmaster Tool gives you tips to fix it. The “Other Resources” section lists a set of Google tools and services that you may find useful, including Google My Business for local SEO, PageSpeed Insights, and the Structured Data Markup Helper, which helps you add semantic markup to the pages of your site.

You now know all the features of Google Webmaster Tool. As you can see, Google Webmaster Tool is a very comprehensive tool and, at the same time, fairly simple to use (compared with Google Analytics). It is clearly essential from an SEO perspective, but also for improving the performance of your website more generally.

