2016-11-11



This case study was created using an LRT Superhero account.

Some of the use cases explained in this case study are not available in lower plans.

The LRT Superhero Plan (and higher) includes all our 24 link data sources and allows you to perform link risk management, competitive research, professional SEO and backlink analysis for your own or your competitors' sites. You get to see your website's full backlink profile, and this can make all the difference for your SEO success.

Become a Superhero!

Competitive Link Risk Management in the Italian ads market

Our LRT Certified Professional, Stefano Robbi, provided us with a great case study that already shows some fluctuations caused by the real-time Penguin rollout in the Italian ads market.

He gives the example of an anonymous client, who managed to outperform his competitors by auditing and then reducing his link risk.

It is a very interesting story, as the goal was first and foremost to protect his client's site against Google Penguin. This is yet another proof that a strong backlink profile is key to success in today's SEO. We are very excited to get some real-life footage of Penguin 4.0 penalties and very proud that Stefano decided to share his findings with us.

We are also very glad that our tools helped Stefano boost his client's rankings so well while reducing his risk of being penalized.

We look forward to your feedback and would appreciate your help in spreading the word!

- Enjoy & Learn!

Christoph C. Cemper

and the team of

LinkResearchTools (LRT)

Bonus: Download the full version of this case study as PDF for easy print or offline reading!


Table of contents

Introduction

Brief excursus: the client

Competitive environment analysis

Bulk URL Analyzer (Juice Tool)

Quick Domain Compare (QDC)

Competitive Landscape Analyzer (CLA)

Link Audit

The theory behind a successful Link Audit

The practice: how we did the Link Audit

Disavowing the bad links

Last but not least: speed up the disavow process

Results and final consideration

What happened next?

Conclusion



Introduction

The algorithmic update known as Google Real-Time Penguin started rolling out on September 23rd, 2016. In the days that followed, we heard the first testimonials of recovery from websites penalized by the previous update (Penguin 3.0) in October 2014, while other websites struggled with sudden ranking decreases.

In this time of general uncertainty about how to proceed, savvy web marketers are investing financial and human resources to determine the risk profile of their websites. Although years have passed since the previous official Penguin update (October 2014), the memory of those negative trends lingers among penalized websites, and the idea that the same could happen to their business feels increasingly tangible to entrepreneurs.

Some websites lost 30% of their traffic, others 45%, and the most affected ones lost more than three-quarters of their total organic visits. Penalties of such magnitude inevitably have an immediate impact on the business, resulting in lost sales, lower profits, and often layoffs.

Facing the fear of an imminent threat, every successful manager needs to make decisions that will save the company, while weighing all the available alternatives to see whether the threat can be turned into an opportunity for professional or personal growth.

Brief excursus: the client

In March 2016, we were contacted by an Italian client who operates in the online ads market (jobs, real estate listings, etc.). The company, whose name we can't reveal because we signed a Non-Disclosure Agreement, was at that time one of the top 10 players in its sector, with earnings and revenues growing year after year, but with a market share well below the first three competitors.

The Italian ads market, in addition to being extremely competitive both online and offline, is also an environment where spammy links and low-quality content are commonplace.

Nonetheless, the website in question was going through a relatively positive period, with moderate but steady monthly growth in organic traffic. The director of the company contacted us expressing his concern: he had noticed that three of its competitors, the ones with the best growth trend over the last year, had just been penalized by Google. His fear was to be next on Mountain View's blacklist.

Competitive environment analysis

Performing a link audit means analyzing in detail the inbound links of a particular website, understanding their overall risk profile, and implementing the steps necessary to eliminate the identified threats.

Contrary to common belief, a professional link audit does not only involve the client's website; it should also include the major competitors in the field. Identifying the links only requires focusing on a single website, but to carefully judge which of those links are risky or dangerous, we must know what the typical link profile of a top player looks like.

In a hyper-competitive market such as car insurance, in fact, the link risk of a major player's profile is much greater than in less concentrated sectors such as, for example, agricultural products. For this reason, our analysis started with an initial evaluation of the market in which our client operates. For this purpose, we used three instruments included in the LRT Superhero subscription:

Bulk URL Analyzer (Juice Tool)

Quick Domain Compare (QDC)

Competitive Landscape Analyzer (CLA)

Bulk URL Analyzer (Juice Tool)

The Bulk URL Analyzer is a great starting point for understanding the principal competitive dynamics between the various players in the market. With this tool, we can quickly get an initial overview of the competitiveness of the sector in which our client operates.

Unlike other tools that we will analyze later in this case study, the Bulk URL Analyzer can simultaneously analyze tens or even hundreds of different domains.

In our case, we used this tool not only to analyze the major competitors in the market but also to locate them. The program has a built-in "Find Competing Pages" feature: simply enter some keywords of interest, and the program will automatically return the first 20, 30, or 50 Google results, as domains or URLs.

In just a few seconds we get a list of the major players in our industry and all the input data necessary to perform the analysis.

Before proceeding, however, you must tell the program which analysis mode you prefer: "sample analysis" or "full analysis." This point needs some clarification: the "sample analysis" returns results very quickly (in only a few minutes), but the program will only return the data it already has in memory for the websites we want to analyze.

Almost all SEO tools on the market today work this way, but the problem is that they return data measured in the past, which may differ drastically from the actual current values. The real added value of the LinkResearchTools suite is its unique ability to perform a full re-crawl and verify everything in real time before returning the results. Through the "Full Analysis," the program performs a real-time check of all the off-page metrics analyzed and returns an accurate, up-to-date result with each query.

That is why the advice is always to select the "Full Analysis" whenever you need accurate data to make critical, far-reaching decisions. In the end, the wait for up-to-date results will be rewarded!

After selecting the "Full Analysis" and entering the top 20 sites ranking on Google.it for the main keyword of our client's sector, we got the results in less than 20 minutes.

For each domain, the Bulk URL Analyzer can return up to 20 different metrics, ranging from domain age to domain Power and Trust, not to mention Traffic Indicators and data on connected social media profiles. Here too, LRT does an excellent job in partnership with the APIs of the leading SEO tools on the market today (Moz, SISTRIX, SEMrush ...): a single search returns all these metrics for each of the sites analyzed.

Below is an excerpt of the result, limited to six metrics for easy reading.

Thanks to this tool, we get a general overview of the market sector in which we are competing in these very early stages of the analysis.

By analyzing all the metrics provided by the Bulk URL Analyzer we could make many hypotheses, but for now let's highlight just two significant points:

The Power*Trust score of the 20 domains analyzed is highly variable. While four competitors have a Power*Trust equal to or greater than 25, most fall between 9 and 15.

Also, the number of keywords for which the various competitors rank on Google is very heterogeneous. The most established players have tens of thousands of keywords at the top of the organic results, while smaller ones focus on niche keywords.

What does all this mean? It means that the market we are analyzing sees the simultaneous presence of large websites and smaller competitors, and both manage to appear in the top 20 of Google's organic results. The differences in overall link profile are evident from the graph above, but the defensive niche strategy implemented by the smaller competitors allows them to compete effectively against players with more resources.

At this point, having obtained a first general overview of the market, we add more detail to our analysis and focus on a smaller number of competitors, to look more closely into their competitive strategies.

Quick Domain Compare (QDC)

With this tool, LRT lets us perform a more specific analysis, comparing our customer's website with four other competitors chosen according to the purpose of our analysis. Having already run the Bulk URL Analyzer report, we are now in a position to consciously decide which players we want to compare more closely.

The results provided by QDC obviously depend on the websites you decide to compare your domain to. Simply comparing your website with the top four players in the sector could give you a limited perspective of the market. In our analysis, we compared our client's website both with the four competitors ahead of it and with those just behind it in the rankings.

Below are the first indications we can draw from this initial analysis:

Comparing our website with the top players makes what the initial analysis suggested evident: the competitive gap between the top three market leaders and the rest of the competitors is extremely relevant. The lowest LRT Power*Trust among the top three is 16, while from the 4th competitor onward the highest value is 6. In terms of the absolute number of links per domain, the lowest value among the top three is 114,822, while from the 4th onward the highest value is 38,157.

While the first three players are playing a separate match, we still do not know how close the immediate pursuers are. That's why we need to repeat the same QDC report, this time analyzing the four direct competitors that precede our client in the organic rankings.

The situation, in this case, is diametrically opposite. From the 5th to the 9th Google result for the sector's main keyword, almost all the websites (including the client's) have the same LRT Power*Trust value. This does not mean that the five analyzed websites have the same Power and Trust ranks, but that the product of these two quantities is very similar.

As other previously analyzed case studies have shown, Penguin's algorithmic penalties have often involved websites whose Power values were higher than their Trust values, which usually means a large number of links from low-authority domains.

It's interesting to notice that jobnetwork.it is the only competitor that shows a negative Link Velocity Trend (LVT) and, at the same time, the only one whose Power is greater than the Trust of the links received. This is possibly an early sign that the website has begun to change its link acquisition methodology, shifting its focus from quantity to quality.

Before proceeding with a more detailed analysis, we still have to compare our client's website with the competitors that follow right behind it in Google's organic results.

Even in this case, it's interesting to notice how the LRT Power*Trust value is identical in three of the five websites analyzed. Then, by relating the two websites with different values to their link velocity trends, we come to another conclusion.

The one with the highest LRT Power*Trust value has a decreasing Link Velocity Trend (-25%); the one with the lowest LRT Power*Trust value is doing everything possible to gain new links quickly (+100%) and reach the LRT Power*Trust value of its competitors. The tendencies of these two companies, therefore, seem at first sight to converge on the average value of the other players.

After these initial analyses with the Quick Domain Compare, we now have a clearer understanding of the market our customer's website is part of. In particular, we can deduce that:

the first three players have a very large competitive gap in terms of link juice compared to all the other pursuers;

from the fourth to the fifteenth organic result, the LRT Power*Trust values of the various websites are very similar;

within the first fifteen results, there are websites with far more incoming links than average. Some of them are now slowing their pace of acquiring new links in favor of new links of superior quality.

Based on these initial evaluations, and aware that our client's website ranked 9th on Google for the main keyword in the industry, we realized there was room for maneuver to take a step forward in the rankings, positioning it right behind the first three results.

To achieve this, we needed to carry out another, much more in-depth analysis, highlighting the differences between our client's website and the top three competitors. Once identified, the next step would be to design a strategy to gradually reduce them.

Competitive Landscape Analyzer (CLA)

Once you click on the tool's name inside the LRT dashboard, a window appears immediately: it asks for the URL of the website you want to analyze and for the websites you want to compare it with.

If we do not know the names of the main competitors, we can use the built-in Bulk URL Analyzer feature we saw before for finding competing pages.

It is important to point out that this instrument can return results both at the domain level and at the level of individual pages. If the work covers the entire website, it is almost always advisable to select "Domains" from the last drop-down menu.

Once the ten websites for the competitive analysis have been identified, you must select the advanced metrics you want the tool to include in its report.

With the LRT Superhero account, you can choose up to 10 advanced metrics for each analysis, and up to 15 with the Enterprise account. Depending on your account, simply repeat the same analysis two or three times, selecting different metrics each time, to get the details of every metric listed above.

Below we show some of the main metrics available in the Competitive Landscape Analyzer:

After a few hours of analysis, the overall results came out. Below is the main graphic:

Here's the first problem. The website's analysis shows a clear disproportion of inbound links with money keywords as anchor text: 26%, versus just 7% for the first three websites. The same disparity is mirrored in the brand anchor text values: 54% for our client's website, against 88% for the top players.

Links with money keywords as anchor text are the most serious threat to the health of a website, and in the case studied, the clear disproportion compared to the average values of the best-positioned competitors should trigger an alarm.

Another problem emerges quite clearly from the second chart, which compares the inbound links pointing to the website's homepage with those pointing directly to internal pages.

As you can see, our client's website has an excessive number of links pointing to its homepage compared to the top three, while it is quite in line with the average of the top ten organic results. All this is consistent with the website's current position on Google (8th), but it provides important information about how the company should approach its future link earning campaigns. A couple more graphs and we will return to this point.

Now we ask ourselves: what kinds of inbound links have our various competitors acquired?

In this case too, the metrics recorded on average by the first ten websites are aligned with those of our customer's website, but the first three have different values: a greater share of NoFollow links than their direct pursuers. The percentage difference is smaller than in the previous graphics, but the proportional relationship between a website's organic rankings and its percentage of NoFollow links is still an indicative value in this area.

However, we must not believe that simply increasing the incidence of NoFollow inbound links will boost organic rankings; still, it's worth taking note of the current situation for the recommendations that follow.

Before proceeding any further, it is important to consider at least two other elements. The first one is the type of inbound links of the various websites we analyzed, effectively summarized in the following graph.

As can be seen, 94% of our customer's total inbound links are in text form, and only 3% come from images. Normally, these values would be out of proportion, but in the market in which the website operates, the differences between top players are not large enough to justify penalties.

The only improvement worth highlighting here is the opportunity to strengthen the overall link profile by acquiring links in the form of images rather than text, perhaps through infographics or other techniques. Since our customer's website only gets 3% of its links from images, increasing that share would give more uniformity to the link profile in the eyes of search engine algorithms.

The last important aspect that emerged from the Competitive Landscape Analyzer concerns the distribution of links in relation to the LRT Power*Trust metric, which, as we have already explained, defines the numerical value of a link. Simply put, the higher a link's LRT Power*Trust value, the greater the weight of that particular link in determining a website's organic ranking.

The distribution, in this case, is unequal. Our client has many (probably too many) incoming links from websites with an LRT Power*Trust of zero and very few links from authoritative websites (LRT Power*Trust > 5).
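To make the distribution concrete, here is a minimal sketch that groups inbound links into the same buckets discussed above. The sample scores are invented for illustration; they are not our client's actual data.

```python
from collections import Counter

def power_trust_distribution(scores):
    """Count links falling into the Power*Trust buckets 0, 1-5, and >5."""
    def bucket(pt):
        if pt == 0:
            return "0"
        return "1-5" if pt <= 5 else ">5"
    return Counter(bucket(pt) for pt in scores)

# A profile shaped like our client's: mostly zero-value links, few strong ones.
sample_scores = [0, 0, 0, 0, 0, 1, 2, 3, 7]
print(power_trust_distribution(sample_scores))
# Counter({'0': 5, '1-5': 3, '>5': 1})
```

A profile dominated by the "0" bucket, as here, is exactly the pattern the CLA flagged for our client.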

At the end of this extensive analysis carried out with the Competitive Landscape Analyzer, we can now summarize the most important considerations about our client's link profile.

An excessive amount of money keywords as anchor text for inbound links (+20% compared to the top 3 players).

An incorrect distribution of inbound links across the website's pages: the share of links pointing to the homepage rather than to internal pages exceeds the ideal value of the top 3 players by 14%.

The total of links obtained is properly divided between NoFollow and DoFollow, but the result could be optimized by increasing the incidence of NoFollows by about 5%. The same reasoning applies to link types: increasing the incidence of links from images would give much more uniformity to the overall link profile.

The inbound links received have too low an LRT Power*Trust: 85% of them have an LRT Power*Trust of 0, and none exceed the value of 8.

Link Audit

The theory behind a successful Link Audit

All the considerations just formulated are crucial, as they allow us to understand how we should optimize the website's link profile to improve its performance in the organic results of search engines.

We will now start the actual link audit work: for each inbound link, we need to understand whether and why it should be removed. Thanks to the joint work of the Quick Domain Compare (QDC) and the Competitive Landscape Analyzer (CLA), we already know, before we start, which kinds of potentially harmful links we should try to remove. At the same time, we can act to smooth out the differences that emerged in the four points listed above.

4 steps to perform a Link Audit

Firstly, it's necessary to find the largest possible number of inbound links. To do so, we can use various tools such as Google Search Console (formerly Google Webmaster Tools), Ahrefs, Searchmetrics, Alexa, and many others. Each instrument has its own peculiarities and is able to identify specific types of links. By putting together the results obtained in the individual analyses, you can re-create a full link profile, very similar to the real one. LinkResearchTools can combine all this data via API integration and serves as the platform for making decisions on the overall aggregated and recrawled results.

Once identified, the links are then processed and analyzed to determine their relative risk. This coefficient, as we have repeatedly seen in the paragraphs above, is strongly influenced by the characteristics of the market in which the website operates. This is why we thoroughly analyzed the competitors before starting the link audit proper.

Once the risk factor has been established, we must act on two fronts in parallel. On the one hand, webmasters have to remove the negative links; on the other hand, a Disavow File must be created, asking Google not to consider the negative links that remain.

Finally, speed up the process by which Google takes note of the removals and of the Disavow File.

The practice: how we did the Link Audit

After using all the tools at our disposal to download the list of our client's inbound links, we compiled a single list in a text file. It is normal for the same links to appear multiple times in the list, simply because they can be detected by different tools. Before proceeding further, we can import the raw list of links into any data processing program and remove the duplicates, so we can then assess the risk profile of the identified links.

With Excel, for example, you simply go to the "Data" menu and select "Remove Duplicates."
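The same de-duplication can also be scripted. A minimal sketch, assuming each tool's export is a plain list of URLs (the sample URLs are invented); it also strips a trailing slash so the same page exported by two tools counts only once:

```python
def merge_link_lists(exports):
    """Merge backlink exports from several tools into one
    sorted, duplicate-free list of URLs."""
    seen = set()
    for export in exports:
        for url in export:
            # Light normalization: strip whitespace and a trailing slash
            # so "http://a/b" and "http://a/b/" count as one link.
            normalized = url.strip().rstrip("/")
            if normalized:
                seen.add(normalized)
    return sorted(seen)

gsc_export = ["http://spammy-directory.example/page1",
              "http://blog.example/post/"]
ahrefs_export = ["http://blog.example/post",
                 "http://forum.example/thread"]
print(merge_link_lists([gsc_export, ahrefs_export]))
# ['http://blog.example/post', 'http://forum.example/thread',
#  'http://spammy-directory.example/page1']
```

Real exports would need more normalization (scheme, case, query strings), but the principle is the same as Excel's Remove Duplicates.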

At this point, we have a list of unique domains and pages that link to our client's website. To determine the risk coefficient, we use the Link Detox tool.

Why Link Detox?

Unlike any other instrument, Link Detox weighs the threats posed by inbound links in real time, and this is crucial for us. Real time means that, when activated, it does not show data stored in the past or kept in a cache; instead, it starts a new scan of each link every time, to see whether it is still active or has been removed. To get maximum effectiveness from a link audit, we can't ignore real-time analysis.

It can also report links that no longer exist or that webmasters removed in the past. As we will see, it is important to consider these links as well, in order to notify Google of their removal; otherwise, it would not find out about them quickly enough. Link Detox also lets the user choose whether to carry out the analysis considering NoFollow links as a potential threat or not. There's an endless debate between those who argue that NoFollow links can't constitute a danger and those who disagree. There is even research conducted in 2014 on the risk of NoFollows. Christoph Cemper, the founder of LRT, has already expressed his opinion about it, but leaves users to decide how to treat NoFollows in their own analyses.

I will not dwell further on the NoFollow/DoFollow debate, but let me offer one consideration. If you believe that NoFollow links don't represent a problem, you could simply ignore all of them in any link audit. However, if you were wrong, or if Google were to change its mind in the future, the remaining NoFollow links could pose a serious threat to the website you audited.

In the opposite case, suppose you believe NoFollow links are a problem, so you evaluate and weight them just as you would any DoFollow link. If you were wrong and Google did not consider them, what would happen? Most likely, the website wouldn't face any problem: it would not lose rankings or be subjected to risks. The only disadvantage is that you would have done extra work considering all the NoFollow links without any real advantage.

It should also be considered that, by enabling the evaluation of NoFollow links, the risk profile the tools calculate for a website is almost always greater than the assessment without NoFollows. For this reason, if you can reduce the overall risk profile to an average market value with NoFollow evaluation active, the website will probably be safe from algorithmic penalties (regardless of whether Google assigns any risk to NoFollow links).

That's why, in link audits, I recommend enabling NoFollow evaluation in almost all cases. Are there exceptions? Yes, there are, but they would require a separate article. Now let's continue with our link audit.
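To see why including NoFollows can only push an additive risk estimate upward, never downward, consider this toy score. It is purely illustrative and is not the Link Detox algorithm; the field names and values are invented.

```python
def total_risk(links, include_nofollow=True):
    """Sum per-link risk, optionally skipping NoFollow links.
    With an additive score, including NoFollows can never lower the total."""
    return sum(link["risk"] for link in links
               if include_nofollow or not link["nofollow"])

links = [
    {"risk": 900, "nofollow": False},   # risky DoFollow link
    {"risk": 400, "nofollow": True},    # risky NoFollow link
]
print(total_risk(links, include_nofollow=True))   # 1300
print(total_risk(links, include_nofollow=False))  # 900
```

If the stricter total (NoFollows included) already sits below the market's average risk, the assessment without them necessarily does too, which is the safety margin argued for above.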

After generating a complete list of our website's backlinks and filtering out duplicates with Excel or any other tool, we load it into the DTOX analysis program using its upload function.

If we have already created a Disavow File for the same website in the past, we can add it with the second button shown in the screenshot above and ask DTOX to treat those links as already disavowed.

The instrument will then start scanning all the inserted links in real time, adding the links that DTOX automatically finds from its 24 data sources and from any APIs you have provided (e.g. Ahrefs, SISTRIX). After some time, which can vary from 15 minutes to several hours depending on the number of loaded backlinks, the program returns the overall risk value of the website, with a graphic to support it.

Here is the value obtained for our client's website.

As you can easily notice, the higher the number, and the further the red bar extends across the semicircle, the greater the website's risk of penalties. In our case, the red bar completely covers the semicircle, which means that the penalization risk for our client is at the highest level.

Do you remember the considerations we made earlier about NoFollow links? Here is the calculation of the website's risk score without taking NoFollow links into account:

As already mentioned, the overall risk of a website tends to decrease when we ignore NoFollow inbound links. The risk is lower this way, but still at too dangerous a level. It is therefore necessary to intervene and take action as soon as possible.

During our initial analysis, we assumed that spam was very common in the Italian ads market; now that we are digging more closely into its dynamics, we can confirm it without any doubt.

Does that mean that all links pointing to the website in question are harmful? Of course not. For the website analyzed, DTOX recommended removing 50% of the links and carefully evaluating at least another 16% of the inbound links; finally, 34% of the total links were classified as low risk.

Link Detox is an amazing tool that can identify link patterns representing potential threats to a website, thanks to its sophisticated algorithms. In a link audit, however, no instrument can completely replace the evaluation of an SEO professional; every instrument's objective is to provide the SEO with as much data and supporting analysis as possible, but a professional must always make the final decision on whether or not to remove a certain link.

At this point, we realized that the risk associated with the website in question was very high: we had to act not only quickly, to eliminate threats as soon as possible, but also substantially. Bringing the link profile back to an acceptable risk (<700) is a long and delicate job: on the one hand, we should eliminate all the negative links; on the other, we need to be aware that a website without links will lose almost all of its organic rankings.

The best solution is always a balance between the two: one that realigns our client's link profile with the top competitors in the market and lowers the overall risk. For this purpose, human intervention is necessary, analyzing each link manually to evaluate whether to keep it or not.

However, DTOX helps here too, thanks to its Link Detox Screener, which allows the user to review, quickly yet thoroughly, every individual inbound link of the website being analyzed.

The Link Detox Screener not only provides a preview of the page on which the link appears but also some metrics that are decisive in deciding whether to keep or remove the link:

anchor text: the words from which the link originates

DTOXRISK: overall measurement of the link’s risk profile

sitewide links: the number of links on the linking domain that point to the analyzed website

rules triggered: the DTOX rules that were activated to classify the link as dangerous.

In the specific case of this link, DTOX tells us that the risk factor is high (2225), as the TOX3, SUSP4, and SUSP1 rules were triggered. Let us briefly see what this means:

SUSP4: "The SUSP4 means that the homepage of the website the link comes from does not rank for the title of the page, which usually indicates that the page or domain has been penalized". In this case, the title of the linking website's homepage is quite specific, as is its DA. All this suggests that the linking website has been subject to Google penalties. Let's check with Searchmetrics:

The graph is quite clear! DTOX was right: the linking domain most likely suffered a significant drop in organic traffic in the previous year.

SUSP1: "The SUSP1 rule means that the link is coming from a page on a very weak domain that has no external links. This is often the case for links from forums or when the link is coming from some special automated spamming activity or listing in a link directory." This rule essentially tells us that the linking website could be a directory, or at least a website with a very low DA. In this case too, DTOX is correct: the website we are examining is indeed a directory.

TOX3: "The TOX3 rule means that the Link Detox Genesis algorithm classified this link as very unnatural […]. If you agree with our opinion and the link risk estimation expressed in our calculated DTOXRISK score, you should disavow these links as quickly as possible, and then try to get them removed". This is DTOX's overall evaluation: at the end of all its mathematical analysis, it determines that the link in question is highly unnatural. Thanks to the DTOX Screener, we manually confirmed that the link was artificial (coming from a very low-profile directory), supporting DTOX's assessment.

At this point, we can confirm that the link in question is dangerous and must be included in the list of links to remove. The same process must be repeated for each of our customer's incoming links, to formulate an overall judgment for each one.

It's a long process, which often takes several days of work. It's necessary to evaluate not only the intrinsic characteristics of the individual links but also to take into account all the assessments of the market and the competitors developed in the first paragraphs of this analysis, mainly thanks to the Competitive Landscape Analyzer (CLA).

When evaluating thousands and thousands of inbound links, in fact, the judgment will not always be as simple and intuitive as in the case we have just analyzed. In many cases, it will be necessary to consider how the removal of a link may affect not only the overall risk profile of the website but also, negatively, the total organic traffic.

We have mentioned earlier that the safest option would be to remove a large number of links from the website of our client, but the inevitable consequence would be a considerable loss in ranking. In a competitive industry, a website with no links can hardly withstand the competition.

In our case, thanks to the CLA, we knew that during the audit we should mainly remove:

links with money keywords as anchor text (reducing the overall incidence of these links by around 15-20% of total links);

dangerous links pointing to the website's homepage (reducing their incidence by at least 10% of the total);

incoming links with very low LRT Power*Trust (less than 2).
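The three criteria above can be expressed as a simple pre-filter. This is a sketch under stated assumptions: the field names (`anchor_type`, `target`, `dtoxrisk`, `power_trust`) and the threshold are hypothetical illustrations, not actual LRT output fields.

```python
HOMEPAGE_RISK_THRESHOLD = 1000  # hypothetical DTOXRISK cutoff

def flag_for_removal(link):
    """Flag a link for removal review using the three CLA-derived criteria."""
    if link["anchor_type"] == "money":           # money-keyword anchor text
        return True
    if link["target"] == "homepage" and link["dtoxrisk"] > HOMEPAGE_RISK_THRESHOLD:
        return True                               # dangerous homepage link
    if link["power_trust"] < 2:                   # very weak source domain
        return True
    return False

candidates = [
    {"anchor_type": "money", "target": "internal", "dtoxrisk": 50,   "power_trust": 9},
    {"anchor_type": "brand", "target": "homepage", "dtoxrisk": 2225, "power_trust": 4},
    {"anchor_type": "brand", "target": "internal", "dtoxrisk": 10,   "power_trust": 8},
]
print([flag_for_removal(link) for link in candidates])  # [True, True, False]
```

Such a filter only nominates candidates for review; as explained next, the final keep-or-remove call remains a manual, case-by-case decision.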

Consequently, we opted to hold on to and NOT remove some links that would be judged risky if analyzed in isolation, but that did not constitute a serious threat within the overall profile of the website. This is the case, for example, of links to internal pages of the website with high LRT Power*Trust and branded anchor text. In these cases, even though the linking website could be a directory, we decided to keep the link.

This is why it is always advisable to perform a manual review of every single link before deciding whether to keep or remove it. The information DTOX provides is critical for an in-depth analysis, but the decision, as well as the resulting liability, remains with the SEO.

At the end of the manual analysis of the various links, we had a list of 746 links for which to request removal. Before proceeding further, we verified the theoretical risk coefficient of the website after the removal of the identified threats. Here is the result with NoFollow evaluation active:

The result is satisfactory. Despite not having removed all the potentially dangerous links, the overall risk coefficient was lowered significantly and returned to a safer area.

At this point, we can conclude that the analysis of the links to be removed was accomplished successfully, but the job is not finished yet.

Disavowing the bad links

We have identified the links to be removed; now we have to remove them! Probably many of you already know that you can create a file called "Disavow," with which you can request Google not to consider (for better
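For reference, the disavow file Google Search Console accepts is a plain text file: one full URL per line to disavow a single page, a `domain:` prefix to disavow an entire domain, and lines starting with `#` as comments. The entries below are invented examples:

```
# Links from this directory could not be removed despite outreach
domain:spammy-directory.example

# A single harmful page rather than the whole site
http://forum.example/thread?id=123
```

Disavowing a whole domain is usually the safer choice for directories and spam networks, since new pages on the same domain stay covered.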
