2014-06-13

http://tfmainsights.com/detect-respo...wsletter130614

Having established the availability of negative SEO and the risks it presents to those who rely on natural search engine exposure in the first article of this three-part series, we went on to detail the architecture of a negative SEO attack via link accrual and social manipulation in the second piece.

This naturally brings us to the question of how to detect and respond to a negative SEO attack, and whether you can take steps to prevent such an occurrence.

Detecting a negative SEO attack

In the event that you weren’t on the lookout for the traits of a negative SEO attack (i.e., monitoring the metrics and/or checking webmaster tools for alerts), the first indication of such an attack will be a drop in ranks or removal from the SERPs.

If you’re not monitoring the ranks and traffic statistics daily, there’s a good chance senior management will be aware of the drop-off before you are. It’s a great idea to initially set up daily rank alerts across the money terms, followed by the long tail on a less frequent basis.

The attacker will most commonly target the money terms to inflict the most pain, so these are the exposure points. If you experience a loss of traffic from all of the terms that you have previously ranked for, the chances are that it is far too late to solve the issue quickly, so being alert and on guard is imperative.
Vigilance will save you the embarrassment of hearing the bad news from another team member first!
Responding to a negative SEO attack

The severity and impact of each negative SEO attack can vary greatly. Analysis of the rank reports you or your agency create will help you to accurately assess the scope of the attack, evaluate its impact and respond accordingly.

The ideal response to an attack involves the following approach:
Step 1: Connect your domain(s) to Google Webmaster Tools. If you haven’t already connected Google Webmaster Tools to your domain(s), do so immediately. Log in to your account and click “Webmaster Tools Preferences.” Enable email notifications, choose to receive alerts for all types of issues, then click “Save.”
Step 2: Check for Duplicate Content. As explained in the previous article, a common practice deployed by the attacker is the manipulation of content that resides on your domain. Targeted scraping of your domain can grab recent content that you have published, which is then seeded across many other sites. If this content is picked up by the search engines before your actual content, you will be penalised and lose rankings.

It’s wise to perform an in-depth site audit of the domain to reveal whether you actually do have internal duplicate content, as this will definitely add to the problem (it’s not uncommon for duplicate or legacy pages to exist on large sites), but a tool like Copyscape.com is extremely useful when detecting the wider problem across the web.
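
If you want a quick first pass before reaching for a dedicated tool, the following minimal Python sketch compares the text of one of your pages with a suspected copy using the standard library’s difflib; the URLs and the similarity threshold are placeholders, not recommendations.

    # Minimal sketch: compare your page with a suspected scraped copy.
    # The URLs below are placeholders; a dedicated tool such as Copyscape
    # remains the more thorough option for web-wide detection.
    import difflib
    import re
    import urllib.request

    def fetch_text(url):
        """Fetch a page and crudely strip tags to get comparable text."""
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        text = re.sub(r"<script.*?</script>|<style.*?</style>", " ", html, flags=re.S | re.I)
        text = re.sub(r"<[^>]+>", " ", text)        # drop remaining tags
        return re.sub(r"\s+", " ", text).strip()    # normalise whitespace

    original = fetch_text("https://www.example.com/blog/my-new-article")  # your page
    suspect = fetch_text("https://scraper-site.example/copied-article")   # suspected copy

    ratio = difflib.SequenceMatcher(None, original, suspect).ratio()
    print(f"Similarity: {ratio:.0%}")
    if ratio > 0.6:  # the threshold is a judgement call
        print("High overlap - investigate and consider a removal request.")
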
Step 3: Analyse Your Social Media Exposure. As also covered in the previous article, the attacker could be out there creating fake social accounts in your name. Early detection of these accounts is critical, as they can be reported as spam and closed down before they accrue any serious followers.

There are many tools out there to do this and a review of them will help you pick the right one for your business. They allow you to set alerts around your brand, company name, individual names, handles etc. so the minute someone mentions these on the main social platforms or websites linked to these social platforms, an alert will be triggered for your attention.
Step 4: Monitor the Load Times and Speed of the Website. Closer investigation into the analytics and web logs will tell you if your site is becoming unreasonably slow due to excessive requests; this form of attack is referred to as denial of service. Additionally, a careful look over the web logs could uncover the footprint of repetitive visits from the same IP and/or IP ranges calling the same pages (URLs), which would indicate a scraper taking your content; this is well worth nipping in the bud by deploying anti-scraping technology.
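
As a starting point for that log review, here is a minimal Python sketch that assumes an Apache/Nginx combined-format access log at a hypothetical path; it counts requests per IP and per IP-and-URL pair to surface possible scrapers, and the path and thresholds are assumptions to adjust for your own environment.

    # Minimal sketch: flag IPs that hit the same URLs repeatedly in an
    # Apache/Nginx combined-format access log. The log path and the
    # thresholds are assumptions - adjust them to your own environment.
    import re
    from collections import Counter

    LOG_FILE = "/var/log/apache2/access.log"  # hypothetical path
    line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

    hits_per_ip = Counter()
    hits_per_ip_url = Counter()

    with open(LOG_FILE, encoding="utf-8", errors="ignore") as log:
        for line in log:
            m = line_re.match(line)
            if not m:
                continue
            ip, url = m.group(1), m.group(2)
            hits_per_ip[ip] += 1
            hits_per_ip_url[(ip, url)] += 1

    print("Busiest IPs (possible DoS or scraper activity):")
    for ip, count in hits_per_ip.most_common(10):
        print(f"  {ip}: {count} requests")

    print("Same IP requesting the same URL repeatedly (possible scraper):")
    for (ip, url), count in hits_per_ip_url.most_common(10):
        if count > 100:  # the threshold is a judgement call
            print(f"  {ip} -> {url}: {count} requests")
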
Step 5: Review your Bounce Rates. Analysis of the current bounce rates in comparison to the historical data in your analytics package is critical, as excessive bounce rates could signify an attack using a technique we refer to as anti-click.

This is where a script is created to perform a query in the search engines from a unique browser, click your domain and then go back to the search results, mimicking a user with no real interest in your domain. The search engines interpret this as your domain being less relevant than others in the same SERPs, and excessive returns to the search engine results pages will cause a drop in rank relatively quickly.

Align the historical logs with the dates of any major changes you have implemented on the site to accurately determine whether any shifts have been caused by external factors or by bad internal decisions.
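
To make that comparison concrete, the following minimal Python sketch assumes a CSV export from your analytics package with hypothetical “date” and “bounce_rate” columns, and flags days where the bounce rate sits well above the historical baseline.

    # Minimal sketch: compare recent bounce rates to the historical average
    # from an analytics CSV export. The file name and the column names
    # ("date", "bounce_rate") are assumptions about your export format.
    import csv
    import statistics

    rows = []
    with open("bounce_rates.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rows.append((row["date"], float(row["bounce_rate"])))

    rows.sort()                                 # oldest first
    history = [rate for _, rate in rows[:-14]]  # everything before the last 14 days
    recent = rows[-14:]                         # the last 14 days

    baseline = statistics.mean(history)
    spread = statistics.pstdev(history)

    for date, rate in recent:
        if rate > baseline + 2 * spread:        # the threshold is a judgement call
            print(f"{date}: bounce rate {rate:.1f}% vs baseline {baseline:.1f}% - investigate")
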
Step 6: Review your SEO Activity to Date. The last thing you need is to have done harm to the domain via your own activities or those of your chosen agencies. Reviewing SEO activity to date is of course something most people would expect to be done on a weekly basis, but these checks can fall by the wayside once trusted relationships with the responsible parties have been established, which can lead to a less vigilant state of mind.

Quickly check the following to cover your back:-

EMAT: Ensure that the links built to date do not appear orchestrated. EMAT (Exact Match Anchor Text) in large proportions is a big no-no, so ensure that you or your partners have used a blend of terms.

Link Accrual: Make sure that you haven’t used one particular methodology of link accrual excessively. The best option is to use a blended approach, often by tasking each specialist with a single method, as this reduces confusion and mistakes.

Affiliates: Check the redirect status of all your affiliate partners and make sure you haven’t got an affiliate programme using 200 status code (direct) links to your main domain. It’s common for smaller, less technically advanced companies to set up an affiliate campaign with static referral links to the main domain and then, as the company grows, migrate to a fully-fledged affiliate platform using 302 redirects via a tracking domain (or subdomain), thus losing all of the previous links that used to point to them. Also scan the affiliates to ensure that they aren’t using duplicate content, as this is also very common.
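
A quick way to spot-check that redirect status is sketched below in Python; the affiliate referral URLs are placeholders, and the script simply reports whether each one answers with a direct 200 or a 301/302 redirect.

    # Minimal sketch: check whether affiliate referral URLs answer with a
    # direct 200 or a 301/302 redirect. The URLs below are placeholders.
    import urllib.error
    import urllib.request

    AFFILIATE_URLS = [
        "https://affiliate-one.example/go/your-brand",
        "https://affiliate-two.example/out?dest=yourdomain",
    ]

    class NoRedirect(urllib.request.HTTPRedirectHandler):
        def redirect_request(self, req, fp, code, msg, headers, newurl):
            return None  # do not follow redirects, so the 3xx status stays visible

    opener = urllib.request.build_opener(NoRedirect)

    for url in AFFILIATE_URLS:
        try:
            resp = opener.open(url, timeout=10)
            status, location = resp.getcode(), ""
        except urllib.error.HTTPError as err:  # 3xx responses land here when not followed
            status, location = err.code, err.headers.get("Location", "")
        print(f"{status}  {url}  ->  {location}")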

Step 7: Fortify the Relationship with your Best Current Link Partners. Loss of credible and useful link partners is a tactic deployed not just by the negative SEO attacker; it is also very common in competitive landscapes where your competitors are fishing in the same pond as you.

Ensure that you have a strong relationship that will stand the test of time by:

Knowing your best link partners and reviewing the reports you receive from your internal or external teams. SEO managers who review the rankings each day but never open the reports are being reactive rather than proactive, and are not fully armed to stand guard over the domain they have been tasked with protecting.

Once a relationship has been created by you or your agency partners, make sure you offer the new link partner a virtual handshake by emailing them to introduce yourself or a team member as their point of contact. This is not only good practice but also protects you from someone pretending to work for your company making contact with them. Should they receive a link removal request from an attacker, they have someone to come to!

In an ideal world all of the above points would have been covered already as part of the overall SEO strategy, but we all know that this isn’t always the case. We understand that all of this sounds like a lot of work, but it can, in fact, be automated relatively easily with the right advice and support.
Step 8: Back Link Analysis. Take a feed from all link sources that matter (Ahrefs, Majestic and SEO Moz) and additionally blend in all of the data from the webmaster tools daily. All of that data then needs to be run against a series of predetermined and editable criteria to identify the best link partners and equally assess any present risk to the domain.

This data, combined with the disavow logs, presents an interesting story from which you can draw many conclusions.
One point we must make at this stage is that not all low-quality inbound links (backlinks) are actually bad; they can occur naturally and provide cover for a lot of your other link accrual activities.
The ultimate goal is to build a shortlist of links and domains that you may wish to fortify the relationship with by making contact, and equally shortlist those that you wish to disavow.

The criteria you set for disavowal need to be clear and concise at all times, and applied not just to the links Google reports to you (as it never offers a full report) but across all of the links that you can locate from the other link sources.
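
As a rough illustration of running a blended feed against editable criteria, here is a minimal Python sketch; the CSV name, its columns (source_url, source_domain, anchor_text) and the criteria themselves are assumptions you would replace with your own.

    # Minimal sketch: run a merged backlink export against simple, editable
    # criteria. The CSV name and its columns (source_url, source_domain,
    # anchor_text) are assumptions about your own blended feed.
    import csv

    MONEY_TERMS = {"cheap widgets", "buy widgets online"}  # your exact-match terms
    RISKY_WORDS = {"viagra", "casino", "payday"}           # off-topic spam signals

    def assess(link):
        flags = []
        anchor = link["anchor_text"].lower()
        if anchor in MONEY_TERMS:
            flags.append("exact-match anchor")
        if any(word in anchor for word in RISKY_WORDS):
            flags.append("spam anchor")
        if link["source_domain"].endswith((".xyz", ".info")):  # example criterion only
            flags.append("low-trust TLD")
        return flags

    with open("all_backlinks.csv", newline="", encoding="utf-8") as f:
        for link in csv.DictReader(f):
            flags = assess(link)
            if flags:
                print(f'{link["source_url"]}  ->  {", ".join(flags)}')
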
Index status

Should you have actually come under attack via the backlink profile, the first step is to have the ability to assess the index status of each inbound link and prioritise your investigation on the links that have appeared within the three months prior to any loss of rank and/or exposure. These links are the most likely culprits. That being said, all links need to be examined; this is just a good starting point. Being able to segregate the domains that deliver these links into categories will highlight directories, blogs and blatantly paid-for links.
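
A minimal Python sketch of that prioritisation is shown below; the rank-loss date, field names and example links are placeholders, and most backlink exports provide a comparable “first seen” date.

    # Minimal sketch: prioritise links first seen in the three months before
    # the rank drop. The dates and field names are assumptions about your export.
    from datetime import date, timedelta

    RANK_LOSS_DATE = date(2014, 5, 1)                    # when the drop was noticed
    window_start = RANK_LOSS_DATE - timedelta(days=90)

    links = [
        {"url": "http://spammy-directory.example/page1", "first_seen": date(2014, 4, 20)},
        {"url": "http://old-partner.example/resources", "first_seen": date(2012, 1, 5)},
    ]

    priority = [l for l in links
                if window_start <= l["first_seen"] <= RANK_LOSS_DATE]

    for link in priority:
        print("Investigate first:", link["url"], link["first_seen"])
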
Article directories

Most domains will naturally have links from article directories; they are simply par for the course of running a website. Sometimes you create them in the initial stage of the launch and other times they are naturally accrued. It’s important to note that not all of these are bad; some can actually be useful at times.
Paid links

It’s true that paid-for links are proscribed, but they can only be proven to be paid for if the website identifies them on the page as a paid-for or sponsored link; otherwise one can only assume that a link is paid for if it has a well-formed and orchestrated format. The way to commence the analysis of paid links is to have access to the rendered source code of the linking URL and run an algorithm to detect the words “paid” and “sponsored” in the specific areas of the page where the links exist.

Matching the presence of these words against the link itself highlights a need to manually review these URLs and address each instance with a disavow. You can also easily spot paid links by ordering the URLs that contain the links by Google PageRank™, as you will very rarely get the benefit of a high-PageRank™ URL for no investment, unless you were the topic of a newsworthy story and the major sites of that genre felt compelled to cover it.
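
The detection step described above could be sketched roughly as follows in Python; the linking URLs, the keywords and the character window are all assumptions, and any hits still need a manual review.

    # Minimal sketch: flag linking pages that mention "sponsored" or "paid"
    # close to the link pointing at your domain. The linking URLs and the
    # character window are placeholders/assumptions.
    import re
    import urllib.request

    YOUR_DOMAIN = "yourdomain.com"
    LINKING_URLS = ["http://some-blog.example/great-resources"]  # from your backlink feed
    WINDOW = 300  # characters either side of the link to inspect

    for url in LINKING_URLS:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
        for match in re.finditer(re.escape(YOUR_DOMAIN), html, re.I):
            start = max(0, match.start() - WINDOW)
            context = html[start:match.end() + WINDOW].lower()
            if "sponsored" in context or "paid" in context:
                print(f"Manual review needed: {url} (possible paid/sponsored link)")
                break
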
Cache status

The next simple step is to assess the cache status of each inbound link. This data needs to be cross-referenced against any edit dates found in the linking URL, and the age of the domain that the URL belongs to is equally important.
The age of the domain is not solely signified by the initial registration date of the domain but also by the date that the domain underwent a name server transfer, as it could have changed hands for a multitude of reasons, none of which you are aware of just yet.
If the linking URL is new, not cached, but contains a satisfactory number of internal and external links with unique readable content, then there’s a good chance that in time it will be cached; Google just hasn’t got round to it yet. However, if the domain that the URL belongs to is not cached, is over six months old and the content looks shabby and weak with multiple contextual links from the homepage, it would be worth manually reviewing and considering a disavow request to pre-empt any future damage.

This initial analysis will of course turn up a lot of links that have been cached. This means that Google has found them, logged them and knows they link to you. These links need to undergo the same assessment at the domain level, reviewing the age of the domain, cache dates and edit dates (if available).

Retain focus on all of the non-cached and cached URLs by seeing if the referral domains themselves are ranking for their own domain name and the exact-match title found in their title tag. URLs that don’t rank for their own name and the exact match of the title should be considered a red flag, and may have encountered a penalty some time back themselves (if the transfer dates are also recent, this may have had something to do with it). There are a few exceptions to this rule; if the title tag is a very generic term like Home or News it won’t rank, although having a single-word generic title tag is likely to indicate that the site is low quality and probably of little use anyway.
Exact Match Anchor Text

Scan over the entire anchor text profile and look for excessive Exact Match Anchor Text (EMAT) and, if it’s not your industry, keep your eyes peeled for adult and pharmaceutical anchor text. Many negative SEO professionals will look to increase the EMAT to 60% or more, and this is likely to do some substantial damage.
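
A minimal Python sketch of that anchor-text scan is shown below; the money terms, spam words and example anchors are placeholders, and the 60% figure is only the warning threshold mentioned above.

    # Minimal sketch: measure the Exact Match Anchor Text (EMAT) proportion
    # across an anchor-text list and flag off-topic spam anchors. The money
    # terms, spam words and anchors shown are placeholders.
    from collections import Counter

    MONEY_TERMS = {"cheap widgets", "buy widgets online"}
    SPAM_WORDS = {"viagra", "casino", "porn", "payday"}

    anchors = ["cheap widgets", "Acme Widgets", "click here",
               "cheap widgets", "www.yourdomain.com", "viagra online"]

    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    emat = sum(c for a, c in counts.items() if a in MONEY_TERMS)

    print(f"EMAT: {emat / total:.0%} of {total} anchors")  # 60%+ is a major warning sign
    for anchor, count in counts.items():
        if any(word in anchor for word in SPAM_WORDS):
            print(f"Spam anchor found: '{anchor}' x{count}")
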
Capturing the source code

Next, feed all the inbound link sources into a crawler that is capable of capturing the entire source code of the linking page and all of the WhoIs records for each domain that delivers a link to you. If that sounds a little excessive, keep in mind that the major search engines have awesome processing power, and amongst other metrics Google investigates CSS modules, WhoIs records and similar snippets of code to detect similarities in inbound links. It’s best to identify these similarities before the search engines do.

Having the rendered source code of the linking pages also means that you can easily search for links going to competitors, check inbound and external link ratios on the referral URLs, and so on; if the data is updated weekly, the insight you can leverage is invaluable.
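
A very small Python sketch of such a crawler is shown below; it fetches and stores the source of each linking page and lists its external links for later analysis, while WhoIs capture is assumed to be handled by a separate lookup tool.

    # Minimal sketch: capture the source of each linking page and list its
    # external links for later analysis (competitor links, link ratios etc.).
    # The linking URLs are placeholders; WhoIs capture is not shown here.
    import re
    import urllib.request
    from urllib.parse import urlparse

    LINKING_URLS = ["http://some-blog.example/great-resources"]  # from your backlink feed

    for url in LINKING_URLS:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")

        # Store the raw source so week-on-week changes can be diffed later.
        filename = re.sub(r"[^a-z0-9]+", "_", url.lower()) + ".html"
        with open(filename, "w", encoding="utf-8") as out:
            out.write(html)

        page_host = urlparse(url).netloc
        hrefs = re.findall(r'href=["\'](https?://[^"\']+)', html, re.I)
        external = [h for h in hrefs if urlparse(h).netloc != page_host]
        print(f"{url}: {len(hrefs)} links, {len(external)} external")
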
Removing and/or Disavowing Bad Inbound Links

I won’t get into a debate of the exact metrics you need to maintain across all of the elements I’ve covered as these vary greatly and human opinion is hard to sway, but by now we are getting to the final list of links that will at some point need to be chosen for removal or a disavow request.

When you get to the final list of the links that you would like to dissociate yourself from, the best first step is to make contact with the webmaster directly, using a company email address (NOT a Gmail, Live or Yahoo account), requesting that the link be removed with a simple explanation as to why. Find the email address of the website owner from the WhoIs records or the contact data found with the domain. Make sure you give them all of the data (a simple template is sketched after this list):-

Location of current link

Anchor Text

Destination URL

Link age (if you have it)

Never ask them to change the anchor text or link to a competitor.
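
A minimal mail-merge sketch in Python is shown below; the link record and names are placeholders, and sending is deliberately left to your own mail client so each message can still be personalised.

    # Minimal sketch: build a removal-request email from the data fields
    # listed above. The link record and names are placeholders; sending is
    # left to your own mail client so each message can still be personalised.
    from string import Template

    TEMPLATE = Template("""Hello,

    We have found a link to our site on your page and would like it removed,
    as it was not placed by us and may harm both of our sites:

      Page with the link: $location
      Anchor text:        $anchor
      Destination URL:    $destination
      Link first seen:    $age

    Thank you for your help.

    $your_name, $your_company
    """)

    link = {
        "location": "http://some-blog.example/great-resources",
        "anchor": "cheap widgets",
        "destination": "https://www.yourdomain.com/widgets",
        "age": "approx. March 2014",
        "your_name": "Jane Doe",
        "your_company": "Your Company Ltd",
    }

    print(TEMPLATE.substitute(link))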

This process is known to be very time-consuming if you lack automated technical capabilities to do most of the heavy lifting for you and your team, and the time lag between emailing them and getting the link removed can be up to several months. Most site owners will comply with your request, some will ask for payment to remove the link and others will just ignore the request.

The best advice we can give is to make every effort to send a tailored email per request so that it doesn’t feel impersonal and clinical. Sending your request as a PDF on the letterhead of your solicitor is likely to carry much more weight as it is an official form of communication.

In the event that the offending link is not removed by the webmaster, or the bounty to remove the link is far too high and you feel blackmailed, you can contact the hosting company of the linking website and request that they remove the spammy links. This isn’t the easiest approach, and although most hosting companies will appear to offer you their support, often nothing much happens except in extreme cases.

After making contact with the relevant webmasters and waiting for a period of three weeks, you will of course be left with a number of links still in place, and this gives you no other option than to use the Google and Bing disavow tools. Whilst both disavow tools log the offending links as no longer relevant and remove them from their calculations when assessing domain trust, authority and ultimately rank in the SERPs, Google also offers the option of requesting that links from a specific domain be ignored and never used in their calculations again.
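
If you maintain your shortlist in a script, a minimal Python sketch for writing a disavow file in the format Google’s disavow tool accepts (one URL per line, “domain:” to cover a whole domain, “#” for comments) might look like this; the listed links and domains are placeholders.

    # Minimal sketch: write a disavow file in the format the Google disavow
    # tool accepts (one URL per line, "domain:" to cover a whole domain,
    # "#" for comments). The shortlisted links and domains are placeholders.
    bad_urls = ["http://spammy-directory.example/page1"]
    bad_domains = ["link-farm.example", "scraper-site.example"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Disavow request - negative SEO clean-up, June 2014\n")
        f.write("# Removal requests were sent to these webmasters first.\n")
        for url in bad_urls:
            f.write(url + "\n")
        for domain in bad_domains:
            f.write(f"domain:{domain}\n")
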
Reporting the Attack

If the removal and disavowal of links fails to solve the issue and you have substantial proof that you have come under attack from negative SEO, each of the search engines does have policies protecting webmasters from such attacks. However, you need to be fast off the mark when telling them, and offer as much evidence as possible to support your case.

As a matter of due diligence you should file an official complaint if all else fails, and the search engines may be able to assist further via direct intervention. If you haven’t emailed the webmasters of the offending links asking for removal and made use of the disavow tools, the search engines will ignore your request.
How Long Until I Recover?

It’s going to take anywhere from a few weeks to a few months to get back to where you used to be. This isn’t because Google is being awkward; it is down to the fact that each of the search engines has to re-crawl the locations of the URLs that used to link to you and no longer do, as well as those that have been disavowed, in order to assess the overall quality of your backlink profile. This can be expedited with a fast indexation tool, but these need to be used with caution in case you trigger any other penalties.
Preventing a negative SEO attack

Companies that rely on search exposure to drive revenue will undoubtedly agree that prevention is better than cure. As such, total visibility is imperative; an accurate backlink audit would in many cases prevent the loss of rank and traffic from a negative SEO attack by enabling the webmaster to spot unorthodox link accrual early on, disavow the offending links and document the progress of this process to gather supporting evidence.

Unfortunately, most people aren’t expecting this type of attack and are far too busy with the other tasks that form part of their role, and as such require some form of automated support that acts as an early warning radar system (EWRS).

Apart from disavowing links you are not responsible for, there’s another way to protect yourself from the offending domains that offer you bad links, and it acts as extra security. To implement this solution, you will require access to the .htaccess file found in the website’s main directory. You can create an HTTP_REFERER list of domains that you don’t want to have anything to do with and paste it into your .htaccess file, denying access to all visits referred from bad domains or subdomains. Should a search engine crawl those links, the bot will be given an error (usually access denied) and the link will appear broken. If the search engine bots can’t follow the link to a destination page, it is not a valid link.
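
As an illustration only, a minimal .htaccess sketch of this kind of rule set is shown below, assuming an Apache server with mod_rewrite enabled; the domains are placeholders and any such rules should be tested carefully before going live.

    # Minimal .htaccess sketch (Apache + mod_rewrite assumed): return 403 to
    # any request referred from the listed bad domains. Domains are placeholders.
    RewriteEngine On
    RewriteCond %{HTTP_REFERER} badlinkfarm\.example [NC,OR]
    RewriteCond %{HTTP_REFERER} scraper-site\.example [NC]
    RewriteRule .* - [F]
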
Conclusion

Negative SEO is out there, and it is relatively easy for someone with minimal resources to destroy what a team of many has worked hard to build. The best things you can do to fight it are:-

Keep your site secure

Follow the rules

Produce good unique content

Review your links very regularly

Monitor the load of the domain

Review your affiliates

Monitor social activity

Manage your Bounce Rates

If you have experienced any of the symptoms of negative SEO or would simply like to share your opinions please comment below.

http://tfmainsights.com/detect-respo...ive-seo-attack
