2015-09-24

Relaunching your website can have serious consequences for your search engine optimization if not planned carefully. Luckily, columnist Stephan Spencer has provided a comprehensive guide to ensure a smooth transition.



Has your company outgrown its current website? If so, congratulations on this important rite of passage!
Now, it’s critical to make the revamped website bulletproof from an SEO perspective.
With all the chaos and the huge task list that comes with a site relaunch, it’s easy to drop the ball on the site’s SEO and thus seriously jeopardize the link authority and rankings you had built up over time. The importance of planning out an elegant transition cannot be overstated.
But what exactly is the plan? Well, that depends on the type of relaunch upon which you are embarking.

New Skin?
If the scope of your redesign is limited to a visual refresh with few technical changes, then your relaunch is in little danger of experiencing major SEO snags. However, most redesigns also involve changes to information architecture, navigation, URLs and so forth.
If that’s the case for you, you’ll need to conduct the right SEO checkups to ensure the search engine spiders will be able to traverse your site effectively and find all of your content.

Completely New URL?
Unfortunately, a domain name change can be far more complex in terms of SEO. These complexities can often result in a loss of traffic.
Some of it simply has to do with user habits. Searchers can exhibit domain bias, the propensity to click on results from domains they already recognize and trust. If searchers are unfamiliar with your new domain, they may ignore its listing.
Google also tends to be slower to trust new domains. Even if the links 301 redirected from the old site carry high authority, and the site continues to signal its relevance, the return to high rankings can be delayed. Google considers overall domain age and accumulated link equity when ranking a site, and moving to a new domain affects both of these metrics.
If you deem it necessary to relaunch your site on a new domain, tread carefully. Ask yourself if changing to the new domain is worth the potential short- to medium-term disruption in search traffic.

New Server?
Most of the process of transferring to a new server can be handled in a couple of steps: update your DNS records, and/or go to your domain's registrar and change your name servers.
Make sure your server settings and directives are handled correctly. Sometimes, if both the new and the old server run on IIS, the new server can be configured in a way that transforms all 301 redirects into 302 redirects. The way to prevent this? Double-check.
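One way to double-check is to request a handful of old URLs with redirects disabled and confirm the server answers with a 301 rather than a 302. Here is a minimal sketch in Python, assuming the requests library; the URLs are placeholders for your own:

```
# Minimal sketch: verify old URLs answer with a 301, not a 302.
# Assumes the `requests` library; the URLs below are placeholders.
import requests

old_urls = [
    "http://www.oldurl.com/page1",
    "http://www.oldurl.com/page2",
]

for url in old_urls:
    response = requests.head(url, allow_redirects=False, timeout=10)
    location = response.headers.get("Location", "(none)")
    if response.status_code == 301:
        print("OK  ", url, "->", location)
    else:
        print("WARN", url, "returned", response.status_code, "expected 301")
```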
Also, make sure all legacy code that was ported over is still functional on the new site. For example, if your new server doesn’t have Perl and associated libraries installed, you’re going to have a problem with any functionality requiring Perl.
Whether you are simply refreshing the look of your site or rebuilding it from the ground up on a new domain with new code, new content and new design elements, there are several things you can do to ensure that your relaunch goes smoothly.

Before The Rebuild

Crawl All Your URLs

If you want to successfully move your content to new URLs, you need to know what you have to move. Crawling your pages can be more complex than site owners initially realize, especially when site navigation is complicated or the site has a very large number of pages.
That’s when a good site crawl tool is a godsend. I recommend Screaming Frog SEO Spider. This tool will crawl your site and give you a list of every single URL it finds, as well as a huge range of other information about those URLs, like title tags, canonical tags, meta-robots tags and much more.
At this point, it’s helpful to also run a check of all of your externally linked pages on Moz’s Open Site Explorer or Google Search Console to make sure you include each page that is receiving link authority in your content move. Essentially, what you are looking for is any page with one or more external links pointing to it.
If you are building new URLs, you want to make sure that each of these pages gets redirected to its corresponding new URL to ensure link authority is retained.
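If your link data comes as a CSV export, this check is easy to script. Below is a minimal sketch that lists every page with at least one external link; the column names are assumptions and vary by tool, so adjust them to match your actual export:

```
# Minimal sketch: list every URL with at least one external link from
# a link-data CSV export. The column names ("URL", "External Links")
# are assumptions; adjust them to match your tool's export.
import csv

with open("link_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        if int(row["External Links"] or 0) > 0:
            print(row["URL"])
```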

Map The Moves

Certain systems require you to do each page redirect manually, which can be a huge amount of work if you have a large site with hundreds of thousands of pages.
Apache Web server users have it far easier: Apache's mod_rewrite module lets you redirect every URL to its corresponding URL on the new domain with a single rewrite rule.
For Microsoft IIS, the highly useful plugin called ISAPI Rewrite performs a function similar to Apache’s mod_rewrite.
No matter the system, creating a spreadsheet of where each current URL is going to be redirected will be incredibly helpful in keeping track that each piece of content is moved correctly. The main goal here is to ensure the link authority from old URLs will successfully transfer to new URLs.
When doing this myself, I use data from a Screaming Frog site crawl to get a comprehensive list of URLs, then determine the closest matching URL on the new site. It is time-consuming but by far the most reliable method.
When moving a ton of URLs that generally follow a pattern (e.g., www.oldurl.com/page1 to www.newurl.com/page1), you may want to create rules that rely on wildcards and regular expressions, rather than mapping individual URLs one by one in .htaccess or another server configuration file.
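For the pattern above, one mod_rewrite rule can cover the whole domain. Here is a minimal .htaccess sketch using the placeholder domains from the example; treat it as a starting point rather than a drop-in rule:

```
# Minimal sketch: 301-redirect every path on the old domain to the
# same path on the new domain (placeholder domains from the example).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?oldurl\.com$ [NC]
RewriteRule ^(.*)$ http://www.newurl.com/$1 [R=301,L]
```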
One thing many site owners don't consider is the need to update existing redirects along with these new ones. Failing to do so results in "chains" of redirects, and chains are costly: each extra hop can dilute link equity, and crawlers may stop following a chain after a handful of jumps.
For example, a site may already have a redirect from page a > page b. If the new URL is page c, then not only should page b redirect to page c, but page a's redirect should also be updated to point directly to page c.
You want to avoid situations like page a > page b > page c > … > page z. While building this redirect map, it’s also a good time to check the site and update any URLs that currently 302 redirect when they should 301, and also to create 301 redirects for URLs that currently result in 404 errors. You might as well clean all this up at once.
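To catch chains before launch, you can walk each old URL and count the hops it takes. A minimal sketch in Python, assuming the requests library; the URL list is a placeholder:

```
# Minimal sketch: follow each URL's redirects and flag chains with more
# than one hop. Assumes the `requests` library; the URL is a placeholder.
import requests

urls = ["http://www.oldurl.com/page-a"]

for url in urls:
    response = requests.get(url, timeout=10)  # follows redirects by default
    hops = [r.url for r in response.history] + [response.url]
    label = "CHAIN" if len(response.history) > 1 else "OK"
    print(label, " -> ".join(hops))
```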
This map of old to new URLs will also allow you to transfer over elements from your pages that are currently driving organic traffic, such as title tags. If one of your pages is ranking and driving significant quantities of search traffic, you will probably want to keep its title tag the same, at least for a while, until the dust settles.

Audit Your Site

I recommend that an SEO expert be involved in your site rebuild from the very beginning, so that any problems can be found at the earliest possible stage.
Unaddressed SEO problems that are inherent in your current site may very well carry over to the new site. For my clients, not only do I conduct a comprehensive audit, but I also provide SEO guidance at every stage of the redesign process — the content plan, the functional requirements, wireframes, mockups, development site, staging site, launch and post-launch.
Getting an SEO practitioner involved at the planning, information architecture, wireframing, design and coding stages will ensure the site gets built right the first time and help you avoid last-minute re-work and patch-up jobs.

During The Redesign

Meta Robots Tag Or Robots.txt Directive
Make sure Google isn't going to flag your development site for duplicate content or index pages you aren't ready to have indexed. There are two potential ways to accomplish this.
You can “noindex” the pages on your development site, which means, simply, you are instructing the search engines to not include the pages in their database of search results. This can be done by adding a meta-robots tag with the value of “noindex” to all of your pages. For those running WordPress, this is accomplished with the tick of a box in your WordPress admin.
Alternatively, you can use Disallow directives in the robots.txt file. Technically speaking, you could also issue a noindex directive there, but only Googlebot recognizes it, and even Google treats it as unofficial and unsupported.
Which one do I recommend? Without question, the former. That is how you keep pages out of the index in both Google and Bing. Disallow directives only stop the spiders from visiting; they don't tell the engines to de-index pages that were already discovered through other means.
It’s important to note that a meta robots noindex tag won’t be seen by the spiders if a robots.txt Disallow is in place, preventing the spiders from accessing the pages containing the meta robots tag. So remove the Disallow to allow Googlebot and Bingbot in to see the noindex tag.
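Concretely, the two mechanisms look like this. First, the meta robots tag, which goes in the head of every page on the development site:

```
<!-- in the <head> of every page on the development site -->
<meta name="robots" content="noindex">
```

And the robots.txt alternative, which blocks crawling but does not de-index pages already discovered (and, as noted above, hides the noindex tag from the spiders):

```
# development-site robots.txt: blocks all compliant crawlers
User-agent: *
Disallow: /
```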
Failing to block your development site from the search engines can allow it to be indexed. In many cases, particularly if all you are doing is a design update, the engines will add duplicate versions of all of your pages to their indexes from your development and/or staging sites. Even if you block those sites after they have been crawled, removing those URLs from the indices can be either a long waiting game or a very laborious process.

Consider A Phased Approach

This advice comes from Google webspam guru Matt Cutts himself. Instead of making one huge switch all at once, try moving a subdomain, a subdirectory or a subcategory first, and make sure that transition works.
Moving in smaller chunks decreases the likelihood of a spectacular failure. Move one section, wait for Google to crawl it, and see how it goes. Then take on the next section.

Update Your Sitemap(s)

This seems simple enough, but it can be easy to forget. New URLs need to be added to the XML sitemap, and old URLs need to get phased out over time. Also consider multiple XML sitemaps for different content types to keep things organized. For example, have a separate XML sitemap for images.
Remember to add the location of your sitemap (or sitemap index, if you have multiple sitemaps) to both your robots.txt file and to Google Search Console, particularly if the URL of your sitemap (or sitemaps) will be changing with the redesign or relaunch.
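For illustration, here is a minimal sketch of a sitemap index referencing separate page and image sitemaps, along with the robots.txt line that points to it. The file names and domain are placeholders:

```
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- placeholder locations; one child sitemap per content type -->
  <sitemap><loc>http://www.newurl.com/sitemap-pages.xml</loc></sitemap>
  <sitemap><loc>http://www.newurl.com/sitemap-images.xml</loc></sitemap>
</sitemapindex>
```

And in robots.txt:

```
Sitemap: http://www.newurl.com/sitemap-index.xml
```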

Determine Landing Pages For Organic Traffic
You will want to ensure that you have a list of all of the URLs on your current site that are currently driving organic traffic. You can find this by going to Google Analytics, to the “Behavior” tab, then “Site Content,” then “Landing Pages.”
Add a secondary dimension by clicking “Advanced,” then “Medium” to create an advanced filter that includes the medium containing “organic.” This gives a list of all the pages that currently get organic traffic.
Is there an equivalent page on the new site? Is there a 301 redirect plan implemented here? This is a good system of checks to ensure you’re not losing valuable traffic and link equity. Make sure you have a plan in place for every page on your site that currently drives organic traffic.

Launch With A Marketing Blitz
Getting fresh, high-authority links to pages on your new site can improve Google’s ability to quickly sort out your transition. Obtaining these links requires marketing effort; this influx of new links won’t just happen on its own. Plan your link-building campaign early on in the redesign process, not at the end, once you need the links.
Some ideas to consider: create a contest that ties into the relaunch or otherwise gamify the relaunch; get your influential industry friends to count down to the new site on social media or in emails to their lists; email your own list to notify them of the countdown to launch; offer sneak peeks; throw a site launch party; and so forth. Make sure you get the word out!

Audit The New Site

Make sure you’re starting off on a solid foundation with the technical SEO elements firmly in place prior to launch by conducting a technical audit of the staging site.
When you think the site is 100 percent ready to migrate to production, get an SEO expert to do a full technical audit of the new site. It's worth the investment to ensure you aren't setting yourself up for problems after you're already up and running, and it can save you lost revenue.
I have audited sites in the past where the site owner contacted me literally days before relaunch, only for me to find massive SEO problems on the new site. The relaunch had to be delayed.
If you aren’t well-versed in SEO, you definitely need to get an experienced SEO practitioner involved in your rebuild.
Inevitably, some of the issues found will be medium-priority or "nice to have," but you need to ensure there are no show-stoppers before you press the launch button.
Ultimately, the management team for the site will need to determine whether to push back the launch or to address these problems after the new site is up and running.

After The Redesign/Relaunch

Contact Your Linking Webmasters About Links

Run Open Site Explorer to find the top 100 most-linked-to URLs on your old site, and contact the webmasters linking to them to encourage them to update their links. Although your 301 redirects will pass the majority of the link authority, direct links to the new site are ideal.

Know That The Change Isn’t Going To Be Accepted Immediately
Especially if you are making changes to a large number of pages and URLs, it is going to take time for Google to catch on and for your rankings to return to where they were. Accept it, and know that you aren't necessarily doing anything wrong. But just in case, ensure you haven't made any big mistakes by running the following checks.

Watch For 404 Errors
See how Google handles all of your 301s to make sure they are correctly implemented everywhere. If a 404 does pop up, quickly correct it with a 301. Keep in mind that Google Search Console can take a couple of days to catch up to current data. If you have the ability, also have your system administrator monitor your Web servers in real time for 404 errors.
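If your administrator prefers a script over manual log review, even a simple scan of the access log for 404 responses helps. A minimal sketch in Python, assuming the common/combined log format, where the status code follows the quoted request line; the log path is a placeholder:

```
# Minimal sketch: scan a Web server access log for 404 responses.
# Assumes the common/combined log format; the log path is a placeholder.
import re

pattern = re.compile(r'"[A-Z]+ (\S+) [^"]*" 404 ')

with open("access.log") as log:
    for line in log:
        match = pattern.search(line)
        if match:
            print("404:", match.group(1))
```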

Run A Safety Check Of Your Important Files

Did you forget to remove the noindex tags from your development/staging site upon launch? Did you accidentally copy over the dev site's robots.txt file with a sitewide Disallow?
Check your analytics for pages that are getting views but no search referral traffic, a possible sign they are still blocked or noindexed. And check the status of your meta noindex tags and robots.txt.
It should go without saying, but also make sure the pages you do need blocked are adequately blocked (noindex is preferable to Disallow).
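Both checks are easy to script. Here is a minimal sketch in Python that fetches the live robots.txt and a few key pages, flagging a leftover sitewide Disallow or noindex tag; the requests library, site domain and page paths are all assumptions:

```
# Minimal sketch: post-launch check for leftover development-site blocks.
# Assumes the `requests` library; the site and page paths are placeholders.
import requests

site = "http://www.newurl.com"

robots = requests.get(site + "/robots.txt", timeout=10).text
if "Disallow: /" in robots.splitlines():
    print("WARN: sitewide Disallow found in robots.txt")

for path in ["/", "/important-page"]:
    html = requests.get(site + path, timeout=10).text.lower()
    if 'name="robots"' in html and "noindex" in html:
        print("WARN: possible noindex tag on", path)
```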

Check Your 301 Redirects

If you have a manageable number of redirects from old URLs to new, check all of them to ensure they were done correctly. If you have a huge number of redirects, at least check your most important pages: the former URLs that had the highest number of incoming links and/or the most incoming organic traffic.
This is actually reasonably easy to do. Again, I use Screaming Frog for this. You can enter URLs manually or upload a list of URLs from a spreadsheet and have it crawl just those URLs in list mode. It will give you a report of what type of redirect (if any) each URL goes through and what page it ultimately lands on.
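Feeding your redirect map back through a script is another way to verify final destinations. A minimal sketch in Python, assuming a two-column CSV of old and new URLs and the requests library:

```
# Minimal sketch: confirm each old URL ultimately lands on its mapped new
# URL. Assumes a CSV with "old" and "new" columns and the `requests` library.
import csv
import requests

with open("redirect_map.csv", newline="") as f:
    for row in csv.DictReader(f):
        final = requests.get(row["old"], timeout=10).url
        status = "OK" if final == row["new"] else "MISMATCH"
        print(status, row["old"], "->", final)
```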

Have A Rollback Plan

If things go haywire, it is nice to know that you have a solid plan to fall back on. Have a rollback plan ready to execute so you can switch back to your old site if things just aren't working out, even if only temporarily.
It is far better to roll back than to scramble because you hadn’t prepared a contingency plan.

Final Thoughts
Site relaunches happen — and frankly, they should happen. Regular updates keep your site delivering its best user experience and allow businesses to maintain relevance, update branding and more.
However, a relaunch doesn’t have to be painful. Google understands that redesigns are a regular part of the evolution of a website, and Google wants to help you to be successful with the relaunch.
With the right organization, planning and auditing, you can make the transition quickly and painlessly and be able to realize the benefits you initially wanted for the site relaunch: better user experience, better branding, updated look, improved functionality, and ideally, even better organic traffic.
Source: Search Engine Land
