If you’re going to move your site to a new domain or transition eCommerce platforms, proactively avoiding SEO missteps will ensure the transition doesn’t hurt revenue. You’ll also keep, if not improve, your keyword rankings, which directly correlate with the amount of traffic your site brings in. An improper transition that ignores SEO can mean starting over from scratch: your online revenue may drop to zero overnight. Scary, isn’t it?
Luckily, we’re going to go over exactly what it takes to keep that from happening to you. Consider the following seven SEO obligations you have during your site transition:
1. Make Sure Your New Site is Crawlable
Your new site is going to have better aesthetics than your old one. However, the prettier, new version of your website might not be search engine friendly. Because search engines lack our human eyes, they have to read the code that’s underneath the surface to determine what’s on the page. This is called crawling.
Find Out If Your New Website Is Crawlable
You can easily test whether your new website is crawlable. I like to test sites in Firefox by disabling JavaScript and CSS (the Web Developer extension makes this easy), which leaves roughly the bare content that a search engine robot reads.
Now browse your new site and see if there are major differences in the content you’re seeing. You’ll need to fix any major discrepancies that you discover.
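If you want to go beyond eyeballing it in the browser, the same idea can be scripted: strip the tags out of the raw HTML and see what text is actually there for a crawler to read. The sketch below uses only Python's standard library; the page is a made-up example where the product list is injected by JavaScript, so it never appears in the raw HTML a crawler downloads.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the text a crawler can read from raw HTML,
    skipping the contents of <script> and <style> blocks."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.parts.append(data.strip())

# Hypothetical page: the product list is written by JavaScript,
# so the raw HTML a robot downloads never contains it.
raw_html = """
<html><head><title>Shoes | Example Store</title>
<script>document.write('<ul><li>Running Shoe</li></ul>');</script>
</head><body><h1>Shoes</h1></body></html>
"""

parser = TextExtractor()
parser.feed(raw_html)
print(parser.parts)  # the crawler sees the title and heading, not the products
```

If the text your script extracts is missing whole sections of the page, the search engines are missing them too.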
2. Set Up 301 Redirects
Use 301 redirects to permanently redirect the URLs of your old site to your new URLs. A 301 redirect tells the search engines that the pages of your site have permanently moved. This involves doing page-by-page redirects. The benefit of a 301 redirect over other forms of redirection is that almost all of the link juice passes to the new URL.
Although it’s tempting to use a single redirect for routing all pages on your old site to the new site, it’s better to take the time to do page-to-page redirects. As well as giving your visitors a more consistent experience, this helps keep your rankings with the search engines. If a page on your old site has no exact match on the new site, redirect it to a new page with similar content.
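On an Apache server, page-to-page redirects are often set up in the site’s .htaccess file. This is just a sketch, assuming Apache with mod_alias and mod_rewrite enabled; the paths and domains are hypothetical, and other servers (nginx, IIS) or your eCommerce platform’s own redirect manager use different syntax:

```apache
# Page-to-page 301 redirects from old URLs to their new equivalents
Redirect 301 /old-category/old-product.html /new-category/new-product
Redirect 301 /about-us.html /about

# If the site is also changing domains, send any remaining URLs
# to the same path on the new domain as a catch-all
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ http://www.new-example.com/$1 [R=301,L]
```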
Once you’ve implemented your 301 redirects, use a tool like Live HTTP Headers for Firefox to check that they’re working properly.
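If you’d rather script the check than click through a browser extension, a few lines of Python can confirm that an old URL answers with a 301 and the right Location header. So that the sketch is self-contained, it spins up a throwaway local server standing in for the old site; in practice you would point check_redirect at your real old URLs instead.

```python
import http.client
import http.server
import threading

# Hypothetical old-URL -> new-URL mapping, standing in for the real site
REDIRECTS = {"/old-product.html": "/new-product"}

class OldSiteHandler(http.server.BaseHTTPRequestHandler):
    """Mimics a site that 301-redirects its old pages."""
    def do_HEAD(self):
        if self.path in REDIRECTS:
            self.send_response(301)
            self.send_header("Location", REDIRECTS[self.path])
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = http.server.HTTPServer(("localhost", 0), OldSiteHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

def check_redirect(host, port, path):
    """Return (status, Location header) without following the redirect."""
    conn = http.client.HTTPConnection(host, port)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    return resp.status, resp.getheader("Location")

status, location = check_redirect("localhost", server.server_address[1],
                                  "/old-product.html")
print(status, location)  # expect: 301 /new-product
server.shutdown()
```

Run a loop over your full old-URL list and flag anything that doesn’t come back as a 301 pointing where you expect.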
3. Use a Proper Robots.txt file
The robots.txt file is a text file that sits in your website’s root folder. When a search engine robot wants to visit your site, it first checks the robots.txt file for any special instructions. These instructions tell the search engine robot which pages on the site it can and cannot visit, which in turn affects which of your web pages show up in the search index.
Many times, if your new website is still in development on a live test server, your developers will have a robots.txt file that disallows the search engine robots from indexing any of the new site. Be careful as you push your site live: we’ve often seen the test server’s robots.txt file pushed live too. This will prevent the search engines from crawling your new site altogether, and website traffic will be lost.
While you want to block a test site completely, on the live site you typically only want to selectively block certain directories. With most eCommerce platforms, that means directories not meant to be seen by the public, such as file exports and image caches, which you don’t want the search engines indexing.
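You can sanity-check a robots.txt file before and after go-live with Python’s built-in urllib.robotparser. The directory names below are hypothetical, but the pattern matches the selective blocking described above: private directories blocked, the public catalog left open.

```python
from urllib.robotparser import RobotFileParser

# A selective robots.txt for a live store: block private directories
# (hypothetical names), leave the public catalog crawlable.
robots_txt = """\
User-agent: *
Disallow: /export/
Disallow: /var/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "/export/price-feed.csv"))  # False: blocked
print(rp.can_fetch("Googlebot", "/category/shoes"))         # True: crawlable

# The classic go-live mistake: the test server's file blocks everything.
rp_test = RobotFileParser()
rp_test.parse(["User-agent: *", "Disallow: /"])
print(rp_test.can_fetch("Googlebot", "/category/shoes"))    # False: site-wide block
```

If that last check ever returns False on your production domain, you’ve shipped the test server’s robots.txt.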
4. Inform Google of your Site Move
If your site is moving domains, this is particularly useful. In Google Webmaster Tools (you have a Webmaster Tools account, don’t you?) you can directly inform Google that your site is being moved by using the Change of Address tool. To use this tool, you must first be a verified owner of both the old and new sites.
5. Create a Sitemap
Theoretically, if you have a perfectly crawlable website, Sitemaps are not needed. However, since it’s not safe to assume your new website is perfectly crawlable, Sitemaps are a way to tell the search engines about pages they might miss.
In Magento, creating a sitemap is native functionality. They can be created by visiting your Magento Admin and going to System > Configuration > Google Sitemap. Make sure that you’ve activated Magento’s Cron service as this will ensure your sitemap is updated on a regular basis as you add new CMS, product, and category pages.
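If your platform doesn’t generate sitemaps for you, the XML format is simple enough to produce yourself. Here’s a minimal sketch using Python’s standard library; the URLs and dates are placeholders, and in practice you’d pull the page list from your CMS or product catalog:

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)

# Hypothetical pages; in practice, pull these from your CMS or catalog
pages = [
    ("http://www.example.com/", "2013-06-01"),
    ("http://www.example.com/category/shoes", "2013-06-01"),
    ("http://www.example.com/product/running-shoe", "2013-05-28"),
]

urlset = ET.Element(f"{{{NS}}}urlset")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = loc
    ET.SubElement(url, f"{{{NS}}}lastmod").text = lastmod

sitemap = ET.tostring(urlset, encoding="unicode")
print(sitemap)
```

Save the output as sitemap.xml in your site root and submit it through Webmaster Tools so the search engines know where to find it.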
6. Update all links to your site if the URLs have changed
If you have control of any links pointing to your site, change them to the new URLs. Yes, your 301 redirects will catch these, but internal links, where one page of your site links to another, should point directly at the new URLs. This makes sure the link juice is flowing properly.
For example, if an old blog post points to a now-defunct URL, updating that link will help search engines understand your site better.
In addition to changes you make on your site, you will also want to make as many changes as possible to external pages that link back to your site. Often, you will not have control over these, so it may require reaching out to the other sites and notifying them of the change. Remember that 301 redirects pass almost all of the link juice, so only invest the time and effort if you feel these links are especially important and the extra link juice will make a difference. For example, if a site with a high PageRank links back to your site, it may be beneficial to have the link changed.
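Finding every internal link that still points at the old domain is tedious by hand, but a short script can do the looking. The sketch below scans a single page’s HTML with Python’s standard library; old-example.com is a stand-in for your old domain, and in practice you’d run this over every page of the new site:

```python
from html.parser import HTMLParser

OLD_DOMAIN = "old-example.com"  # hypothetical old domain

class OldLinkFinder(HTMLParser):
    """Collects href values that still point at the old domain."""
    def __init__(self):
        super().__init__()
        self.stale = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and OLD_DOMAIN in value:
                    self.stale.append(value)

# Hypothetical page with one outdated internal link
page = """
<p>See our <a href="http://old-example.com/blog/launch-post">launch post</a>
and the <a href="http://www.new-example.com/about">about page</a>.</p>
"""

finder = OldLinkFinder()
finder.feed(page)
print(finder.stale)  # links that should be updated to the new URLs
```

Anything the script reports is a link you should repoint at its new URL rather than leaving it to ride through a 301.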
7. Properly Use Canonical Tags and Meta Robots
Canonical tags and meta robots, which include noindex and nofollow tags, help with the proper indexing of your site with search engines.
Canonical tags are especially important for eCommerce sites, where pages often have multiple URLs. Certain categories, subcategories, and product pages may be accessed through more than one URL. When that happens, it’s left to the search engine to guess which URL is the important one. It’s much easier to tell the search engine which page matters most. That’s what canonical tags do.
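A canonical tag is a single line in the page’s head. For example, if the same product page answers at several URLs, each variant would carry the same tag pointing at the preferred URL (the URL here is a placeholder):

```html
<!-- In the <head> of every URL variant of the page -->
<link rel="canonical" href="http://www.example.com/category/running-shoe" />
```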
Meta robots are like the robots.txt file, but at the page level. While the robots.txt file blocks certain directories, meta robots block certain pages. If you want to block just one page in a directory that you haven’t blocked in robots.txt, you can do that with meta robots.
Even if you’ve blocked a certain directory in robots.txt, its URLs may still show up in search results, even though the content isn’t indexed. With a noindex tag, the URL won’t show up at all. With a nofollow tag, the links on the page won’t pass link juice.
Let’s say you’re managing an eCommerce site and you have a dealer page for your business to business customers. You wouldn’t want this private page to show up in search results. A noindex tag will remove the page from search engines so that consumers don’t inadvertently find it. You might also want to use a nofollow tag to further ensure the privacy of any pages you have specifically created for your B2B customers.
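The meta robots tag also lives in the page’s head. For the hypothetical dealer page above, combining both directives keeps the page out of the index and stops its links from passing juice:

```html
<!-- In the <head> of the private B2B dealer page -->
<meta name="robots" content="noindex, nofollow" />
```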
My New Site is Up, Now What?
After you’ve completed your seven SEO obligations, you’ll need to monitor your site transition using a combination of Analytics and Google Webmaster Tools. Your Analytics can quickly show whether your visitors have been affected by the transition. You’ll specifically want to pay attention to organic traffic numbers, a metric you’ll want to keep track of over time.
Because search engines can take some time to crawl both your new and old site, you may see a dip in traffic at first, but you can expect traffic to return to normal if everything was completed successfully. Monitoring Google Webmaster Tools will also expose 404 errors; Sitemap, meta description, and title tag problems; and changes in your site’s speed.
When all’s said and done, paying attention to SEO when transitioning websites can increase organic traffic and provide additional revenue. On the flip side, it can be very difficult to correct any mistakes that were overlooked, and just like any changes made in SEO, recovering can be a lengthy process.