Site Relaunch Checklist

Senior Member from US

joined:Mar 30, 2005
posts:13010
votes: 222

Taking a client ecommerce site to an entirely new platform this summer (probably ready to launch next month). About 4000 urls, mostly products. 10,000+ user accounts. The old site was a hot mess of an ancient osCommerce system, filled with duplicate content and broken links, that nevertheless did pretty well for the core competencies until Mayday/Caffeine. I figured it would drop, and it did. I haven't spent any time on the old site because we've been working on putting this new one together all year.

This is what I've done so far:

Unique page titles and meta tags on *every flippin' page* (granted, some are better than others, but I focused on the most important stuff). Still have some dupe descriptions on products that are exceedingly similar except for one aspect (they have separate SKUs, so they can't be combined).

We now have completely new URLs with actual product names in them, so I set up a database that looks up the old product id and 301s it to the new URL for the product pages. Seems to work very fast. I manually mapped all the category and static pages, and they will be fed into the .htaccess. Since the old pages were served out of a /catalog directory, I *think* we can put the redirects there instead of in the root directory .htaccess.
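A minimal sketch of that old-id lookup, assuming the old osCommerce URLs carry the id in the query string (the mapping, paths, and function name here are hypothetical illustrations, not the poster's actual code):

```python
# Sketch: dispatch an old product URL to a 301 target looked up by id.
# OLD_TO_NEW would be loaded from the database in production; here it
# is a hypothetical in-memory mapping for illustration.
import re

OLD_TO_NEW = {
    "100": "/blue-shiny-widget/",
    "200": "/red-dull-widget/",
}

OLD_PATH = re.compile(r"^/catalog/product_info\.php$")


def redirect_for(path: str, query: str):
    """Return (301, new_path) for a known old product URL, else None."""
    if not OLD_PATH.match(path):
        return None
    # old-style URLs look like /catalog/product_info.php?products_id=100
    m = re.search(r"products_id=(\d+)", query)
    if m and m.group(1) in OLD_TO_NEW:
        return (301, OLD_TO_NEW[m.group(1)])
    return None
```

Anything that falls through (returns None) can then be handled by the manual category/static mappings or a 404.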

Put redirects in .htaccess to 301 www to non-www (this is a change, but the domain name is a tad long and we don't use the www in the catalog or other marketing pieces anymore anyway), 301 non-slash to trailing-slash URLs on pages without .html, and 301 all characters to lower case.
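For anyone following along, those three rules might look roughly like this (example.com is a placeholder; note that the `int:tolower` RewriteMap has to be declared in the server or vhost config — it can't be defined inside .htaccess, which may be related to the admin-breakage issue mentioned later in the thread):

```apache
# --- server/vhost config (NOT .htaccess) ---
RewriteMap lc int:tolower

# --- .htaccess ---
RewriteEngine On

# 301 www to non-www
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

# 301 any URL containing upper-case letters to its lower-case form
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ http://example.com/${lc:$1} [R=301,L]

# 301 to trailing slash for paths without a dot (skips .html, images, etc.)
RewriteCond %{REQUEST_URI} !/$
RewriteCond %{REQUEST_URI} !\.
RewriteRule ^(.*)$ http://example.com/$1/ [R=301,L]
```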

After a long struggle, the XML sitemap is looking pretty good - this particular shopping cart piled on category and sub category names, but we got that under control. Canonical url tags are installed.

The old site is going to be kept running (on a subdomain) for a few months - just in case - but walled off to the world.

The only thing I don't have and probably won't have before launch is our Google Shopping feed set up with the new urls. I hate to shut it off because it does send us sales. But it's probably better to do that than send it to the old urls and be redirected?

All the PPC campaigns have been duplicated with the new urls ready to go.

Client has been told to expect a big drop and a gradual build over time; they're a catalog company with a TON of repeat business, 15 year old category killer domain and healthy trade show / email / other marketing campaigns, so a search engine drop, while important, won't put 'em out of business.

What am I forgetting? I know there must be something.

I also should add a hearty thanks to WebmasterWorld and its members, because pretty much everything I ever learned about how to do this properly, I learned here. Big wet smooches all around.

Junior Member

joined:Dec 16, 2003
posts:79
votes: 0

Also watch your log files carefully for those 404 errors and 301 redirect them ASAP. A custom 404 error page will catch those people, but it's better to redirect them to their intended landing page.

Administrator

make sure you have the default directory index document 301 redirected to the trailing slash url. for example: http://example.com/category/index.html --> http://example.com/category/

check wildcard subdomains and make sure they resolve with either a 404 or a 301 as appropriate. if the server is on a dedicated IP, make sure requests for the IP redirect to the canonical domain. if there is a development domain, make sure it is behind basic authentication (401). make sure you don't have domain aliasing canonicalization issues - such as example.net/.org/.info
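The index-document and bare-IP redirects above could be sketched like this (hostname and IP are placeholders):

```apache
RewriteEngine On

# 301 /category/index.html to /category/
# (match THE_REQUEST so only client requests, not internal rewrites, trigger it)
RewriteCond %{THE_REQUEST} /index\.html[?\s]
RewriteRule ^(.*?)index\.html$ http://example.com/$1 [R=301,L]

# requests addressed to the bare IP get 301'd to the canonical hostname
RewriteCond %{HTTP_HOST} ^203\.0\.113\.10$
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]
```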

Senior Member from US

joined:Mar 30, 2005
posts:13010
votes: 222

Thanks all! Already have the domain aliasing stuff and the trailing slashes taken care of. Ran into a glitch with redirecting to lower case because it breaks the admin and the back end (we didn't write it, so we can't change it). IP is something I hadn't thought of - thanks for that. I'm working on a script that pulls the 404s out of the log and maybe mails them to me. I'll take a look for backlinks; this is a B2B ecommerce site that has some links to its home page, but not very many to internal product pages. I can check though.
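The core of a script like that — pulling 404s out of a combined-format access log so they can be reviewed (or mailed) and mapped to 301s — can be sketched as follows; the regex and function name are illustrative, not the poster's actual script:

```python
# Sketch: collect request paths that returned 404 from an Apache
# combined-format access log, counted by frequency so the most
# important redirects can be added first.
import re
from collections import Counter

LOG_LINE = re.compile(r'"(?:GET|HEAD|POST) (\S+) HTTP/[\d.]+" (\d{3})')


def find_404s(log_lines):
    """Return a Counter mapping request paths to their 404 hit counts."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m and m.group(2) == "404":
            hits[m.group(1)] += 1
    return hits
```

The resulting Counter could then be formatted and handed to smtplib, or simply dumped to a daily report.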

Senior Member from US

joined:Mar 30, 2005
posts:13010
votes: 222

Very likely. The only ZenCart sites I have left (meaning that I have clients running them, not me personally) only have a handful of products and pages in small niches, and they've dropped a bit but never had a lot of traffic to begin with. The OSCommerce sites dropped like a ton of bricks. That's why I'm trying to get my ducks in a row BEFORE we launch on the new platform.

Senior Member

joined:Dec 27, 2004
posts:1974
votes: 68

so set up a database that looks up the old product id and 301's it to the new product id for the product pages. Seems to work very fast

That is usually a big task, but one that matters a lot. What I usually implement is a new global include file that records every URL accessed into a DB, and then I do a Xenu run (or 2) on the site. After that, I export it into an Excel file (or not) and go over it with 3 toothbrushes in each hand, just to make sure that I got all the patterns.

We now have completely new URLs with actual product names in them

It helps. Not sure how you set up the NEW URLs, but there is one thing that I keep seeing over and over, where the rewrite pattern implemented on the new site goes like this:

/products_id/100/blue-shiny-widget/

Nothing is wrong with that at all, until Slurp comes in and tries to index:

/products_id/100/ or /products_id/100

and your site spits out the same content as /products_id/100/blue-shiny-widget/.

So if you have a similar new URI structure and haven't implemented that check, it should be on the list as well, I think.
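That check amounts to 301ing any request whose slug is missing or wrong to the single canonical slug URL. A sketch, with a hypothetical id-to-slug table standing in for the product database:

```python
# Sketch: canonicalize /products_id/<id>/<slug>/ requests.
# A truncated or wrong slug gets 301'd to the one canonical URL
# instead of serving duplicate content.
import re

CANONICAL_SLUGS = {100: "blue-shiny-widget"}  # hypothetical id -> slug table

PRODUCT_PATH = re.compile(r"^/products_id/(\d+)(?:/([^/]*))?/?$")


def canonicalize(path: str):
    """Return (canonical_path, needs_301)."""
    m = PRODUCT_PATH.match(path)
    if not m:
        return path, False
    slug = CANONICAL_SLUGS.get(int(m.group(1)))
    if slug is None:
        return path, False  # unknown id: let it 404 elsewhere
    canonical = f"/products_id/{m.group(1)}/{slug}/"
    return canonical, path != canonical
```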

Senior Member

joined:Apr 14, 2010
posts:3173
votes: 0

We now have completely new URLs with actual product names in them

Unless you 301 old pages to these new ones, they will all be considered new; in fact, your entire site will appear new, with no rankings whatsoever. It's a tough enough mountain to climb - I'd hate to start over at the bottom, netmeg.

Preferred Member from US

What am I forgetting? I know there must be something.

I know this is somewhat off-topic... but some of the best things I have ever done to avoid the BotShock are:

- Start communicating with your client base NOW, 30 days in ADVANCE of the change! Explain the coming changes, show screenshots, do ANYTHING you possibly can to prepare a full section of the site all about the new build and interface!

- Start preparing a series of press releases about the new site, "Who-What-When-Where-Why," and on the day of the new launch, start distributing them!

Senior Member

joined:May 26, 2000
posts:37301
votes: 0

One approach I've been very successful with when I use it is only to 301 redirect the important URLs - those with lots of search traffic, or direct landing pages, or strong backlinks. I just let the rest of the URLs go 404 and assume that Google can crawl the new site and find and rank that content just fine.

Caveat: I know that Matt Cutts has recommended against this approach and Google would prefer to see all the 301s for all migrated content. However, in practice I've found that a website full of 301 redirects seems to take much longer for Google to reprocess and trust-check than a site where just the cream of the crop gets a 301. YMMV - and if you take this approach it does require that you avoid as many technical errors on the site as you can - from day #1.

Junior Member

joined:July 13, 2010
posts:119
votes: 0

You should also do a stress test, because if all three major search engine bots come to visit your site and start to reindex it, it might take your server(s) offline. We've had this problem once. An SEO company recommended "blocking" Yahoo and MSN for the first few days and letting Google do its work first. The search engines had to reindex thousands of urls at a new domain, but with the same url structure.

Senior Member from US

joined:Mar 30, 2005
posts:13010
votes: 222

One approach I've been very successful with when I use it is only to 301 redirect the important URLs

I was thinking of something like that - there are a lot of category pages that never got any traction in the search engines anyway, plus we've reorganized some product lines. So it's probably more work than necessary to 301 them to the proper page. But I just hate to 404 stuff. I really really do. Seems like such a waste.

You should also do a stress test.

We're planning to do one for user load on the server anyway; that's an interesting idea on the bots. I will talk it over with my peeps.

Senior Member

joined:Mar 31, 2002
posts:25430
votes: 0

I manually mapped all the category and static pages, and they will be fed into the .htaccess. Since the old pages were served out of a /catalog directory, I *think* we can put the redirects there instead of in the root directory .htaccess.

Yes, you can, with adjustments to the rule patterns. However, make sure that no internal rewrites will be applied in the higher-level .htaccess file(s) or in any config files. If an internal rewrite is invoked before an external redirect, you will find that the internally-rewritten filepath will be 'exposed' as a URL to the client by the subsequent external redirect -- and in the case of search engine robot clients, that would be a very bad thing...

The rule is, starting with the server config files and proceeding down through all .htaccess files that will execute for any particular HTTP request, make sure that all redirects execute before any internal rewrites.

Where the RewriteRule patterns matching requested URL-paths are insufficient to guarantee this behaviour, it can be enforced by adding exclusions (negative-match RewriteConds) to those rules, by adding 'skip rules' (as previously mentioned by phranque) ahead of those rules, or by moving all redirects to the highest-level .htaccess or config file where any internal rewrites are invoked (and putting these relocated redirects ahead of the internal rewrite rules in that file).
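In .htaccess terms, the ordering jdMorgan describes looks roughly like this (the specific patterns and hostname are placeholders):

```apache
RewriteEngine On

# 1) ALL external redirects first...
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]

RewriteRule ^catalog/old-page\.html$ http://example.com/new-page/ [R=301,L]

# 2) ...and only then the internal rewrites, so an internally-rewritten
# filepath can never be exposed by a later external redirect
RewriteRule ^([a-z0-9-]+)/$ product.php?slug=$1 [L,QSA]
```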

Use a server headers checker to test URLs which have multiple problems, such as an obsolete /catalog URL with a non-canonical hostname and a couple of casing errors. Make sure that the server responds with a single 301 redirect to the final, correct URL. While it may not be feasible to ensure this for *all* possible old URLs, make sure that it happens for all of the most important ones.

Junior Member

joined:July 24, 2009
posts:169
votes: 4

If you can find the time, I'd get the Google Product feed updated pre-launch if it drives any significant amount of traffic. We recently changed our site from a bunch of subdomains back to just a single (www) subdomain, and it required updating the feed. We sent the new feed the day of launch and had no drop in traffic. In the past, when the feed has been shut off for a time and then restarted, it seemed to take longer to regain the traffic. Having said that, our feed builds dynamically, so we didn't need to do much to autogenerate it with the new URLs, and for us it was worth doing as it drives a fairly significant % of our overall traffic.

Senior Member

joined:May 26, 2000
posts:37301
votes: 0

What do people think about the OLD sitemap vs the NEW sitemap in tools? Do you just do a 301 and then notify them?

At launch time I would overwrite the old sitemap with the new - but using the same URL. I would point to that URL in robots.txt, and also ping Google directly through Webmaster Tools. But I'd say it's a bad idea to give googlebot an XML Sitemap that includes any URL that 301 redirects. So that's another good relaunch checklist item: make sure the XML Sitemap is being generated correctly.
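A sanity check along those lines can be sketched as a filter over the sitemap's URLs, flagging anything that would itself trigger a 301 under the canonical rules discussed earlier in the thread (the hostname and the specific rules are assumptions for illustration):

```python
# Sketch: flag sitemap URLs that violate the canonical URL rules
# (non-www host, lower-case path, trailing slash on extensionless paths),
# since a sitemap shouldn't contain any URL that 301 redirects.
from urllib.parse import urlparse


def noncanonical_urls(urls, host="example.com"):
    """Return the subset of sitemap URLs that break a canonical rule."""
    bad = []
    for url in urls:
        p = urlparse(url)
        path = p.path
        missing_slash = not path.endswith("/") and "." not in path.rsplit("/", 1)[-1]
        if p.netloc != host or path != path.lower() or missing_slash:
            bad.append(url)
    return bad
```

Run against the generated sitemap at launch, an empty result means every listed URL should answer 200 rather than redirect.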

Senior Member

In the case of that thread where jdMorgan posted, he was not directly talking about a sitemap - just making sure that googlebot saw one link to the IP address somewhere.

I've read a few SEO bloggers who recommend a "put the 301 URL in the sitemap" approach - but I have also read Google saying that they don't want to see any redirects in the sitemap. I guess you can take your choice. For me, I think a sitemap essentially says "these are my good URLs". On a large dynamic site, that can already be a challenge to generate cleanly.