I must point out that if you accidentally deindex pages using <meta name="robots" content="noindex">, it can take several weeks to recover rankings (I've seen it happen on a client's e-commerce site). It will also affect other search engines, not only Google.
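As a quick sanity check for this kind of accident, you can scan a page's HTML for a robots meta directive before (or after) a deploy. A minimal sketch using only Python's stdlib html.parser (the sample HTML below is illustrative, not from a real site):

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of any <meta name="robots" ...> tag on the page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            self.directives.append((attrs.get("content") or "").lower())

def is_noindexed(html: str) -> bool:
    """True if the page carries a robots meta directive containing 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

# A page accidentally shipped with the directive:
page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
print(is_noindexed(page))  # True
```

In practice you would feed this the HTML fetched from your staging or production URLs, not a literal string.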

Another important difference is that the URL removal tool is mainly meant to quickly hide content from users: it's an emergency measure and it's not permanent. After 90 days the content becomes available in search again, unless it is blocked by robots.txt.

I think the greatest shift is in the primary focus of the activity. For too many years SEOs put the search engine(s) at the center of their strategy. Becoming inbound marketers simply means establishing a new focus: the user is at the center of our strategy. This is a major paradigm shift, and by accepting it we will be able to explore new paths.

Great post, as I said on Twitter. I would like to share some horror stories: I've worked with some international agencies on SEO projects for Italian customers and, except for one case, it has been a bloodbath. I wish I could name names without being sued, but I can't. I have personally experienced:

- English-based keyword research translated using Google Translate
- My SEO copywriting "validated" through the use of Google Translate
- Machine-made link building for Italian keywords using spun English texts (hello Penguin!)

And those scumbag SEO agencies report millions of dollars of revenue each year :-/

The requested resource resides temporarily under a different URI. Since the redirection might be altered on occasion, the client SHOULD continue to use the Request-URI for future requests. This response is only cacheable if indicated by a Cache-Control or Expires header field.

The temporary URI SHOULD be given by the Location field in the response. Unless the request method was HEAD, the entity of the response SHOULD contain a short hypertext note with a hyperlink to the new URI(s).

If the 302 status code is received in response to a request other than GET or HEAD, the user agent MUST NOT automatically redirect the request unless it can be confirmed by the user, since this might change the conditions under which the request was issued.
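The quoted behaviour is easy to see in practice: a 302 carries a Location header, and clients follow it for GET while (per the spec) continuing to use the original URI in future. A minimal sketch with Python's stdlib http.server and urllib (the /old and /new paths are made up for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

class TemporaryRedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":
            # 302: temporary redirect -- clients SHOULD keep using /old later
            self.send_response(302)
            self.send_header("Location", "/new")
            self.end_headers()
            # short hypertext note with a link, as the spec recommends
            self.wfile.write(b'<a href="/new">moved temporarily</a>')
        else:
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"new content")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), TemporaryRedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# urllib follows the 302 automatically for a GET request
body = urllib.request.urlopen(f"http://127.0.0.1:{port}/old").read()
print(body)  # b'new content'
server.shutdown()
```

Note that urllib resolves the relative Location against the request URI before following it, which is why "/new" works here.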

While the 503 HTTP status is fine, I would not recommend any kind of redirection after it: I did some 302 redirections on a few projects and noticed erratic behaviour from Google, like losing rankings for many keywords, which were recovered only after some weeks.
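For the maintenance scenario above, the plain-503 approach (no redirect) usually also includes a Retry-After header to tell crawlers when to come back. A minimal sketch with Python's stdlib, not a production setup; the one-hour value is just an example:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.error
import urllib.request

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answers every request with 503 + Retry-After and no redirect."""
    def do_GET(self):
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # ask clients to retry in an hour
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance, back soon.")

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), MaintenanceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/")
except urllib.error.HTTPError as err:
    status, retry_after = err.code, err.headers.get("Retry-After")
print(status, retry_after)  # 503 3600
server.shutdown()
```

Because the original URL keeps answering (just with 503), rankings are left alone until the site is back, which is the point of avoiding the redirect.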

It may seem trivial, but to my customers it would mean the difference between a prolonged decrease in business and a temporary glitch.

Today I saw a 64-bit version of Xenu which should, in theory, be able to address more memory.

9. What is the maximum number of URLs that can be checked?

There is no fixed number, but it seems to be above one million. The limiting factor is that 32-bit Windows applications (such as on Windows XP) are restricted to 2 GB of address space.

A 64-bit beta version is available which may or may not allow more URLs. It is based on Microsoft Visual Studio 2010. (Rename the xenu.exe that you already have installed.) The 64-bit EXE file is much bigger than the 32-bit version. I suspect I am making some configuration mistake with Visual C++ 2010 and that it is packing a ton of library code into the EXE that isn't needed.