How Does Googlebot Find/Index Hidden FTP Logs?

Google's crawler constantly scours the Internet for pages to index, which is why it's important to keep pages you don't want indexed out of its reach. With a few directives, such as a robots.txt rule or a robots meta tag, you can stop Googlebot from indexing pages that are under construction or incomplete.
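As a quick sketch of what such a directive looks like, a robots.txt file at the site root can ask Googlebot to stay out of a directory (the `/under-construction/` path here is only an example, not from the thread):

```
# robots.txt — served from the site root
User-agent: Googlebot
Disallow: /under-construction/
```

Alternatively, a `<meta name="robots" content="noindex">` tag in an individual page's head asks search engines not to index that specific page. Note that robots.txt only requests that compliant crawlers stay away; it is not access control.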

A thread at WebMasterWorld Forums shows another example of pages you probably don't want indexed: your FTP logs. The member asks: "The first response is fairly obvious, indicating that all FTP log and other pages that you do not want indexed should be password protected, therefore making it impossible for the Googlebot to crawl. So knocking out links and assuming the pages are protected, could it still be possible for the Googlebot to find the URL and 'accidentally' index it?"

Another member confirmed that FTP logs do get crawled by Googlebot. So all FTP logs and other pages you don't want indexed should be password protected to keep Googlebot from crawling them. Also keep in mind that with the Google Toolbar and its PageRank feature enabled, URL data is sent to Google, which means Google can discover even unlinked pages.
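On an Apache server, the password protection the forum members recommend is typically done with HTTP Basic Authentication. A minimal sketch (the directory, realm name, and `.htpasswd` path are all assumptions for illustration):

```
# .htaccess placed in the directory holding the FTP logs
# The AuthUserFile path and realm name below are only examples
AuthType Basic
AuthName "Restricted Logs"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Unlike a robots.txt rule, this actually blocks access: a crawler that discovers the URL receives a 401 response instead of the log contents, so there is nothing for it to index.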

Navneet Kaushal

Navneet Kaushal is the founder and CEO of PageTraffic, an SEO Agency in India with offices in Chicago, Mumbai and London. A leading search strategist, Navneet helps clients maintain an edge in search engines and the online media. Navneet's expertise has established PageTraffic as one of the most awarded and successful search marketing agencies.

PageTraffic Buzz brings you the latest happenings from all corners of the digital marketing world. You will find fast news, in-depth analysis, helpful articles, how-to guides and a lot more right here, updated five times daily.