Not sure whether this is the right section to post this question, but I was wondering whether you could help with stopping Google spiders etc. from crawling and indexing a particular folder on my site. I searched around and found this method.

User-agent: *
Disallow: /foldername/

Is this the best way to do this? Do I just upload the file to the root folder?

felgall
—
2012-10-05T02:39:43Z —
#2

Legitimate spiders generally look for and obey the contents of the robots.txt file in the root folder. Bad bots may also read the file but will not obey its instructions.
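For example, a robots.txt placed at the site root (e.g. http://example.com/robots.txt) blocking one folder for all crawlers, plus a rule aimed at a single named crawler as an illustration, might look like this (the folder names here are placeholders, not from the original post):

```text
# Applies to every crawler that honours robots.txt
User-agent: *
Disallow: /private/

# Rules can also target one crawler by name
User-agent: Googlebot
Disallow: /drafts/
```

Note that each `User-agent` group is read independently, so a crawler uses the most specific group that matches it.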

milkandhoney
—
2012-10-05T03:16:35Z —
#3

felgall said:

Legitimate spiders generally look for and obey the contents of the robots.txt file in the root folder. Bad bots may also read the file but will not obey its instructions.

Thanks felgall!

So, basically, this is the best I can do: just minimise the amount of indexing?

ralphm
—
2012-10-05T04:53:26Z —
#4

You could also consider spanking those naughty bots by creating a black hole for them, as described here:
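The "black hole" idea usually works like this: list a decoy URL in robots.txt as Disallowed (and never link to it visibly), then ban any client that fetches it anyway, since a well-behaved crawler will never request a disallowed path. A minimal sketch of that logic in Python, under the assumption that your server code can see the request path and client IP before serving a page (the names `BotTrap` and `/secret-trap/` are hypothetical, not from the linked article):

```python
# Sketch of a robots.txt "black hole" bot trap.
# Assumption: /secret-trap/ is listed as Disallowed in robots.txt,
# so only bots that ignore robots.txt ever request it.

class BotTrap:
    def __init__(self, trap_path="/secret-trap/"):
        self.trap_path = trap_path
        self.banned_ips = set()

    def handle_request(self, ip, path):
        """Return True if the request should be served, False if blocked."""
        if ip in self.banned_ips:
            return False
        if path.startswith(self.trap_path):
            # Only a bot ignoring robots.txt ends up here: ban its IP.
            self.banned_ips.add(ip)
            return False
        return True

trap = BotTrap()
print(trap.handle_request("10.0.0.1", "/index.html"))    # → True  (normal visit)
print(trap.handle_request("10.0.0.1", "/secret-trap/"))  # → False (hits the trap)
print(trap.handle_request("10.0.0.1", "/index.html"))    # → False (now banned)
```

In a real deployment the ban set would be persisted (or pushed into the firewall or .htaccess) rather than kept in memory, but the decision logic is the same.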