Generally, you would want to ban spiders from sections of your site that you do not want to appear in search results, or that offer nothing to a search engine, such as feedback forms, script directories, and image directories.
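For example, a minimal robots.txt (placed at the site root) that asks all compliant crawlers to skip such sections might look like this; the directory names are purely illustrative:

```
User-agent: *
Disallow: /feedback/
Disallow: /cgi-bin/
Disallow: /images/
```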

Sometimes spiders hit your site at a high rate, so blocking certain crawlers can reduce server load, especially if they keep requesting slow pages.
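If a particular crawler is the problem, you can single it out by user-agent. A sketch, where "SomeAggressiveBot" is a placeholder name you would replace with the user-agent string from your server logs; note that Crawl-delay is a nonstandard directive honored by some crawlers (such as Bing and Yandex) but ignored by Google:

```
User-agent: SomeAggressiveBot
Disallow: /

User-agent: bingbot
Crawl-delay: 10
```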

You may also want to disallow pages or directories you have removed, so that well-behaved crawlers stop requesting them and filling your server logs with 404s.

Remember that robots.txt is voluntary and is not highly reliable. Good robots do comply with it, though, so it is a reasonable way to control what content appears in search engine results.

There is a misconception that listing content in robots.txt improves your security or prevents robots from crawling poor-performing pages. In reality, bad robots can simply ignore robots.txt. And if robots.txt is accidentally deleted for a week, bots may crawl those pages and the results may end up indexed in Google/Yahoo/Bing, which you may never be able to fully clean up. Worse, some malware specifically reads robots.txt looking for juicy targets and then goes straight for those forbidden items.
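If you actually need to keep a directory private, enforce that at the web server instead of advertising it in robots.txt. A minimal sketch for Apache (assuming Apache 2.4 and a hypothetical private directory), using an .htaccess file:

```
# .htaccess inside the private directory -- deny all HTTP access outright
Require all denied
```

Real access control like this holds whether or not a crawler bothers to read robots.txt.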