How robots and spiders are causing issues, and how to stop them. We can also talk about CAPTCHAs (Completely Automated Public Turing tests to tell Computers and Humans Apart): their use, their compliance issues, porn proxies, PWNtcha, and other ways to defeat them.
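As a starting point for the "how to stop them" part, here is a minimal sketch (my own illustration, not from the original post) of one common countermeasure: naive per-IP request throttling to flag traffic that looks automated. The window size, request budget, and the is_probable_bot() name are assumptions for illustration, not a specific product's behaviour.

# Naive sliding-window rate limiter: flag an IP that sends too many
# requests in a short window, which is typical of an aggressive spider.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # length of the sliding window (assumed value)
MAX_REQUESTS = 20     # hits allowed per window before flagging (assumed value)

_hits = defaultdict(deque)  # client IP -> timestamps of recent requests

def is_probable_bot(client_ip, now=None):
    """Return True once this IP exceeds the request budget for the window."""
    now = time.time() if now is None else now
    q = _hits[client_ip]
    q.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    return len(q) > MAX_REQUESTS

if __name__ == "__main__":
    # A burst of 30 requests in ~3 seconds from one address trips the check.
    for i in range(30):
        flagged = is_probable_bot("203.0.113.5", now=1000.0 + i * 0.1)
    print("flagged as bot:", flagged)

In practice you would combine something like this with robots.txt (for well-behaved crawlers) and a CAPTCHA challenge for the flagged clients, which is where the defeat techniques mentioned above come in.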

Are there any freely available spiders that can compete with the likes of AppScan or WebInspect in terms of finding and parsing non-standard links? I know Burp has some JavaScript support, but it never seems to find as many pages as the proprietary spiders do.
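To illustrate the gap being asked about (this is my own sketch, not tied to any of the scanners named above): a spider that only parses static HTML sees literal href attributes, while links assembled by JavaScript at runtime never appear in the markup, so a crawler without a script engine misses them entirely.

# Static HTML parsing only finds the literal link; the one built inside
# the script would only exist after a browser engine executes it.
from html.parser import HTMLParser

SAMPLE_PAGE = """
<a href="/static-link">visible to any spider</a>
<script>
  var section = "admin";
  document.write('<a href="/app/' + section + '/panel">runtime link</a>');
</script>
"""

class HrefCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

parser = HrefCollector()
parser.feed(SAMPLE_PAGE)
print(parser.links)   # ['/static-link'] -- the JavaScript-built link is never seen

The proprietary spiders close part of this gap by evaluating (or at least heuristically analysing) the JavaScript, which is why their page counts tend to be higher than a purely static crawl.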