I'd like to block some spiders and bad bots by user-agent string for all of my virtual hosts via httpd.conf, but I have yet to find success. Below are the contents of my httpd.conf file. Any ideas why this isn't working? env_module is loaded.
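(The actual config isn't reproduced in this thread. For reference, a minimal sketch of the usual mod_setenvif approach looks like the following; "EvilScraper" and "SpamBot" are hypothetical user-agent substrings, not the real list.)

```apache
# Tag requests whose User-Agent matches a pattern (case-insensitive).
# Requires env_module / setenvif_module to be loaded.
SetEnvIfNoCase User-Agent "EvilScraper" bad_bot
SetEnvIfNoCase User-Agent "SpamBot" bad_bot

<Directory "/var/www">
    # Apache 2.2-style access control: allow everyone,
    # then deny any request tagged bad_bot above.
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
</Directory>
```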

Done. Will need to check on the results later. Thanks.
– Ferdinand.Bardamu Sep 12 '11 at 21:27

Yep. Hope you're successful. If you want to test it, use IE: search for "Change UA in IE" and you'll find a toolbar add-on that lets you set your user-agent to anything you like. Then try visiting your site. See my edit.
– U4iK_HaZe Sep 12 '11 at 21:30

Thanks. I'd like to have a single file to manage, so I'm ignoring robots.txt for now. (And I don't want to block all friendly bots, just those outside target markets.)
– Ferdinand.Bardamu Sep 12 '11 at 22:13

Fine by me, but note that robots.txt also lets you disallow specific bots. The .htaccess or httpd.conf approach is more reliable, though: robots.txt compliance is voluntary, and a misbehaving bot can simply ignore it.
– U4iK_HaZe Sep 12 '11 at 22:27

I deleted the Order allow,deny directives from my .htaccess files and was able to trigger the expected behavior for certain user-agents when I spoofed them with User Agent Switcher in Firefox, so there does appear to have been a conflict. Other user-agents on my list, however, were still not blocked, and that turned out to be my misreading of the caret (^) in my httpd.conf. The regular-expression tutorials I read stated this, but it didn't sink in at first: the caret anchors the match to the very beginning of the entire user-agent string, not to the start of each token within it, as I originally thought. Because the key identifying string for some of the spiders and bots I wish to block occurs later in the user-agent string, I needed to drop the caret to get things working, as the sketch below illustrates.
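To make the caret's effect concrete, here's an illustration assuming a hypothetical bot that identifies itself as "Mozilla/5.0 (compatible; EvilScraper/1.0)":

```apache
# Anchored: only matches if the UA string STARTS with "EvilScraper".
# "Mozilla/5.0 (compatible; EvilScraper/1.0)" would NOT be caught.
SetEnvIfNoCase User-Agent "^EvilScraper" bad_bot

# Unanchored: matches "EvilScraper" anywhere in the UA string,
# so the same bot IS caught.
SetEnvIfNoCase User-Agent "EvilScraper" bad_bot
```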