Lingo of the Day: robots - A protocol that asks compliant spiders, crawlers, and bots not to access all or part of a Web site that is otherwise publicly viewable; compliance is voluntary, so it advises rather than enforces. Known officially as the Robots Exclusion Standard, and also as the Robots Exclusion Protocol, it works by placing a plain-text file named robots.txt at the root of the Web site hierarchy on the server (e.g. www.example.com/robots.txt). Robots are often used by search engines to categorize and archive Web sites, or by webmasters to proofread source code.
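A short sketch of how the protocol is consumed, using Python's standard-library robots.txt parser; the site name, paths, and rules below are hypothetical examples, not taken from any real robots.txt:

```python
import urllib.robotparser

# A hypothetical robots.txt: every crawler ("*") is asked
# to stay out of the /private/ directory.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved bot checks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "http://www.example.com/private/data.html"))  # False
print(rp.can_fetch("*", "http://www.example.com/index.html"))         # True
```

Note that nothing stops a misbehaving crawler from fetching /private/ anyway; the file only expresses the site owner's wishes.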