What Is a Robots.txt File?

Robots.txt, part of the robots exclusion protocol (REP), is a text file webmasters create to instruct search engine robots how to crawl and index pages on their website. It is not mandatory: having search engines frequently visit your site and index your content is usually desirable, but there are often cases where you do not want parts of your online content indexed. Robots.txt must be placed in the top-level directory of a web server in order to be found by search engines.
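As a minimal sketch, a robots.txt file at the site root (the domain and paths below are hypothetical) might look like this:

```
# Applies to all crawlers
User-agent: *
# Block crawling of a private section
Disallow: /admin/
# Everything else may be crawled
Allow: /
# Optional: point crawlers to the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

Here `User-agent: *` targets every crawler, `Disallow` and `Allow` control which paths may be fetched, and the optional `Sitemap` line helps crawlers discover your pages. Note that `Disallow` only asks well-behaved robots not to crawl a path; it does not guarantee the page stays out of search results or block access by misbehaving bots.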