The robots.txt file, also known as the Robots Exclusion Protocol (or Standard), is a plain-text file placed at the root of a website that tells web robots (search engine crawlers) which paths they may or may not crawl. By disallowing certain paths, it helps conserve server resources and crawl budget. Note that compliant crawlers honor it voluntarily; it is a set of instructions, not an access-control mechanism.
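As a sketch of how this works in practice, the snippet below embeds a minimal robots.txt (the rules and the `example.com` URLs are illustrative, not from any real site) and uses Python's standard-library `urllib.robotparser` to check which URLs a well-behaved crawler would be allowed to fetch:

```python
from urllib.robotparser import RobotFileParser

# A minimal, hypothetical robots.txt: all crawlers ("*") may crawl
# everything except paths under /admin/.
robots_txt = """
User-agent: *
Disallow: /admin/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(robots_txt)

# A compliant crawler checks each URL against the rules before fetching.
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

In a real deployment the crawler would download the file from `https://example.com/robots.txt` (e.g. via `parser.set_url(...)` and `parser.read()`) rather than parsing an inline string.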