What is a robots.txt file?
A robots.txt file is a set of instructions that tells search engine crawlers which parts of your website they may visit. Compliance is voluntary: reputable crawlers such as Googlebot honor it, but nothing enforces it, so it should not be relied on to protect sensitive pages. It is commonly used to point crawlers to your sitemap and to ask them not to crawl pages with no search value, such as login URLs. Note that robots.txt controls crawling, not indexing; a blocked page can still appear in search results if other sites link to it.
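For example, a minimal robots.txt might look like the sketch below; the disallowed path and sitemap URL are placeholders you would replace with your own:

    User-agent: *          # these rules apply to all crawlers
    Disallow: /login/      # ask crawlers to skip the login area
    Allow: /               # everything else may be crawled
    Sitemap: https://www.example.com/sitemap.xml

The file lives at the root of your domain (e.g. example.com/robots.txt), and crawlers request it before fetching other pages.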