Robots.txt is a plain-text file at the root of a website (e.g. `https://example.com/robots.txt`) that tells web-crawling bots (like Google's Googlebot) which parts of the site they should not crawl. It's like a "Do Not Disturb" sign for robots: well-behaved crawlers honor it, but it's a request rather than an enforcement mechanism. Note that it controls crawling, not indexing; a page blocked in robots.txt can still show up in search results if other sites link to it.
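
To make this concrete, here's a minimal robots.txt sketch. The domain and paths are hypothetical placeholders; the directives themselves (`User-agent`, `Disallow`, `Sitemap`) are standard parts of the Robots Exclusion Protocol:

```
# This group applies to all crawlers ("*" matches any bot)
User-agent: *
# Ask bots to skip these paths (hypothetical examples)
Disallow: /admin/
Disallow: /search

# Optional: tell crawlers where to find the sitemap
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` line names the crawler a group of rules applies to, and each `Disallow` line gives a path prefix that crawler is asked to skip. An empty `Disallow:` (or no rules at all) means the whole site is open to crawling.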