Create a customized robots.txt file for your website with our easy-to-use tool. Control how search engines crawl your site.
The robots.txt file implements the Robots Exclusion Protocol, a standard that websites use to communicate with web crawlers and other web robots. Placed at the root of a site, it tells crawlers which areas of the site should not be crawled.
Key directives:
- User-agent: names the crawler the following rules apply to (* matches all crawlers)
- Disallow: blocks crawling of the listed path
- Allow: permits crawling of a path inside an otherwise disallowed section
- Sitemap: points crawlers to the site's XML sitemap
- Crawl-delay: asks a crawler to wait between requests (not honored by all crawlers)
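A minimal robots.txt putting these directives together might look like the following sketch (the paths and sitemap URL are placeholders, not recommendations):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Block one specific crawler entirely
User-agent: BadBot
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Rules are grouped by User-agent, and a crawler follows the most specific group that matches its name. Note that robots.txt is advisory: well-behaved crawlers respect it, but it is not an access-control mechanism.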