Build a properly formatted robots.txt file using a visual form. Add rules for multiple bots, set allowed and blocked paths, include your sitemap, and download the file ready to upload.
The robots.txt file is a plain text file placed at the root of your website that tells web crawlers which pages or sections they may crawl. It uses the Robots Exclusion Protocol, a simple set of directives understood by all major search engine bots.
A well-configured robots.txt keeps crawl budget focused on your important pages, helps avoid duplicate-content issues, keeps crawlers out of admin areas, and helps search engines understand your site structure more efficiently.
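A minimal file illustrating these directives might look like the following; the paths and sitemap URL are placeholders, not recommendations:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Additional rules for a specific bot
User-agent: Googlebot
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group applies to the named bot; a crawler uses the most specific group that matches it, and the `Sitemap` line stands outside any group.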
⚠️
Not a Security Tool
robots.txt tells compliant crawlers not to fetch a page, but it does not block access, and a disallowed URL can still show up in search results if other sites link to it. Protect sensitive content with proper authentication, not just robots.txt directives.
🤖
Googlebot vs Others
Googlebot respects robots.txt reliably. Malicious bots often ignore it entirely. Use it for SEO crawl management, not as a firewall.
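You can check how a well-behaved crawler interprets a given file with Python's standard-library `urllib.robotparser`. The rules, bot names, and URLs below are illustrative placeholders:

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt body directly, without a network fetch.
# These rules are a hypothetical example, not a recommended config.
rules = """
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A generic bot falls under the * group: homepage allowed, /admin/ blocked.
print(parser.can_fetch("MyBot", "https://example.com/"))         # True
print(parser.can_fetch("MyBot", "https://example.com/admin/x"))  # False
# Googlebot matches its own group, which disallows everything here.
print(parser.can_fetch("Googlebot", "https://example.com/"))     # False
```

This only models a compliant crawler; a bot that ignores the protocol never consults these rules at all.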
📏
500KB Limit
Google only processes the first 500KB of a robots.txt file. Keep it concise. Group related rules together and use wildcards to reduce file size.
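For example, a single wildcard rule can replace a long list of per-URL entries; the patterns below are illustrative:

```
User-agent: *
# One pattern instead of listing every parameterized URL:
Disallow: /*?sessionid=
# $ anchors the pattern to the end of the URL:
Disallow: /*.pdf$
```

Grouping shared rules under one `User-agent` line, rather than repeating them per bot, trims the file the same way.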