robots.txt Generator
Build a robots.txt file by defining crawler rules, allow/disallow paths, and sitemap location.
Preview:

User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /public/
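A generator like this assembles rule groups (one User-agent block each, with its Allow/Disallow paths) and an optional Sitemap line into plain text. The sketch below shows one way that assembly could work; the `build_robots_txt` function, its rule-dictionary shape, and the example sitemap URL are illustrative assumptions, not this tool's actual internals.

```python
def build_robots_txt(rules, sitemap=None):
    """Render rule groups into robots.txt text.

    Each rule is a dict with a "user_agent" string and optional
    "disallow"/"allow" path lists. (Hypothetical structure for
    illustration.)
    """
    blocks = []
    for rule in rules:
        lines = [f"User-agent: {rule['user_agent']}"]
        lines += [f"Disallow: {path}" for path in rule.get("disallow", [])]
        lines += [f"Allow: {path}" for path in rule.get("allow", [])]
        blocks.append("\n".join(lines))
    if sitemap:
        # Sitemap is a standalone directive, not tied to any user-agent group.
        blocks.append(f"Sitemap: {sitemap}")
    return "\n\n".join(blocks) + "\n"

print(build_robots_txt(
    [{"user_agent": "*",
      "disallow": ["/admin/", "/private/"],
      "allow": ["/public/"]}],
    sitemap="https://example.com/sitemap.xml",
))
```

Rule groups are separated by blank lines because crawlers treat each User-agent block as an independent group; the Sitemap directive can appear anywhere in the file.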