
Robots.txt Generator

Create and validate robots.txt files to control search engine crawling behavior

Robots.txt Configuration
Configure crawling rules and directives; the sketch below shows how these options typically map to directives:

  • Delay between requests (seconds)
  • Preferred domain
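
How the generator emits these two options is an assumption, but they typically map to the non-standard Crawl-delay and Host directives. Note that Google ignores Crawl-delay (Bing and Yandex honor it), and Host was a Yandex-only directive that is now deprecated; preferred-domain handling is usually done with redirects or canonical tags instead:

    User-agent: *
    # Delay between requests: 10 seconds (honored by Bing/Yandex, ignored by Google)
    Crawl-delay: 10
    # Preferred domain (Yandex-specific, deprecated)
    Host: example.com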

Robots.txt Best Practices

Common Directives (combined in the example after this list):

  • User-agent: * - Applies the rules that follow to all crawlers
  • Disallow: /admin/ - Block crawlers from admin areas
  • Allow: / - Explicitly allow crawling (useful to carve exceptions out of a Disallow)
  • Sitemap: URL - Reference your XML sitemap by absolute URL
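
Putting these directives together, a minimal example file (example.com and the /admin/ path are placeholders):

    # Applies to all crawlers
    User-agent: *
    # Block the admin area
    Disallow: /admin/
    # Everything else is crawlable
    Allow: /

    # Absolute sitemap URL (see Tips below)
    Sitemap: https://example.com/sitemap.xml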

Tips:

  • Place robots.txt in the website root
  • Use absolute URLs for sitemaps
  • Test with Google Search Console (or programmatically, as in the sketch after this list)
  • Be specific with path patterns
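
Beyond Search Console, a generated file can also be sanity-checked programmatically. A minimal sketch using Python's standard-library urllib.robotparser; the rules, domain, and paths are placeholders:

    from urllib.robotparser import RobotFileParser

    # Rules as a generator might emit them; parse() takes the file's lines
    # directly, so no live fetch is needed (set_url() plus read() would
    # download a deployed robots.txt instead).
    rules = [
        "User-agent: *",
        "Crawl-delay: 10",
        "Allow: /admin/public/",
        "Disallow: /admin/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # Order matters for strict first-match parsers like this one:
    # the narrower Allow line must precede the broader Disallow.
    print(rp.can_fetch("*", "https://example.com/admin/public/page.html"))  # True
    print(rp.can_fetch("*", "https://example.com/admin/secret.html"))       # False

    # crawl_delay() returns the Crawl-delay value for the agent, or None
    print(rp.crawl_delay("*"))  # 10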