robots.txt Generator

Create robots.txt files to control search engine crawling.

Generated robots.txt
# robots.txt generated by L2CONTROLHUB

User-agent: *
Allow: /
About robots.txt
  • User-agent: Specifies which crawler the rules apply to
  • Disallow: Paths that crawlers should not access
  • Allow: Exceptions to disallow rules
  • Sitemap: Location of your XML sitemap
  • Crawl-delay: Seconds to wait between requests (nonstandard; not all bots respect it — Google ignores it)
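
Combining these directives, a complete file might look like the following sketch (all paths and the sitemap URL are illustrative, not output from the generator):

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/
Disallow: /tmp/
Allow: /admin/public/
Crawl-delay: 10

# Bot-specific rules override the * group for that bot
User-agent: Googlebot
Disallow: /drafts/

Sitemap: https://example.com/sitemap.xml
```

Note that a crawler with its own User-agent group (like Googlebot above) uses only that group's rules, not the * rules.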

Place robots.txt in your website's root directory (e.g., https://example.com/robots.txt)
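To sanity-check how crawlers will interpret your file before deploying it, you can parse it with Python's standard-library robots.txt parser. A minimal sketch, using illustrative rules and URLs (not the generator's output):

```python
# Check robots.txt rules with the stdlib parser.
from urllib.robotparser import RobotFileParser

# Illustrative rules; Allow is listed first so the exception
# wins under the parser's first-match evaluation order.
rules = """\
User-agent: *
Allow: /admin/public/
Disallow: /admin/
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/admin/secret"))       # blocked by Disallow
print(rp.can_fetch("*", "https://example.com/admin/public/page"))  # allowed by the exception
print(rp.can_fetch("*", "https://example.com/index.html"))         # allowed by default
print(rp.crawl_delay("*"))                                         # 10
```

In production you would point `RobotFileParser` at the live file with `set_url(...)` and `read()` instead of parsing a string.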