
Robots.txt Generator

Create a robots.txt file for your website. Control which pages search engines can crawl and which they can't.

Basic Settings

An asterisk (*) applies the rules to all search engine crawlers

Path Rules
One path per line
Advanced Options

Crawl-delay: time in seconds between requests (optional; ignored by Googlebot)

Generated robots.txt
Save as robots.txt
User-agent: *
Allow: /
How to use
  1. Choose your robots.txt settings
  2. Add your sitemap URL
  3. Copy the generated robots.txt
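The generation step amounts to assembling directive lines from your settings. A minimal Python sketch of that logic (the function name `build_robots_txt` and its parameters are illustrative, not the tool's actual code):

```python
def build_robots_txt(user_agent="*", allow=(), disallow=(),
                     sitemap=None, crawl_delay=None):
    """Assemble a robots.txt string from basic settings (illustrative sketch)."""
    lines = [f"User-agent: {user_agent}"]
    lines += [f"Allow: {path}" for path in allow]
    lines += [f"Disallow: {path}" for path in disallow]
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if sitemap:
        lines.append(f"Sitemap: {sitemap}")
    return "\n".join(lines) + "\n"


print(build_robots_txt(disallow=["/admin/"],
                       sitemap="https://example.com/sitemap.xml"))
```

With no arguments it produces the permissive default shown above (`User-agent: *` with no Disallow lines).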
Features
  • robots.txt Generator

    Creates robots.txt files for search engine crawlers.

  • User-Agent Rules

    Define rules for different bots (Google, Bing, etc.).

  • Allow/Disallow

    Allow or deny access to specific paths.

  • Sitemap Reference

    Automatically add your sitemap URL.

FAQ

What does a robots.txt do?

It tells search engine crawlers which paths of your site they may crawl and which they may not. Note that it controls crawling, not indexing: a page blocked in robots.txt can still appear in search results if other sites link to it.

Where must the robots.txt be located?

In the root directory of your domain: https://example.com/robots.txt
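Because the location is fixed to the root, the robots.txt URL for any page can be derived from the scheme and host alone. A small Python sketch (the helper name `robots_url` is illustrative):

```python
from urllib.parse import urlsplit, urlunsplit


def robots_url(page_url: str) -> str:
    """Derive the robots.txt URL for the host serving a given page.

    Only the scheme and host are kept; path, query, and fragment
    are discarded, since robots.txt must sit at the domain root.
    """
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))


print(robots_url("https://example.com/blog/post?id=1"))
# https://example.com/robots.txt
```

Note that each subdomain has its own robots.txt: `https://blog.example.com/robots.txt` is separate from `https://example.com/robots.txt`.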

Can I have different rules for different bots?

Yes, you can define specific rules for Googlebot, Bingbot, etc.
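A crawler uses the group that matches its own user-agent and falls back to the `*` group otherwise. Python's standard `urllib.robotparser` evaluates rules the same way, which makes it handy for checking a file before deploying it (the sample rules below are illustrative):

```python
import urllib.robotparser

# Illustrative rules: Googlebot gets its own group, everyone else uses "*".
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /admin/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Googlebot matches its own group, so only /private/ is blocked for it.
print(rp.can_fetch("Googlebot", "/private/page"))  # False
print(rp.can_fetch("Googlebot", "/admin/page"))    # True
# Other bots fall back to the "*" group.
print(rp.can_fetch("Bingbot", "/admin/page"))      # False
```

Note that a bot matching a specific group ignores the `*` group entirely, so shared rules must be repeated in each group that needs them.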

Note: All processing happens in your browser. Your files are not uploaded to any server.