Robots.txt Generator


The generator lets you set the following options:

  • Default policy for all robots
  • Crawl-Delay
  • Sitemap URL (leave blank if you don't have one)
  • Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
  • Restricted Directories (each path is relative to root and must end with a trailing slash "/")

Now create a robots.txt file in your site's root directory, copy the generated text, and paste it into that file.


About Robots.txt Generator

❓ What is the Robots.txt Generator Tool?

The Robots.txt Generator on pcjow.com is a free tool that helps website owners create a properly formatted robots.txt file. This file tells search engine crawlers which pages or sections of your website they should and should not crawl.
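
For example, a minimal robots.txt might look like this (the paths here are illustrative placeholders, not output from the tool):

    User-agent: *
    Disallow: /private/

    Sitemap: https://example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, "Disallow: /private/" asks them to skip that directory, and the Sitemap line tells them where to find your sitemap.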

⚙️ How Does It Work?

The tool allows you to set permissions for different user-agents (search engine bots), choose which directories to allow or block, and add special directives like crawl-delay or sitemap location. Once your preferences are selected, the tool instantly generates the correct robots.txt syntax, ready to be added to your site’s root directory.
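
As a sketch of what the generated output can look like (the user-agents and paths below are examples, not fixed defaults of the tool):

    User-agent: *
    Disallow: /cgi-bin/
    Crawl-delay: 10

    User-agent: Googlebot-Image
    Disallow: /photos/

    Sitemap: https://example.com/sitemap.xml

Note that Crawl-delay is honored by some crawlers (such as Bingbot) but ignored by Googlebot, so don't rely on it to throttle Google.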

👥 Who Should Use It?

This tool is ideal for:

  • Web developers managing site visibility in search engines.

  • SEO professionals optimizing crawl behavior.

  • Bloggers who want to hide admin pages from indexing.

  • eCommerce store owners controlling bot access to product or cart pages (see the example after this list).

  • Anyone launching a new website and needing basic SEO setup.
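
For instance, a blogger or store owner might generate rules like the following (the /wp-admin/, /cart/, and /checkout/ paths are common examples, not defaults of the tool):

    User-agent: *
    Disallow: /wp-admin/
    Disallow: /cart/
    Disallow: /checkout/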

📝 Step-by-Step Usage Guide

  1. Go to: pcjow.com/robots-txt-generator

  2. Choose settings like:

    • User-agent (e.g., * for all bots)

    • Allow/Disallow specific directories or URLs

    • Optional: Crawl-delay and Sitemap URL

  3. Click “Generate Robots.txt”.

  4. The tool will display the ready-to-use code.

  5. Copy the code and upload it to the root of your domain (e.g., example.com/robots.txt). You can verify the upload as shown below.
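
Once uploaded, you can confirm the file is being served with a quick command-line request (substituting your own domain for example.com):

    curl https://example.com/robots.txt

If the command prints your generated directives, crawlers will be able to read them too.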

✅ Pros

  • ✔️ Free and fast robots.txt creation.

  • ✔️ No coding knowledge needed.

  • ✔️ Covers basic to advanced directives.

  • ✔️ Helps protect sensitive or duplicate content from being indexed.

  • ✔️ Supports sitemap linking for better crawling.

⚠️ Limitations

  • ❌ Only generates the file — does not upload it to your server.

  • ❌ Incorrect settings may block important pages from being indexed.

  • ❌ No live crawler testing — use Google Search Console for that.

  • ❌ Users should be cautious when disallowing entire folders; see the example below.
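
For example, a single character separates a scoped rule from one that blocks your entire site (both snippets are illustrative):

    # Blocks the whole site for all crawlers - rarely what you want
    User-agent: *
    Disallow: /

    # Blocks only the /tmp/ directory
    User-agent: *
    Disallow: /tmp/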