Robots.txt Generator


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the root and must end with a trailing slash "/"
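For example, here is a minimal sketch of how restricted directories appear in the generated file; the directory names are placeholders, not defaults of the tool:

    Disallow: /cgi-bin/
    Disallow: /private/

Each path starts at the site root and ends with a trailing slash, so the rule covers the entire directory.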



Now, create a 'robots.txt' file in the root directory of your site, then copy the generated text above and paste it into that file.
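As an illustration, a generated file might look like the following; the crawl delay, directory names, and sitemap URL are example values, not output of the tool:

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Disallow: /private/
    Sitemap: https://www.example.com/sitemap.xml

Saved as https://www.example.com/robots.txt, this tells all robots to skip the two listed directories, wait about 10 seconds between requests (for crawlers that honor Crawl-delay), and find the sitemap at the given URL.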


About Robots.txt Generator

Robots.txt is a file that tells search engine robots which parts of your website they may and may not crawl. When a search engine sends its robot to your site, the robot reads this file first to learn which directories are open to it. At the domain level, you can also control which robots are allowed to crawl the site at all. This robots.txt generator is easy to use: choose from the list of search robots the ones you want to allow to crawl your files, and it builds the file for you.

Uses of the Robots.txt Tool:

  1. By default, all robots are allowed to access all of your files; the generated robots.txt controls which robots you allow or refuse. 
  2. Select the robots that you want to allow to crawl your site (see the example after this list). 
  3. Exclude the robots that you do not want crawling your files. 
  4. You can also set a crawl delay of 5 to 120 seconds; by default, no delay is applied.
  5. If you have a sitemap, paste its URL into the text box; otherwise, leave it blank.
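To illustrate items 2 and 3, here is a sketch of per-robot rules, assuming you want to allow Google's crawler but refuse Baidu's (Googlebot and Baiduspider are the standard user-agent tokens those crawlers announce):

    User-agent: Googlebot
    Disallow:

    User-agent: Baiduspider
    Disallow: /

An empty Disallow line allows that robot to crawl everything, while "Disallow: /" refuses it access to the whole site.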