Robots.txt Generator

The generator lets you configure the following settings:

- Default - All Robots are: (the default rule applied to every robot)
- Crawl-Delay:
- Sitemap: (leave blank if you don't have one)
- Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
- Restricted Directories: the path is relative to the root and must contain a trailing slash "/"
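
For reference, the generated output is a plain text file made up of User-agent, Disallow, Crawl-delay, and Sitemap directives. The snippet below is only an illustrative sketch of what the generator might produce; the blocked paths and the sitemap URL are placeholders, not values from this page.

    # Applies to every crawler
    User-agent: *
    # Paths are relative to the site root and end with a trailing slash
    Disallow: /admin/
    Disallow: /tmp/
    # Crawl-delay is honored by some crawlers but ignored by Google
    Crawl-delay: 10
    Sitemap: https://example.com/sitemap.xml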



Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
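
Once uploaded, the file should be reachable directly at the root of your domain. A quick way to confirm this is to request it from the command line; the domain below is a placeholder:

    curl https://example.com/robots.txt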


About Robots.txt Generator

When search engines come to your website to crawl its pages, they first check for a robots.txt file at the domain root. If the file is found, they read its list of directives to learn which pages or directories, if any, are blocked from crawling. This file can be created with our Robots.txt Generator. When you use a robots.txt generator, Google and other search engines can check your website and determine which pages should be excluded from crawling. In other words, the file created by a Robots.txt Generator is roughly the opposite of a sitemap, which indicates which pages to include.
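
To illustrate how a crawler consults the file, the sketch below uses Python's standard urllib.robotparser module to fetch a robots.txt and ask whether a particular URL may be crawled; the domain and path are placeholders.

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the robots.txt published at the domain root (placeholder domain).
    parser = RobotFileParser()
    parser.set_url("https://example.com/robots.txt")
    parser.read()

    # Ask whether a generic crawler ("*") is allowed to fetch a specific page.
    allowed = parser.can_fetch("*", "https://example.com/admin/page.html")
    print("Allowed" if allowed else "Blocked by robots.txt")

Well-behaved crawlers do essentially the same thing: they resolve the rules that apply to their own user-agent before requesting any page on the site.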