Robots.txt Generator

The generator offers the following options:

Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: (each path is relative to root and must contain a trailing slash "/")



Now, create a 'robots.txt' file in your root directory, then copy the generated text and paste it into that file.
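For example, a generated file with a ten-second crawl delay, a sitemap, and one restricted directory (all values here are illustrative, not defaults of the tool) might look like this:

```
User-agent: *
Crawl-delay: 10
Disallow: /cgi-bin/
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` line would instead mean that all robots are allowed to crawl everything.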


About Robots.txt Generator

What is a robots.txt file? A robots.txt file is a plain-text file kept at the domain root. When a search engine spider crawls a website, it first checks for robots.txt at the domain root. If it finds the file, it reads it to identify directories and any other files that are blocked from crawling. A robots.txt file is the opposite of a sitemap: a sitemap designates the pages to be included, while robots.txt indicates the pages that should be excluded. And if you are wondering how you can make such a file, you are welcome at SeoCheckPoints.

At SeoCheckPoints, we have the ‘Robots.txt Generator’ tool that will create the file for you, easily and free of cost. All you have to do is enter the URL of your website and any other information that seems necessary. After that, click the submit button, and the tool will generate a robots.txt file for you. The robots.txt file tells search engine spiders which pages to exclude, which makes it very useful for every website.
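The assembly step the tool performs can be sketched in a few lines of Python. This is only an illustration of how the form fields map onto robots.txt directives; the function name, parameters, and validation rule are assumptions, not the tool's actual code:

```python
def generate_robots_txt(disallowed_dirs=(), crawl_delay=None, sitemap_url=None):
    """Build robots.txt content from generator-style options (illustrative sketch)."""
    lines = ["User-agent: *"]  # apply the rules to all robots
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    if disallowed_dirs:
        for path in disallowed_dirs:
            # mirror the form's note: paths are relative to root with a trailing slash
            if not (path.startswith("/") and path.endswith("/")):
                raise ValueError(f"path must start and end with '/': {path!r}")
            lines.append(f"Disallow: {path}")
    else:
        lines.append("Disallow:")  # empty value means all pages are allowed
    if sitemap_url:
        lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"


print(generate_robots_txt(disallowed_dirs=("/cgi-bin/",),
                          crawl_delay=10,
                          sitemap_url="https://example.com/sitemap.xml"))
```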

This tool is easy to use and free of cost. So don't wait: try it right away and create your own robots.txt file. Happy Generating!