Robots.txt Generator


Default - All Robots are: (Allowed / Refused)

Crawl-Delay: (no delay, or 5 to 120 seconds)

Sitemap: (leave blank if you don't have one)

Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch

Restricted Directories: (the path is relative to root and must contain a trailing slash "/")



Now, create a 'robots.txt' file in your root directory, then copy the text above and paste it into that file.


About Robots.txt Generator

Robots.txt Generator is a free and very useful tool that webmasters can use to make their websites Googlebot-friendly. It generates the required file in seconds, and its user-friendly interface offers options for including or excluding items in the robots.txt file.

The robots.txt file is placed in the root folder of your website. It helps search engines index your site more appropriately. Google and other search engines use robots, or web crawlers, to review all the content on your website.

Some parts of your website, such as an admin page, should be excluded from search results. You can add these pages to the file so that crawlers explicitly ignore them.
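For example, a minimal robots.txt that asks all crawlers to skip a hypothetical /admin/ directory might look like this:

```
User-agent: *
Disallow: /admin/
```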

Robots.txt files make use of the Robots Exclusion Protocol. The Robots.txt Generator will easily create the file for you from your inputs of the pages you want to exclude, ready to be placed in your root directory. The file is of great significance for any website: when search engines crawl your site, they always first check for a robots.txt file at the domain root level. Once the robots.txt file is found, the crawler reads it and identifies the files and directories that are blocked.
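You can see the Robots Exclusion Protocol in action with Python's standard-library parser. This sketch uses a hypothetical robots.txt (the same directives the generator emits) to check which URLs a crawler may fetch; in practice a crawler would load the live file from the domain root:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration; a real crawler
# would fetch it from the site root via set_url() and read().
robots_txt = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# /admin/ is blocked for every user agent; other paths stay crawlable.
print(rp.can_fetch("*", "https://example.com/admin/login"))  # False
print(rp.can_fetch("*", "https://example.com/blog/post"))    # True
print(rp.crawl_delay("*"))                                   # 10
```

This mirrors what a well-behaved crawler does: read the file once, then test each URL against the blocked paths before requesting it.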

With the Robots.txt Generator, you can generate the robots.txt file absolutely free. Moreover, the tool is extremely easy to use. Here are the steps to follow when using the Robots.txt Generator Tool:

  • By default, all robots can access your site's files. However, you can choose which robots to allow or refuse access.
  • By default, the crawl delay is set to 'no delay.' However, you can set a crawl delay of 5 to 120 seconds.
  • If you have a sitemap for your site, paste its URL in the text box. If you don't have one, leave this field blank.
  • A list of search robots is provided. Choose the ones you want to crawl your website.
  • Lastly, restrict directories. Each path must have a trailing slash "/" since it is relative to root.
  • In the end, click the "Create" button to generate the robots.txt file using our Robots.txt Generator Tool.
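Put together, a file generated from those steps might look like the following (the sitemap URL and directory names here are hypothetical placeholders):

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/
Disallow: /cgi-bin/

Sitemap: https://example.com/sitemap.xml
```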

