Robots.txt Generator

The generator provides the following options:

Default - All Robots are: (allow or disallow all crawlers by default)
Crawl-Delay: (optional delay between crawler requests)
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: the path is relative to the root and must contain a trailing slash "/"

Now create a robots.txt file in your site's root directory, then copy the generated text and paste it into that file.


About Robots.txt Generator

Robots.txt file and its Uses

The robots.txt file is important from a search engine optimization (SEO) perspective. It tells search engine crawlers which pages to crawl and index and which pages to skip. This saves crawlers time, and search engines tend to look favorably on websites that use crawl resources responsibly.

The robots.txt file is often mentioned alongside the sitemap file, but the two serve different purposes. Both matter to search engine crawlers, and neither can substitute for the other.

It is the responsibility of the website owner, business, or organization to make sure both of these files are present on the site.

What is the Use of the Robots.txt file?

Whenever a search engine crawls a website, its crawler first visits the domain root and looks for the files present there (the robots.txt file lives at this root, for example https://example.com/robots.txt). The crawler checks whether any files or directories are blocked from crawling, and skips those that are. Blocked paths are typically no-index pages or password-protected pages.

The main job of the robots.txt file is to tell crawlers which files the site owner does not want search engines to access. Keep in mind, though, that robots.txt only controls crawling: a blocked URL can still end up indexed if other pages link to it, so it is not a reliable way to hide a page from the general public.
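As a sketch, a minimal robots.txt that blocks one directory for every crawler looks like this (the /private/ path and sitemap URL are illustrative placeholders, not values from any real site):

```txt
# Applies to all crawlers
User-agent: *
# Block everything under /private/
Disallow: /private/

# Optional: point crawlers at your sitemap (hypothetical URL)
Sitemap: https://example.com/sitemap.xml
```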

What kinds of files can you block using the Robots.txt file?

The main types of files that people generally block with robots.txt are web pages, media files, and resource files.

Web page - If you are sharing a PDF or some other document on your website and you think Google does not need to crawl it, you can use the robots.txt file to exclude that page from being crawled. This saves search engine resources and reduces the time crawlers spend on your site.
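For instance, assuming your documents live under a hypothetical /docs/ path, you could block one PDF or all PDFs like this (note that wildcard patterns such as /*.pdf$ are supported by major crawlers like Googlebot, but are not part of the original robots.txt standard):

```txt
User-agent: *
# Block one specific document (hypothetical path)
Disallow: /docs/brochure.pdf
# Or block every PDF on the site (Googlebot-style wildcards)
Disallow: /*.pdf$
```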

Media file - If you want to manage how media files are crawled, you can do it with this free robots.txt generator. It can help you block images, videos, audio files, GIFs, and PPTs from being crawled by search engines.
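A sketch of blocking media, assuming a hypothetical /images/ folder; the second rule targets Google's image crawler specifically by its user-agent name:

```txt
# Block the whole media directory for all crawlers
User-agent: *
Disallow: /images/

# Or block only Google's image crawler from the entire site
User-agent: Googlebot-Image
Disallow: /
```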

Resource file - If you want to exclude script or style files from being crawled by search engines, you can do that with our free online robots.txt generator. Be careful, though: blocking CSS or JavaScript that a page needs to render can hurt how Google understands that page.
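For illustration, assuming scripts live under /scripts/ and styles under /styles/ (both hypothetical paths):

```txt
User-agent: *
# Block script and style directories from crawling
Disallow: /scripts/
Disallow: /styles/
```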

How to create a robots.txt file quickly?

You can write a robots.txt file by hand, but it is a tedious task and consumes a lot of time. Alternatively, you can use our robots.txt generator to create one for free. A robots.txt file contains one or more "User-agent" lines, each followed by entries such as "Allow", "Disallow", and "Crawl-delay", which you fill in according to your preferences. If you want a web page skipped during crawling, add a "Disallow" rule for it; Google's crawl bots will not visit a page that the robots.txt file marks as disallowed.
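Those entries fit together as in this sketch (the paths are illustrative placeholders):

```txt
User-agent: *
# Skip the admin area during crawling
Disallow: /admin/
# But allow one public page inside it
Allow: /admin/help.html
# Ask crawlers to wait 10 seconds between requests
# (Crawl-delay is honored by Bing and Yandex; Googlebot ignores it)
Crawl-delay: 10
```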

However, creating a robots.txt file by hand is risky, as one single mistake or wrong entry can hinder the crawling of your whole website. So it is better to use this online robots.txt generator tool.

Can you use the robots.txt file to hide a page?

Yes, you can use the robots.txt file to hide a page from general users, but there is one condition: the page you wish to hide from crawlers must not be linked from any other page on the web. That means you cannot add tags to that page or drop a link to it on any other page as a backlink. If you do, the page can still end up indexed even after blocking it with the robots.txt file. Also make sure no one else has used your page for backlink purposes. You can check the backlinks of a page with our free SEO tool, Backlink Checker, which is also free to use.