Friday, October 6, 2017

What is robots.txt?

Robots.txt is one way of telling search engine bots which pages on your website you do not want them to visit.

Robots.txt is useful for preventing search engines from indexing parts of your content that you do not want displayed publicly.
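
For example, a minimal robots.txt that keeps all bots out of a single folder (here a hypothetical /private/ directory) while leaving the rest of the site crawlable looks like this:

User-agent: *
Disallow: /private/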

If you want to block all search engine bots from crawling your website, just add the following rules:

User-agent: *
Disallow: /

If you want to block only Google from crawling your website, just add the following rules:

User-agent: Googlebot
Disallow: /
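
Conversely, if you want to tell every bot that the whole site is open to crawling, an empty Disallow rule does exactly that:

User-agent: *
Disallow: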

It is important to place the robots.txt file in the root directory of your domain (for example, https://www.example.com/robots.txt); otherwise search engine bots will not find it and your rules will be ignored.

