Thursday, 7 June 2018

What is robots.txt?

Robots.txt is a text file created by webmasters to instruct web robots how to crawl the pages of their website. The file is placed at the root of the site and indicates which parts of the site you don't want accessed by search engine crawlers. It follows the RES (Robots Exclusion Standard) protocol.
Default Format
User-agent: [agent-name]
Disallow: [string not to be crawled]
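To see these rules in action, here is a small sketch using Python's standard-library `urllib.robotparser`. The robots.txt content and the example.com URLs are hypothetical, for illustration only:

```python
from urllib import robotparser

# Hypothetical robots.txt content following the format above
rules = """\
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks each URL before fetching it
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

`Disallow: /private/` blocks any path beginning with that prefix for the matched user agent, while paths outside it remain crawlable.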

For more help, visit: Best Web Development 2018
