Wednesday, 16 July 2014

Google’s New Robots.txt Tester Tool for Webmasters

The robots.txt file is used to prevent crawling of specific pages. Many tools on the market can check it, but Google has now announced its own robots.txt Tester tool to verify the accuracy of your robots.txt file.
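As a quick refresher, a minimal robots.txt that blocks one directory from all crawlers might look like this (the `/admin/` path is just a hypothetical example):

```
User-agent: *
Disallow: /admin/
```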
Robots.txt Tester
The new robots.txt testing tool verifies how your website's pages behave with Googlebot. You can also combine it with other parts of Webmaster Tools. For example, use the Fetch and Render tool in Webmaster Tools, and if you find a blocked page, use the robots.txt Tester to locate the directive that is blocking its crawling. The tool also helps you find out whether your robots.txt is blocking CSS, JavaScript, or mobile content; if it is, you should fix your robots.txt.
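You can also sanity-check your rules offline before trying Google's tool. Here is a minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical robots.txt that disallows a `/private/` directory for Googlebot:

```python
import urllib.robotparser

# Hypothetical robots.txt contents blocking the /private/ directory
robots_txt = """\
User-agent: Googlebot
Disallow: /private/
Allow: /
"""

# Parse the rules and check how Googlebot would treat specific URLs
rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/index.html"))  # allowed
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # blocked
```

Note that Python's parser and Googlebot do not follow identical matching rules in every edge case, so treat this as a first check, not a substitute for Google's tester.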

See Also: A Complete Guide About Robots Meta Tag Values

We recommend using Google's robots.txt Tester even if you think your robots.txt file is fine. To find the testing tool, go to:
Webmaster Tools > Crawl > robots.txt Tester
Never be overconfident; test it at least once.