How to Use a Robots.txt File

A robots.txt file is used to direct crawlers and bots. It tells them which files and directories they should not crawl. This can be done to keep less relevant pages out of search results and avoid diluting your keyword relevance, as well as for other reasons. Whatever your reason for using a robots.txt file, it must be placed in your domain’s top-level directory, because crawlers only look for it at the root of the site. There are a few things to remember about this type of file. First, it is accessible to the public: anyone can view it by requesting /robots.txt on your domain. Second, it does not stop malicious crawlers, so you should not rely on it to protect sensitive or private information on the pages it blocks.
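
For example, a minimal robots.txt might look like the sketch below. The /private/ and /drafts/ paths are placeholders; substitute whatever sections of your site you want crawlers to skip.

    # Apply these rules to all crawlers
    User-agent: *
    # Keep crawlers out of these placeholder directories
    Disallow: /private/
    Disallow: /drafts/

A crawler that honors the robots.txt convention will then skip any URL beginning with those paths, while a malicious bot can simply ignore the file, which is why it is not a security measure.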