Robots.txt is a text file placed at the root of your site (www.yourdomain.com/robots.txt) that tells search robots which pages you would prefer them not to visit. Your website may contain sensitive data that you don't want search engines to index. Search engines read the robots.txt file before crawling and skip the pages it disallows (note that compliant bots honor this voluntarily; it is not a security mechanism).
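As a minimal sketch, a robots.txt that asks all crawlers to stay out of a hypothetical /private/ directory looks like this (the directory name is a placeholder):

```
# Applies to every crawler
User-agent: *
# Ask bots not to crawl anything under /private/
Disallow: /private/
```

Each `User-agent` line starts a group of rules, and the `Disallow` lines in that group list the paths the named bot should skip.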
Creating a robots.txt file by hand isn't easy: you need to know the list of user agents and the files and directories to disallow. A robots.txt generator makes this simpler. MN Robots.txt Generator is a simple web service that lets you create a robots.txt file easily. Just select the options from the template and click create. Once the robots.txt file is generated, copy it and upload it to your site's root directory.
Start with the default "all robots allowed" option, set the crawl delay to No delay, and enter the Sitemap URL of your website or blog. Next, the generator lists a number of common search bots that you can individually allow or disallow. Finally, if you need to keep bots out of certain directories on your site, enter the URL of each directory. The resulting file tells search bots to stay out of those areas.
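Putting those steps together, a generated file might look like the sketch below. The sitemap URL, bot name, and directory paths are placeholders, not output copied from the tool; choosing "No delay" simply means no Crawl-delay directive is written:

```
# Default: all robots allowed, but keep them out of two directories
User-agent: *
Disallow: /admin/
Disallow: /tmp/

# Disallow one specific bot entirely (hypothetical bot name)
User-agent: BadBot
Disallow: /

Sitemap: https://www.yourdomain.com/sitemap.xml
```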
Check out MN Robots.txt Generator