Monday, April 29, 2013

Enable Custom Robots.txt File in Blogger

8:15 AM

Robots.txt is a way to tell search engines whether or not they are allowed to index a page in the search results. Search bots are automatic, and before they access a page on your site, they check the robots.txt file to see whether they are allowed to crawl it. Often, people do not want to stop search engines from crawling their whole website; rather, they want to specify a few pages which should not be indexed in the search results. Therefore, in this article we will show you how to enable a custom robots.txt file in Blogger.
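To see the difference, here are two small rule sets in standard robots.txt syntax: the first blocks every crawler from the entire site, while the second blocks only one section and leaves everything else open (the /private path is just a placeholder):

# Block all crawlers from the whole site
User-agent: *
Disallow: /

# Block all crawlers from a single section only (placeholder path)
User-agent: *
Disallow: /private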

First of all, go to Blogger.com >> your site >> Settings >> Search preferences >> Crawlers and indexing. There you will see two options, i.e. Custom robots.txt and Custom robots header tags. These two options give you the flexibility to customize your robots.txt file.
  • Custom robots.txt: This option gives you the ability to edit your whole robots.txt file. You simply type the rules that say which content should or should not be crawled by spiders, and you can always undo your changes and revert back to the default file (a sample of the default Blogger file is shown right after this list). 
  • Custom robots header tags: This option is a bit more complicated. It does not let you write your own rules; instead it offers a few preset options with checkboxes. So, if you have no idea about robots header tags, stay away from it. 
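For reference, the default robots.txt that Blogger generates for a blog usually looks roughly like the file below; the blog address is a placeholder and the exact sitemap URL can vary from blog to blog:

User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED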
Now you have to enable the custom robots.txt file, so press the edit button next to the “Custom robots.txt” option. It will then ask whether you want to enable custom robots.txt content; press “Yes” and proceed to the next step.

In the text area, type the rules for the content you want to exclude from being crawled. After editing the file according to your needs, press the save button to conclude. However, if you later want to revert back to the default robots.txt file, select “No” instead of “Yes”.

For example, to keep crawlers away from your about, contact, and services pages, you can list all three paths under a single group:

User-agent: *
Disallow: /about
Disallow: /contact
Disallow: /services
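Note that each Disallow path is matched as a prefix, so a rule like Disallow: /about also covers any URL that begins with /about. Once the file is saved, you can confirm what crawlers will actually see by opening yourblog.blogspot.com/robots.txt in your browser.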

We hope this tip helps you. If you are having any issue regarding crawling, feel free to leave your questions in the comments below and our experts will try to help you solve them.

Note: If this tutorial worked for you (and it should work), please leave a comment below. Thanks.
