How to Add a robots.txt File to Your Website in Blogger and WordPress

As a blogger or web developer, it is necessary to know about robots.txt; it is a very useful file, and you should be familiar with the term.
In Blogger it is known as Custom Robots.txt, which means you can customize the file according to your needs. In today's tutorial, we will discuss this term in depth, learn about its uses and benefits, and I will also show you how to add a custom robots.txt file in Blogger.
A robots text file, or robots.txt file (often mistakenly referred to as a robot.txt file), is a must-have for every website. Adding a robots.txt file to the root folder of your site is a very simple process, and having this file is often taken as a sign of quality by search engines. Let's look at the robots.txt options available to your site.

What is robots.txt?

Robots.txt is a text file that contains a few lines of simple code. It is saved on your website or blog's server and tells web crawlers how to crawl and index your blog in search results. That means you can block any web page on your blog from web crawlers so that it does not get indexed in search engines, for example your blog's label pages, a demo page, or any other pages that are not important enough to be indexed. Always remember that search crawlers scan the robots.txt file before crawling any web page.
Each blog hosted on Blogger has its own default robots.txt file.
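A typical default file looks roughly like the following. The exact contents can vary from blog to blog, and yourblog.blogspot.com below is only a placeholder for your own blog's address:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

Here the Mediapartners-Google block applies to Google's AdSense crawler, Disallow: /search keeps label and search-result pages out of the index, Allow: / leaves the rest of the blog crawlable, and the Sitemap line points crawlers to the blog's sitemap.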

History of robots.txt:

The standard was proposed by Martijn Koster while working for Nexor in February 1994, on the www-talk mailing list, the main communication channel for WWW-related activities at the time. Charles Stross claims to have provoked Koster into suggesting robots.txt after he wrote a badly behaved web crawler that caused an inadvertent denial-of-service attack on Koster's server.

It quickly became a de facto standard that present and future web crawlers were expected to follow; most complied, including those operated by search engines such as WebCrawler, Lycos and AltaVista.

How to add robots.txt to your website?


Writing a robots.txt is an easy process. Follow these simple steps:
Open Notepad, Microsoft Word, or any other text editor and save the file as 'robots', all lowercase, making sure to choose .txt as the file extension (in Word, choose 'Plain Text').
Next, add the following two lines of text to your file:
User-agent: *
Disallow:
'User-agent' is another word for robots or search engine spiders. The asterisk (*) denotes that this line applies to all of the spiders. Here, there is no file or folder listed in the Disallow line, implying that every directory on your site may be accessed. This is a basic robots text file.
Blocking the search engine spiders from your whole site is also one of the robots.txt options. To do this, add these two lines to the file:
User-agent: *
Disallow: /
If you'd like to block the spiders from certain areas of your site, your robots.txt might look something like this:
User-agent: *
Disallow: /database/
Disallow: /scripts/
The above three lines tell all robots that they are not allowed to access anything in the database and scripts directories or their subdirectories. Keep in mind that only one file or folder can be used per Disallow line; you may add as many Disallow lines as you need.
Be sure to add your search-engine-friendly XML sitemap file to the robots.txt file. This will ensure that the spiders can find your sitemap and easily index all of your site's pages. Use this syntax:
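Assuming your sitemap lives at the root of your domain (replace www.mydomain.com with your own domain and adjust the path if your sitemap is stored elsewhere):

Sitemap: https://www.mydomain.com/sitemap.xml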
Once complete, save and upload your robots.txt file to the root directory of your site. For example, if your domain is www.mydomain.com, you will place the file at www.mydomain.com/robots.txt.

Adding Custom Robots.txt to Blogger:

Go to your Blogger blog.
Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes.
Now paste your robots.txt code into the box (a sample file is shown after these steps).
Click on the Save changes button.
You are done!
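If you are unsure what to paste, the sketch below is a commonly used starting point for a Blogger blog. Replace yourblog.blogspot.com with your blog's address (or your custom domain); the /p/demo-page.html line is only a hypothetical example of a static page you might not want indexed and can be removed:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /p/demo-page.html
Allow: /

Sitemap: https://yourblog.blogspot.com/sitemap.xml

As before, Disallow: /search keeps Blogger's label and search-result pages out of search engines, while Allow: / leaves your posts and pages crawlable.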

How to Use Blogger Custom Robots Header Tags:

Besides the robots.txt file, Blogger also lets you set custom robots header tags. Follow the steps given below to set them up.
Step 1: Visit blogger.com and sign in to your account. From the list of your blogs, choose the one for which you want to modify robots tags.
Step 2: Then go to Settings >> Search preferences. There you will see a setting called Custom robots header tags under Crawlers and indexing. Click the Edit link to the right of it.
Step 3: At this step, you will see two radio buttons; choose the one that enables custom robots header tags.
Step 4: Now you will get a set of checkboxes, but don't get intimidated. It may look complicated, but it isn't. You can set them on your own by reading the "Custom Robots Header Tags and Purpose" section once again, or just follow the same settings I chose (refer to the image given below) and hit Save changes.


Note: You can set these header tags for the homepage, archive pages, and post pages as well.

Hurray! You are done.

How to add robots.txt in WordPress:

From the discussion above, you should now be familiar with robots.txt and the process of adding it to your Blogger blog and other websites. To save your time: the process is much the same in WordPress.

Create your own robots.txt file and upload it to the root directory of your WordPress site.

Alternatively, you can generate a robots.txt file with an online robots.txt generator and then place the generated file in your root directory.
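For reference, a typical WordPress robots.txt looks something like the sketch below. The /wp-admin/ rules mirror what WordPress itself serves by default in its virtual robots.txt; the sitemap URL is a placeholder you should replace with your own:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.mydomain.com/sitemap.xml

Place the file in the root of your WordPress installation (the same folder that holds wp-config.php) so that it is served at www.mydomain.com/robots.txt.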

Final Words!

Adding a robots.txt file is an important step for SEO.
This was a full guide to adding robots.txt in Blogger, WordPress, and other websites. Create a robots.txt file and place it in your root directory to get a good response from search engines.



Related topic: Add Keywords in Blogger for Optimization 

Now let me know what you think. Having trouble with this process? Follow me or comment with your problem and I will be happy to help you. 😊 Thanks for reading!

Good luck!

