How To Add Custom Robots.txt File In Blogger Step By Step

Each and every one of us wants to announce our presence on the web, right? Day in and day out, we try different ways and means to improve our search rankings on Google and other search engines. Configuring your blog properly and adding a custom robots.txt file will really help your search rankings.

If you want to learn how to add a custom robots.txt file in Blogger, then yes, congratulations, you have found the right article. In this post, I’m sharing how to add a custom robots.txt file in Blogger (Blogspot). Before we jump in, I would like to talk about what a robots.txt file is and why you should add a custom robots.txt file to your Blogger blog.


What Is A Custom Robots.txt File? Why Do I Need To Add A Custom Robots.txt File In Blogger?

Robots.txt is a plain text file that contains a few lines of simple code. It is stored on the website’s server and guides web crawlers on how to crawl and index your site in the search results of Google and other search engines. That means you can restrict any web page on your blog from web crawlers so that it does not get indexed, for example your blog’s label pages, your demo page, or any other pages that are not important enough to be indexed. Always remember that search crawlers scan the robots.txt file before crawling any web page.

In Blogger, the search option is related to labels. If you are not using labels wisely on every post, you should disallow crawling of the search links; in Blogger, the search link is disallowed from crawling by default. In the robots.txt file you can also declare the location of your sitemap. A sitemap is a file located on the server that contains the permalinks of all the posts on your website or blog. A sitemap is usually found in XML format, i.e., sitemap.xml.
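
To make the idea concrete, here is a minimal Python sketch (not part of Blogger itself) that downloads a blog’s sitemap and prints the links it lists. The blog address is a placeholder, and the sketch assumes the sitemap is served as standard XML at /sitemap.xml:

import urllib.request
import xml.etree.ElementTree as ET

# Placeholder address - replace with your own blog or custom domain.
SITEMAP_URL = "https://www.yourblogurl.blogspot.com/sitemap.xml"

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)

# A standard sitemap wraps each link in a <loc> element; on large blogs
# the top-level sitemap may list child sitemaps instead of individual posts.
for element in tree.iter():
    if element.tag.endswith("loc"):
        print(element.text)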



Other Tutorials:
How To Set Custom Header Tags For Blogger Blog Step By Step - 4 Steps


All About The Custom Robots.txt File In Blogger, Step By Step, With An Explanation Of The Keywords

Every site hosted on Blogger has its own default robots.txt file, which looks something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated



Let’s Understand The Custom Robots.txt File Completely: Explanation Of The Keywords In Blogger’s Robots.txt File

The code above is divided into different sections. Let us first study each of them, and then we will learn how to add a custom robots.txt file to a Blogger site.


User-agent: Mediapartners-Google
This line is for the Google AdSense robot, which helps it serve better ads on your blog. Whether or not you are using Google AdSense on your blog, simply leave it as it is.
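
For illustration only: if you ever did want to keep the AdSense robot away from your entire blog (normally you would not, if you show AdSense ads), the rule for that user agent would be a blanket disallow, something like this:

User-agent: Mediapartners-Google
Disallow: /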

User-agent: *
This rule applies to all other robots. By default, our blog’s label (search) links are restricted from crawling, which means web crawlers will not index our label page links, because of the code below.

Disallow: /search
This means that links which have the keyword “search” just after the domain name will be ignored. For example, the link of a label page called SEO looks like this: https://www.yourblogurl.blogspot.com/search/label/SEO

And if we remove Disallow: /search from the code above, then crawlers will access our entire blog and can index and crawl all of its posts and pages. Here, Allow: / refers to the homepage, which means web crawlers can crawl and index our blog’s homepage.

How To Disallow A Particular Post: Suppose we want to exclude a specific post from indexing; then we can add the line below to the code.

Disallow: /yyyy/mm/post-url.html

Here, yyyy and mm denote the publishing year and month of the post respectively. For instance, if we published a post in March of the year 2020, we would use the following format:

Disallow: /2020/03/post-url.html

To make this task easy, you can simply copy the post URL and remove the domain name from the beginning.

How To Disallow A Particular Page: If we need to disallow a particular page, we can use exactly the same method as above. Simply copy the page URL and remove the blog address from it, so that it looks like this:

Disallow: /p/page-url.html
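
Putting the pieces together, a sketch of how these lines sit alongside the defaults, using a hypothetical post from March 2020 and a hypothetical page URL, would look like this:

User-agent: *
Disallow: /search
Disallow: /2020/03/post-url.html
Disallow: /p/page-url.html
Allow: /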

Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated
This line refers to the sitemap of our blog. By adding the sitemap link here, we simply optimize the crawling rate of the blog. It means that whenever web crawlers scan the robots.txt file, they find a path to the sitemap, and through it a path to every published post. If you include your blog’s sitemap in the robots.txt file, web crawlers will easily crawl all of your posts and pages without missing any.





Let me explain further for better understanding: there are three types of custom robots.txt file you can use.

Robots.txt Type 1

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/sitemap.xml


Robots.txt Type 2

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/feeds/posts/default?orderby=updated


Note: Don’t forget to replace https://www.yourblogurl.blogspot.com with your blog address or custom domain. For the implementation, see the section below: How To Add Custom Robots.txt File In Blogger Step By Step. If you want search engine bots to crawl the most recent 500 posts, then you should use robots.txt type 3 below. If you already have more than 500 posts on your blog, you can add one more sitemap line.

Note: The default feed sitemap used in type 2 will only tell web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, replace the default sitemap line with the one below; it will cover the 500 most recent posts.

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500

If you have more than 500 published posts on your blog, then you can use two sitemap lines like below:

Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=500&max-results=1000

Robots.txt Type 3

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Disallow: /b
Allow: /

Sitemap: https://www.yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: https://www.yourblogurl.blogspot.com/atom.xml?redirect=false&start-index=500&max-results=1000

How To Add Custom Robots.txt File In Blogger Step By Step

Now the main part of this tutorial: how to add the custom robots.txt file in Blogger. The steps are below.
Step 1 – Go to your Blogger blog.
Step 2 – Navigate to Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt
Step 3 – Enable the custom robots.txt content by selecting “Yes.”
Step 4 – Paste your robots.txt file code into the box.
Step 5 – Click on the Save Changes button.
Now You Are Done! 
See the infographic below for more information


[Infographic: How To Add Custom Robots.txt File In Blogger Step By Step]

How to Check Your Robots.txt File

You can check this file on your blog by adding /robots.txt at the end of your blog URL in the web browser. For example:

http://www.yourblogurl.blogspot.com/robots.txt

Once you visit the robots.txt file URL you will see the entire code which you are using in your custom robots.txt file. See the infographic below:


[Infographic: How to Check Your Robots.txt File]
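
Beyond a quick look in the browser, you can also sanity-check the rules programmatically. Below is a minimal Python sketch using the standard library’s urllib.robotparser; the blog address is a placeholder, and the expected results assume the default Disallow: /search rule is in place:

from urllib.robotparser import RobotFileParser

# Point the parser at your blog's robots.txt (placeholder address used here).
rp = RobotFileParser()
rp.set_url("https://www.yourblogurl.blogspot.com/robots.txt")
rp.read()  # downloads and parses the file

# Label/search pages should be blocked by the "Disallow: /search" rule...
print(rp.can_fetch("*", "https://www.yourblogurl.blogspot.com/search/label/SEO"))  # expected: False

# ...while the homepage and ordinary posts should stay crawlable.
print(rp.can_fetch("*", "https://www.yourblogurl.blogspot.com/"))  # expected: True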
I have tried to cover as much as I could about how to add a custom robots.txt file in Blogger Blogspot. Adding a custom robots.txt file in Blogger helps to increase your website’s organic traffic. If you enjoyed this article, share it with your friends who want to add a custom robots.txt file in Blogger and improve their site traffic. If you have any doubts about this, feel free to ask in the comment section below!