How To Add Robots.txt File In Blogger – Full Info
Today we will see how to add a robots.txt file in Blogger. Done right, this also helps your site's SEO.
But misuse of robots.txt can also hurt your site's SEO, so please read this article carefully.
If you are new to blogging, you may have a question: what is robots.txt?
When a search engine bot crawls your site, the robots.txt file gives it directions.
Mainly, this file is used to exclude pages, that is, to keep a particular page or path from being crawled, or to stop a specific bot from crawling the site.
And the third thing is to help get your posts and images indexed, through the sitemap.
Also Read: Blogger custom robots tags settings
For what purpose is the robots.txt file used in Blogger?
As I mentioned above, there are three reasons for which we use robots.txt. But do we have to add all three things to robots.txt in Blogger?
The answer is no!
To noindex a path or page, Blogger already gives us the Custom Robots Header Tags option. If you want to noindex something, do it from the header tags.
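For context, a noindex setting like this ends up as a standard robots meta tag in the page's HTML head. The snippet below is a sketch of that standard tag, not Blogger's exact output, which may differ in attribute order and quoting:

```html
<!-- Standard robots meta tag produced by a noindex setting. -->
<!-- Blogger's generated markup may format this slightly differently. -->
<meta name="robots" content="noindex" />
```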
So why should I add robots.txt to Blogger at all?
The direct answer: to get your posts indexed in search engines (Google, Yahoo, Bing, etc.) through the sitemap, which will help improve the SEO of your site.
Before you add any robots.txt file in Blogger, let's go through robots.txt in a little detail.
Robots.Txt Complete Information With Meaning
After creating a blog in Blogger, a robots.txt file is automatically generated. To see that file, open www.yoursite.com/robots.txt in your browser.
At this URL you get a default file, which looks something like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.yoursite.com/sitemap.xml
Let's see what each line means.
User-agent: this names the bot that is going to crawl your pages.
Here 'Mediapartners-Google' is the AdSense bot, and below it is Disallow:
Note that an empty Disallow: and Allow: / mean the same thing, i.e. the bot has permission to crawl everything, because no path has been given to Disallow.
Remember, Disallow: / means do not crawl the entire site.
And Disallow: /search means /search and every URL that goes through /search.
For example, /search/label/smartphones and /search/label/clothes: bots will not be able to crawl any of these URLs.
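You can test these rules yourself with Python's built-in urllib.robotparser module. The rules below mirror the example above, and yoursite.com and the post URL are just placeholders:

```python
# Check robots.txt rules locally using Python's standard library.
import urllib.robotparser

# The User-agent: * / Disallow: /search rules discussed above.
rules = [
    "User-agent: *",
    "Disallow: /search",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# Label pages live under /search, so they are blocked for all bots.
print(rp.can_fetch("*", "https://www.yoursite.com/search/label/smartphones"))  # False

# A normal post URL is not under /search, so it can be crawled.
print(rp.can_fetch("*", "https://www.yoursite.com/2024/01/my-post.html"))  # True
```

This is handy for sanity-checking a robots.txt before you publish it, so a typo doesn't accidentally block your whole blog.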
User-agent: *
Here '*' means all bots, such as Googlebot, Mediapartners-Google, Yahoo's bot, etc.
'*' is treated as a wildcard entry.
The sitemap contains all the URLs of the blog. Adding it to robots.txt lets bots find and crawl all the posts in the blog, which helps them get indexed in search engines.
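For reference, a sitemap is a standard XML file that simply lists your URLs. A minimal entry looks like this (the post URL here is a made-up placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yoursite.com/2024/01/my-first-post.html</loc>
  </url>
</urlset>
```

Blogger generates this file for you automatically, so you never need to write it by hand; you only point to it from robots.txt.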
How To Add Best Robots.Txt File in Blogger
The code below is SEO friendly. You can use it in your blog.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.yoursite.com/sitemap.xml

First of all, copy this code and replace www.yoursite.com with your own domain name.
Log in to Blogger >> choose your blog >> go to Settings >> click Search Preferences.
To enable Custom robots.txt, click Edit and select the 'Yes' radio button.
Now paste the code you copied in step 1 and click Save Changes.
Now you can see your new robots.txt file in the browser: type www.yoursite.com/robots.txt in the address bar and you will see the file you just saved.
So in this post I explained how to add a robots.txt file in Blogger, along with the code that is used in robots.txt.
Keep in mind that if you edit the code yourself or add extra lines, you need to understand them first.
Otherwise the SEO of your site may suffer.
If you have any problem or question about this topic, ask in the comments. I will do my best to answer.