How to Safely Customize an SEO-Friendly robots.txt for Blogger

Teuku.Net - As a blogger or webmaster, you should know what robots.txt is, what it does, and how to optimize it.


This time I will share how to set up a safe, SEO-friendly robots.txt for Blogger so that your blog performs better in search engines.

{getToc} $title={Table of Contents}

What is robots.txt?

robots.txt, also known as the robots exclusion protocol, is a plain-text file that tells web crawlers and other web robots which parts of a website they may crawl and which parts are off limits. A well-behaved crawler requests this file before fetching anything else on the site.
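
To see the protocol in action, here is a minimal sketch using Python's standard-library urllib.robotparser, which implements the classic prefix-matching rules; the crawler name MyBot is purely illustrative:

from urllib import robotparser

# A polite crawler downloads robots.txt once and consults it
# before fetching any page on the site.
rp = robotparser.RobotFileParser()
rp.set_url("https://www.teuku.net/robots.txt")
rp.read()  # fetch and parse the live robots.txt

# Ask whether a crawler named "MyBot" may fetch a given URL
# (prints False if /search is disallowed for this bot):
print(rp.can_fetch("MyBot", "https://www.teuku.net/search?q=seo"))
print(rp.can_fetch("MyBot", "https://www.teuku.net/")){codeBox}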

How to change robots.txt on Blogger

Here is how to replace Blogger's default robots.txt:

  1. Log in to your Blogger account
  2. Click Settings
  3. Scroll down to the Crawlers and indexing section
  4. Toggle on Enable custom robots.txt
  5. Click Custom robots.txt and paste in your rules

Setting robots.txt

Now that you know how to change robots.txt on Blogger, it's time to customize the code to get the most SEO benefit from it.

Blogger Default robots.txt

Blogger's default robots.txt setting is actually quite safe, and many blogs use it as-is. Here's the code:
User-agent: Mediapartners-Google
Disallow: 

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.teuku.net/sitemap.xml{codeBox}
Replace www.teuku.net with your blog domain.{alertInfo}
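
As a quick sanity check, you can feed these default rules to Python's urllib.robotparser and confirm what each crawler may fetch. This is just an illustrative sketch; example.com stands in for your own domain:

from urllib import robotparser

default_rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(default_rules)

# Search and label pages are blocked for ordinary crawlers (False),
# posts stay crawlable (True), and the AdSense crawler may fetch
# everything (True).
print(rp.can_fetch("*", "https://example.com/search/label/SEO"))
print(rp.can_fetch("*", "https://example.com/2024/01/post.html"))
print(rp.can_fetch("Mediapartners-Google", "https://example.com/search/label/SEO")){codeBox}

The empty Disallow line for Mediapartners-Google (Google's AdSense crawler) means it may fetch every page, so ads can still be matched to your content even on otherwise blocked pages.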

Most Secure Custom robots.txt

If you also want to keep search engines away from low-value pages, such as search results and archive pages, while posts and static pages (about, privacy, disclaimer, and so on) remain crawlable, use the safest custom robots.txt code below:

User-agent: Mediapartners-Google
Disallow: 

# The lines below apply to all crawlers: they block search and archive pages and allow all blog posts and pages.

User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

# Sitemaps of the blog
Sitemap: https://www.teuku.net/sitemap.xml
Sitemap: https://www.teuku.net/sitemap-pages.xml{codeBox}

  • /search* disables crawling of all search and label pages.
  • /20* stops crawling of the archive sections, since Blogger archive URLs begin with the year (for example /2024/).
  • Because post URLs also begin with the year, /20* alone would block every post; the Allow rule /*.html is therefore added so bots can still crawl posts and pages, as the sketch after this list demonstrates.{alertInfo}
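
Note that wildcard rules like these use Google's extended matching: * matches any run of characters, and when both an Allow and a Disallow rule match a URL, the rule with the longer path wins (ties go to Allow). The Python sketch below is a simplified illustration of that logic; rule_matches and is_allowed are hypothetical helpers written for this post, not a library API:

import re

def rule_matches(rule_path, url_path):
    # Google-style matching: '*' matches any character sequence,
    # a trailing '$' anchors the end of the URL.
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

def is_allowed(url_path, rules):
    # Apply the matching rule with the longest path; on a tie,
    # the less restrictive (Allow) rule wins.
    best_len, allowed = -1, True  # no matching rule means allowed
    for directive, path in rules:
        if rule_matches(path, url_path):
            is_allow = directive == "Allow"
            if len(path) > best_len or (len(path) == best_len and is_allow):
                best_len, allowed = len(path), is_allow
    return allowed

rules = [("Disallow", "/search*"),
         ("Disallow", "/20*"),
         ("Allow", "/*.html")]

print(is_allowed("/search/label/SEO", rules))      # False: search page
print(is_allowed("/2024/", rules))                 # False: archive page
# True: the 7-character Allow rule /*.html outranks the 4-character /20*
print(is_allowed("/2024/01/my-post.html", rules))
print(is_allowed("/p/about.html", rules)){codeBox}

You can double-check how Google sees specific URLs with the URL Inspection tool in Google Search Console, which reports whether crawling is allowed by your live robots.txt.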

This setting is good robots.txt practice for SEO: it saves the website's crawl budget and helps the Blogger blog appear in search results. You still have to write SEO-friendly content to rank, of course.

Effects in Google Search Console after implementing these rules

It’s important to note that Google Search Console may report that some pages are blocked by your robots.txt file. When it does, check which pages are blocked: content pages, or search and archive pages? Search and archive pages are blocked on purpose, so reports about them can be safely ignored; blocked content pages deserve a closer look.

But if you would rather let bots crawl the complete website, allow everything in robots.txt and control indexing with the robots meta tag instead (for example, a noindex tag on search and archive pages). That combination consumes more crawl budget, but it can be a better alternative for boosting the SEO of a Blogger blog.

That's the information from Teuku.Net on how to safely customize an SEO-friendly robots.txt for Blogger. With these settings in place, your blog will be crawled and indexed according to the rules you have defined.
