Robots.txt Generator Tool

Every website owner should have a custom robots.txt file, and a robots.txt generator tool is the easiest way to create one. The file tells search engine bots how to crawl your site and helps Google index your pages properly.

How to Use the Robots.txt Code Generator Tool

  1. Select your hosting platform, either Blogger or WordPress.
  2. Enter your site's URL, including the https:// prefix, in the designated field.
  3. Click "Generate," and the robots.txt file will be prepared for you.
  4. Copy the generated code into your site's settings. On Blogger, open Settings, enable the custom robots.txt option, and paste the code there; your site will then serve the file from its root, as shown below.
  5. Finally, open the robots.txt Testing Tool, paste the same code, and click "Submit."
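Once the code is saved, the file is served from the root of your domain. The address below is a placeholder; replace example.com with your own domain to check that the file is live:

  https://example.com/robots.txt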

Custom Robots.txt Code


The tool produces a ready-made template for the platform you selected. For Blogger:

  User-agent: Mediapartners-Google
  Disallow:

  User-agent: *
  Disallow: /search
  Allow: /

  Sitemap:

For WordPress:

  User-agent: *
  Allow: /wp-content/uploads/
  Allow: /wp-admin/admin-ajax.php
  Disallow: /wp-content/plugins/
  Disallow: /wp-admin/
  Disallow: /blog/page/
  Disallow: /search/

  Sitemap:

(The generator fills in each Sitemap line with your own site's sitemap URL.)

Benefits of Creating a Custom robots.txt File

By creating a robots.txt file for your site, you can tell the Google search engine which pages you want to appear in search results and which pages you want to keep from being crawled and indexed.

For example, you may want to keep specific pages on your site out of search results by blocking search engine crawlers from accessing them.

To do this, add Disallow: /yourpage.html to your robots.txt file, replacing yourpage.html with the path of the page (or pages) you want to block, and then resubmit the updated file, as in the sketch below.
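A minimal sketch of a file that blocks a single page while leaving the rest of the site crawlable; /private-page.html is a placeholder path, not a real page:

  User-agent: *
  Disallow: /private-page.html
  Allow: /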

Full Explanation of robots.txt Commands

User-agent: Mediapartners-Google: This rule grants the Google AdSense crawler access to your site. Its value is fixed and should not be modified.
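In the generated file, this rule is followed by an empty Disallow line, which blocks nothing and therefore gives the AdSense crawler full access:

  User-agent: Mediapartners-Google
  Disallow: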

Disallow: This directive blocks crawlers from the paths you list, keeping those pages out of Google search results. For example, it is commonly used to block the URLs of internal search results pages.
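On Blogger, internal search results live under /search, which is why the generated template blocks that path for all crawlers:

  User-agent: *
  Disallow: /search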

Allow: / means crawling is permitted. After Disallow has blocked access to specific pages, Allow: / lets crawlers reach all of the site's remaining pages.
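Together, the two directives exclude only the blocked path and open everything else to crawling:

  User-agent: *
  Disallow: /search
  Allow: /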

Sitemap: Points crawlers to your sitemap, the file that lists the pages published on your site. Submit the same sitemap URL in Google Search Console and reference it in your robots.txt file so that search engine crawlers get a complete picture of your site's content.
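A typical Sitemap line looks like the following; example.com and the file name are placeholders to replace with your own sitemap URL:

  Sitemap: https://example.com/sitemap.xml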
