What is Robots.txt file and how to use it

The robots.txt file is a plain text file whose importance lies in the fact that it tells search spiders which pages of your site to crawl and which not to crawl, and thus it shapes how Google's crawlers move through your site.

How can I optimize the robots.txt file?

Formatting and optimizing the robots.txt file is not complicated. The important thing is to make sure you are not blocking search spiders from any important pages, and that there are no wrong commands that confuse Google's crawlers. You can simulate the work of the robots.txt file through this Free Tool
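If you prefer to check rules yourself rather than through a web tool, the sketch below uses Python's standard-library robots.txt parser to test which URLs a given set of rules blocks. The rules, user agent, and URLs are made-up examples, not taken from any real site:

```python
# Simulate how a crawler reads robots.txt rules, using Python's
# built-in parser. All rules and URLs here are illustrative.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The wildcard group applies to Googlebot here: the homepage is
# crawlable, but anything under /admin/ is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/"))                # True
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
```

Running this against your own rules before deploying them is a quick way to catch an accidental block of an important page.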

For example, blocking access to CSS files is a common mistake, because it prevents search engine crawlers from rendering your pages properly. To keep things simple, my own approach is not to block search engines from any part of the site, as follows:
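A minimal robots.txt along those lines, one that blocks nothing, might look like this (an empty Disallow means everything may be crawled; the domain is a placeholder):

```text
User-agent: *
Disallow:
```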


There is a fact you should know: adding pages to robots.txt does not prevent some search engines from indexing them, and Google is one of them; a disallowed page can still end up in the index if other pages link to it. To keep a page out of the index reliably, you need to use the noindex tag. More info from Google at noindex
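For illustration, a page is typically marked noindex with a generic meta tag in its head (this snippet is a standard HTML example, not specific to any platform):

```html
<!-- Inside the page's <head>: tells crawlers not to index this page -->
<meta name="robots" content="noindex">
```

The same signal can also be sent for non-HTML files via the `X-Robots-Tag: noindex` HTTP response header.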

And avoid blocking a page in robots.txt while also putting a noindex tag on that same page, because by doing this you give search engines two contradictory signals: one says do not read the page, and the other is a tag on the page itself saying do not index it. How will the crawler ever see the noindex if you told it not to read the page in the first place? This too is a common mistake that you should avoid.


Solving the problem of modifying the robots.txt file in Shopify and Blogger

Some online store and blogging platforms do not allow you to modify the robots.txt file, and this is one of the disadvantages of platforms such as Shopify, ExpandCart, Blogger, and others.

But there is a trick I use for some of my clients' sites on these platforms: set up a 301 redirect of the robots.txt URL through Cloudflare to a folder you can access (the images folder, for example), upload your modified file there, and Google will treat it as the main robots.txt file. Thus you have solved the problem.

Include a sitemap in the robots.txt file

A sitemap is an XML file that contains an index of your site: the links to all of its pages, which helps search spiders understand your site's structure and how its main and sub pages relate to each other. You can create an XML sitemap yourself or via WordPress plugins like Yoast SEO or tools like Google XML Sitemaps
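As a sketch of the format, a minimal sitemap follows the standard sitemaps.org XML schema; the domain, paths, and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/important-page/</loc>
  </url>
</urlset>
```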

The best SEO practice for a sitemap is to add only the pages that you want search engines to index, not all pages. You should not include a page carrying a noindex or nofollow tag, a page that has been 301-redirected to another page, or a page whose canonical points to another page.

In other words, it should contain only the pages that matter, the ones you want to flag to Google as the most important on your site.
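To actually include the sitemap in robots.txt, as the heading above suggests, add a Sitemap line pointing at the map's full URL (the domain here is a placeholder):

```text
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

This lets crawlers discover your sitemap automatically, even if you have not submitted it in Google Search Console.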


