The correct robots.txt for a site - why you need it, how to compose it, and where to check it

Almost every site today has a special file that tells search engines how to index it correctly. This file is called robots.txt. It is a plain text file with instructions for crawlers: prohibitions, permissions, and other directives.


You can compose such a file yourself; you only need to know a few general rules. Here is an example robots.txt file; let's look at it below:

User-agent: *

# Directories

Disallow: /core/
Disallow: /custom/
Disallow: /modules/

Allow: /custom/modules/*.js
Allow: /custom/modules/*.css
Allow: /custom/modules/*.jpg

# Files

Disallow: /license.txt
Disallow: /index.php

# URLs

# Host is recognized only by Yandex
Host: https://example.com
Sitemap: https://example.com/sitemap.xml
 

Let's take a quick look at the main elements of the file:

- User-agent: specifies which crawler the rules that follow apply to; the value * means all crawlers.
- Disallow: forbids crawling of URLs that begin with the given path, whether a directory (such as /core/) or a single file (such as /license.txt).
- Allow: makes an exception to a Disallow rule; here it permits .js, .css and .jpg files inside the otherwise blocked directories.
- Host: tells Yandex the preferred mirror of the site; other search engines ignore this directive.
- Sitemap: gives the full URL of the site's XML sitemap.
- Lines beginning with # are comments and are ignored by crawlers.
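To see how crawlers interpret such rules, you can parse them with Python's standard urllib.robotparser module. A minimal sketch using the Disallow rules from the example above (note that the standard parser does not support the * wildcards used in the Allow lines, so the sketch uses only the plain rules; example.com is a placeholder domain):

```python
from urllib import robotparser

# Parse the example rules in memory; for a live site you would call
# rp.set_url("https://example.com/robots.txt") and rp.read() instead.
rules = """
User-agent: *
Disallow: /core/
Disallow: /custom/
Disallow: /modules/
Disallow: /license.txt
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Path inside a blocked directory: crawling is forbidden
print(rp.can_fetch("*", "https://example.com/core/config.php"))  # False
# Path not mentioned in the rules: crawling is allowed
print(rp.can_fetch("*", "https://example.com/about/"))           # True
```

This is also a quick way to sanity-check a rule change before deploying it: if can_fetch returns an unexpected answer for a URL you care about, the rule is not doing what you think.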

Using the keywords above, you can compose a robots.txt file to suit your needs. After drawing up the rules, check the file with special services that report errors and offer corrections and recommendations. It is best to use the tools of the major search engines themselves, for example the robots.txt checkers in Google Search Console or Yandex.Webmaster.
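Before uploading the file, you can also catch the most common mechanical mistakes yourself. Below is a hypothetical minimal checker (the function name lint_robots and the directive list are my own illustration, not a standard API); it flags unknown directives, such as typos, and paths containing stray spaces, the error fixed in the example above:

```python
# Directives commonly accepted by major crawlers (illustrative list)
KNOWN = {"user-agent", "disallow", "allow", "sitemap", "host", "crawl-delay"}

def lint_robots(text):
    """Return a list of human-readable problems found in robots.txt text."""
    problems = []
    for n, raw in enumerate(text.splitlines(), 1):
        line = raw.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        if ":" not in line:
            problems.append(f"line {n}: missing ':' separator")
            continue
        field, value = (part.strip() for part in line.split(":", 1))
        if field.lower() not in KNOWN:
            problems.append(f"line {n}: unknown directive '{field}'")
        elif field.lower() in ("disallow", "allow") and " " in value:
            problems.append(f"line {n}: path contains spaces: '{value}'")
    return problems

example = """User-agent: *
Disalow: /tmp/
Disallow: / core /
"""
for problem in lint_robots(example):
    print(problem)
```

A checker like this is no substitute for the search engines' own validators, which also verify how the rules actually match your URLs, but it catches typos before they silently disable a rule.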

Thus, the article has covered why robots.txt is needed, how to compose it correctly, and where to check it.
