
robots.txt

Example:

# Group 1
User-agent: Googlebot
Disallow: /nogooglebot/

# Group 2
User-agent: *
Allow: /

Sitemap: http://www.example.com/sitemap.xml

Explanation:

  1. The user agent named “Googlebot” is not allowed to crawl the http://www.example.com/nogooglebot/ directory or any of its subdirectories.
  2. All other user agents are allowed to crawl the entire site. This group could have been omitted with the same result, since the default behavior is that user agents may crawl the entire site.
  3. The site’s sitemap file is located at http://www.example.com/sitemap.xml.
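The behavior described above can be verified programmatically. As a minimal sketch, Python's standard-library urllib.robotparser can parse these exact rules and answer "may this user agent fetch this URL?" (the host and paths below are the example's own; the bot name "SomeOtherBot" is a hypothetical stand-in for any non-Google crawler):

```python
from urllib.robotparser import RobotFileParser

# The example robots.txt from above, as a string.
rules = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from the /nogooglebot/ directory...
print(rp.can_fetch("Googlebot", "http://www.example.com/nogooglebot/page.html"))  # False
# ...but may crawl everything else.
print(rp.can_fetch("Googlebot", "http://www.example.com/other/"))  # True
# Any other bot falls into the "*" group and may crawl anything.
print(rp.can_fetch("SomeOtherBot", "http://www.example.com/nogooglebot/"))  # True
```

Note that the third call returns True: the Disallow rule binds only the Googlebot group, so other crawlers are not restricted by it.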

Example: the most general one

User-agent: *
Allow: /

Caution: this configuration allows every bot to crawl every page of your website, which is also the default behavior when no robots.txt file exists.
