# Customizing robots.txt

Add custom rules to your robots.txt.

By default, Kirby SEO generates a simple `robots.txt` that allows all crawlers and blocks the Panel. If you need to add your own rules, use the `robots.content` option.

## Blocking specific bots

Some AI providers crawl websites to use the content as training data. You can block their crawlers:

```php
<?php
// site/config/config.php

return [
  'tobimori.seo' => [
    'robots' => [
      'content' => [
        'GPTBot' => [
          'Disallow' => ['/'],
        ],
        'Google-Extended' => [
          'Disallow' => ['/'],
        ],
        'CCBot' => [
          'Disallow' => ['/'],
        ],
      ],
    ],
  ],
];
```

This adds rules for each bot while keeping the default rules for all other crawlers intact.
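With the configuration above, each array key becomes a `User-agent` group in the generated file. The output should look roughly like this (a sketch of the expected result; the exact formatting may vary by plugin version):

```txt
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```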

## Custom rules for all crawlers

If you set rules for `*`, they replace the default rules entirely:

```php
'content' => [
  '*' => [
    'Allow' => ['/'],
    'Disallow' => ['/panel', '/content', '/private'],
  ],
],
```

## Mixing rules

You can combine rules for all crawlers with rules for specific bots:

```php
'content' => [
  '*' => [
    'Allow' => ['/'],
    'Disallow' => ['/panel', '/content'],
  ],
  'GPTBot' => [
    'Disallow' => ['/'],
  ],
],
```

The `Sitemap:` line is added automatically if the sitemap module is active. You can override it with the `robots.sitemap` option.
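A minimal sketch of such an override, assuming `robots.sitemap` accepts the absolute URL to advertise (check the plugin's option reference for the exact value types it supports):

```php
<?php
// site/config/config.php

return [
  'tobimori.seo' => [
    'robots' => [
      // Assumption: 'sitemap' takes the full URL to emit in the Sitemap: line.
      'sitemap' => 'https://example.com/sitemap.xml',
    ],
  ],
];
```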