feat: integrate Kirby SEO plugin

- Add tobimori/kirby-seo via Composer
- snippet('seo/head') in header.php (replaces the manual meta tags)
- snippet('seo/schemas') in footer.php for JSON-LD
- SEO tab added to site.yml and all page blueprints
- SEO configuration in config.php (sitemap, robots, canonicalBase TODO)
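The config.php changes described above could look roughly like the following sketch. The option keys are assumptions based on the plugin's documented namespace (`tobimori.seo`), and the canonicalBase value is a placeholder, matching the TODO in the commit message:

```php
<?php
// site/config/config.php — illustrative sketch, not the committed file
return [
    'tobimori.seo' => [
        // enable the generated XML sitemap
        'sitemap' => [
            'active' => true,
        ],
        // enable the generated robots.txt
        'robots' => [
            'active' => true,
        ],
        // TODO: set once the production domain is final (placeholder value)
        'canonicalBase' => 'https://example.com',
    ],
];
```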

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
This commit is contained in:
isUnknown 2026-03-25 12:59:18 +01:00
parent baab2fb3a1
commit 58c31ea391
133 changed files with 9201 additions and 253 deletions


@@ -0,0 +1,66 @@
---
title: Customizing robots.txt
intro: Add custom rules to your robots.txt
---
By default, Kirby SEO generates a simple `robots.txt` that allows all crawlers and blocks the Panel. If you need to add your own rules, use the `robots.content` option.
## Blocking specific bots
Some AI providers crawl websites to use the content as training data. You can block their crawlers:
```php
<?php
// site/config/config.php
return [
    'tobimori.seo' => [
        'robots' => [
            'content' => [
                'GPTBot' => [
                    'Disallow' => ['/'],
                ],
                'Google-Extended' => [
                    'Disallow' => ['/'],
                ],
                'CCBot' => [
                    'Disallow' => ['/'],
                ],
            ],
        ],
    ],
];
```
This adds rules for each bot while keeping the default rules for all other crawlers intact.
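With that configuration, the generated robots.txt would look roughly like this (the exact default rules depend on the plugin version):

```
# default rules kept for all other crawlers
User-agent: *
Disallow: /panel

# per-bot rules from robots.content
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /
```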
## Custom rules for all crawlers
If you set rules for `*`, they replace the default rules entirely:
```php
'content' => [
    '*' => [
        'Allow' => ['/'],
        'Disallow' => ['/panel', '/content', '/private'],
    ],
],
```
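Assuming no other bot-specific entries, those rules would render roughly as:

```
User-agent: *
Allow: /
Disallow: /panel
Disallow: /content
Disallow: /private
```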
## Mixing rules
You can combine rules for all crawlers with rules for specific bots:
```php
'content' => [
    '*' => [
        'Allow' => ['/'],
        'Disallow' => ['/panel', '/content'],
    ],
    'GPTBot' => [
        'Disallow' => ['/'],
    ],
],
```
The `Sitemap:` line is added automatically if the [sitemap module](1_features/01_sitemap) is active. You can override it with the `robots.sitemap` option.
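If you need to point crawlers at a different sitemap, the override could look like this sketch (the URL is a placeholder):

```php
// site/config/config.php
'tobimori.seo' => [
    'robots' => [
        'sitemap' => 'https://example.com/custom-sitemap.xml',
    ],
],
```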