feat: integrate Kirby SEO plugin
All checks were successful
Deploy / Deploy to Production (push) Successful in 22s
- Add tobimori/kirby-seo via Composer
- snippet('seo/head') in header.php (replaces the manual meta tags)
- snippet('seo/schemas') in footer.php for JSON-LD
- SEO tab added to site.yml and all page blueprints
- SEO configuration in config.php (sitemap, robots, canonicalBase TODO)
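The snippet wiring described above amounts to two one-line calls in the templates. A sketch, not the actual diff; the surrounding markup is assumed:

```php
<?php /* site/snippets/header.php (excerpt, assumed structure) */ ?>
<head>
  <meta charset="utf-8">
  <?php snippet('seo/head') ?>
</head>

<?php /* site/snippets/footer.php (excerpt, assumed structure) */ ?>
  <?php snippet('seo/schemas') ?>
</body>
```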
Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
parent baab2fb3a1 · commit 58c31ea391
133 changed files with 9201 additions and 253 deletions
site/plugins/kirby-seo/docs/2_customization/02_robots-txt.md (new file, 66 lines)
---
title: Customizing robots.txt
intro: Add custom rules to your robots.txt
---
By default, Kirby SEO generates a simple `robots.txt` that allows all crawlers and blocks the Panel. If you need to add your own rules, use the `robots.content` option.

## Blocking specific bots

Some AI providers crawl websites to use the content as training data. You can block their crawlers:
```php
<?php
// site/config/config.php

return [
  'tobimori.seo' => [
    'robots' => [
      'content' => [
        'GPTBot' => [
          'Disallow' => ['/'],
        ],
        'Google-Extended' => [
          'Disallow' => ['/'],
        ],
        'CCBot' => [
          'Disallow' => ['/'],
        ],
      ],
    ],
  ],
];
```
This adds rules for each bot while keeping the default rules for all other crawlers intact.
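With that configuration, the generated `robots.txt` should look roughly like the following (a sketch: the default rule for `*` is assumed from the description above, the exact output may differ by plugin version, and the sitemap URL is a placeholder that only appears if the sitemap module is active):

```
User-agent: *
Disallow: /panel

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: CCBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
```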
## Custom rules for all crawlers

If you set rules for `*`, they replace the default rules entirely:
```php
'content' => [
  '*' => [
    'Allow' => ['/'],
    'Disallow' => ['/panel', '/content', '/private'],
  ],
],
```
## Mixing rules

You can combine rules for all crawlers with rules for specific bots:
```php
'content' => [
  '*' => [
    'Allow' => ['/'],
    'Disallow' => ['/panel', '/content'],
  ],
  'GPTBot' => [
    'Disallow' => ['/'],
  ],
],
```
The `Sitemap:` line is added automatically if the [sitemap module](1_features/01_sitemap) is active. You can override it with the `robots.sitemap` option.
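The `robots.sitemap` override might look like this (a sketch; the URL is a placeholder):

```php
// site/config/config.php (excerpt)
'tobimori.seo' => [
  'robots' => [
    'sitemap' => 'https://example.com/sitemap-index.xml',
  ],
],
```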