Robots.txt #91

Open
maehr opened this issue Jun 6, 2024 · 2 comments

Comments

maehr (Contributor) commented Jun 6, 2024

Describe what feature you'd like. Pseudo-code, mockups, or screenshots of similar solutions are encouraged!

Hi guys, we implemented a robots.txt over at Stadt-Geschichte-Basel#113

Do you want me to open a PR for CB-CSV as well?

What type of pull request would this be?

New Feature

Any links to similar examples or other references we should review?

No response

evanwill (Contributor) commented Jun 6, 2024

@maehr I like the idea of it in "utilities". However, robots.txt should only be at the root of a domain or subdomain, and only one per domain. I think the majority of CB projects are likely not at the root, so it might just end up being inapplicable and potentially confusing. The noindex: true option will cover adding robots meta tags to individual pages.
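For reference, a per-page noindex flag of this kind is typically handled in the theme's head include along these lines. This is a minimal sketch: the include path and the exact meta directives are assumptions, not CB's actual code.

```liquid
{%- comment -%}
  Hypothetical snippet for an _includes/head.html-style file: when a
  page sets `noindex: true` in its front matter, emit a robots meta
  tag so crawlers skip that page only.
{%- endcomment -%}
{% if page.noindex %}
<meta name="robots" content="noindex, nofollow">
{% endif %}
```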

I think "how to add a robots.txt" might be a good as a cb-docs "advanced" topic, with the code for the example template? Then people who are actually at a subdomain root or root can add one in and learn more about why.

maehr (Contributor, Author) commented Jun 7, 2024

I will work on the docs and get back to you.
PS: I could add a check for baseurl: to make sure it's not hosted in a subfolder.
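A rough sketch of what that guard could look like inside the template, assuming the Jekyll convention that an empty site.baseurl means the site is served from the domain (or subdomain) root; this is illustrative, not the planned implementation.

```liquid
{%- comment -%}
  Only emit crawl rules when the site is served from the root;
  a robots.txt buried in a subfolder is never requested by crawlers.
{%- endcomment -%}
{% if site.baseurl == nil or site.baseurl == "" %}
User-agent: *
Disallow:

Sitemap: {{ "/sitemap.xml" | absolute_url }}
{% else %}
# site.baseurl is "{{ site.baseurl }}": this site lives in a subfolder,
# so this file will not be served from the domain root.
{% endif %}
```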
