Friendly Robots makes adding a dynamic robots.txt file to your Umbraco website easy!
Friendly Robots is supported on Umbraco 8.1+.
Friendly Robots is available from Our Umbraco, NuGet, or as a manual download directly from GitHub.
You can find a downloadable package, along with a discussion forum for this package, on the Our Umbraco site.
To install from NuGet, run the following command in the Package Manager Console in Visual Studio.
PM> Install-Package Our.Umbraco.FriendlyRobots
Once installed, the robots.txt file will be visible at the URL /robots.txt, such as https://www.yoursite.com/robots.txt.
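With the default allow-all configuration (described below), the generated file contains a simple ruleset along these lines; this is a sketch, and the exact output may differ:

User-agent: *
Allow: /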
If a physical robots.txt file exists in the root of the website, it will override the dynamically generated file.
By default the robots.txt allows all traffic, but it can be configured via a selection of app settings in the web.config.
Unless an alternative value is supplied, the "useragent" field will be a catch-all ("*").
<add key="Umbraco.Robots.UserAgent" value="*" />
Multiple values can be supplied for each of the "allow", "disallow", and "sitemaps" URL fields as a comma-separated list, like this:
<add key="Umbraco.Robots.Disallow" value="/some-path/,/some-other-path/" />
The project wiki contains further details about the advanced configuration options available.
To raise a new bug, create an issue on the GitHub repository. To fix a bug or add new features, fork the repository and send a pull request with your changes. Feel free to add ideas to the repository's issues list if you would like to discuss anything related to the library.
This project is maintained by Callum Whyte and contributors. If you have any questions about the project please get in touch on Twitter, or by raising an issue on GitHub.
The logo uses the Robot icon from the Noun Project by Adrien Coquet, licensed under CC BY 3.0 US.
Copyright © 2021 Callum Whyte, and other contributors
Licensed under the MIT License.