This NGINX module enforces the rules in robots.txt for web crawlers that choose
to disregard those rules.
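
As an illustration of the idea, suppose the site serves a robots.txt like the one below (the bot name is a placeholder). The intent of the module is that a crawler identifying itself as `ExampleBot` which requests a path under `/private/` anyway would be refused, even though it ignored robots.txt. The exact directives and response code are not settled yet; see Configuration support below.

```
# Example robots.txt served by the site (placeholder bot name)
User-agent: ExampleBot
Disallow: /private/

User-agent: *
Allow: /
```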
The current code builds but has not been tested and is missing major pieces of functionality; see Configuration support in particular.
See the Contributor Guide if you'd like to submit changes.
- Configure NGINX to block specific user agents directly (see NGINX configuration for AI web crawlers), although this doesn't share the configuration in robots.txt; a minimal sketch follows this list
- Roboo protects against robots that do not implement certain browser features
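
For comparison, here is a minimal sketch of the first alternative above: blocking crawlers by user agent in plain NGINX configuration, with no connection to robots.txt. The bot names are illustrative placeholders, not a maintained list; the `map` block belongs in the `http` context.

```nginx
# Mark requests whose User-Agent matches any listed crawler (placeholder names).
map $http_user_agent $blocked_crawler {
    default          0;
    ~*ExampleBot     1;
    ~*AnotherCrawler 1;
}

server {
    listen 80;
    server_name example.com;

    # Reject matched crawlers before serving any content.
    if ($blocked_crawler) {
        return 403;
    }

    location / {
        root /var/www/html;
    }
}
```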