WORK IN PROGRESS
This NGINX module enforces the rules in robots.txt for web crawlers that choose
to disregard those rules.
See the Contributor Guide if you'd like to submit changes.
Alternative approaches:

- Configure NGINX to block specific user agents, although this doesn't share the configuration in robots.txt.
- Roboo protects against robots that do not implement certain browser features.
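The first alternative can be sketched as a plain NGINX configuration. This is a minimal illustration, not part of this module: the crawler names below are placeholders, and the block list must be maintained by hand, separately from robots.txt.

```nginx
# Hypothetical sketch: deny requests from specific crawler user agents.
# "BadBot" and "EvilScraper" are placeholder names, not real crawlers.
# The map directive belongs in the http{} context of nginx.conf.
map $http_user_agent $blocked_agent {
    default         0;
    ~*BadBot        1;
    ~*EvilScraper   1;
}

server {
    listen 80;

    location / {
        # Reject matched agents with 403 Forbidden.
        if ($blocked_agent) {
            return 403;
        }
        # ... normal request handling ...
    }
}
```

Note that this duplicates policy already expressed in robots.txt, which is exactly the drawback the bullet above describes.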