Motivation
AI crawlers are a massive issue on the web nowadays. Bad-acting ones, which disrespect robots.txt, often behave effectively as a DDoS against the application/website.
Proposed solution
I'd like to suggest adding new rules specifically targeted at detecting malicious crawlers (through more than just the User-Agent header, as that's easy to spoof) in order to better protect against such cases.
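To illustrate the kind of behavioral detection meant here (as opposed to User-Agent matching), below is a minimal, hypothetical sketch in Python. It flags a client using two signals a spoofed User-Agent can't hide: request rate within a sliding window, and requests for paths disallowed in robots.txt, which well-behaved crawlers never fetch. The class name, thresholds, and disallowed paths are all illustrative assumptions, not part of any existing ruleset.

```python
import time
from collections import defaultdict, deque


class CrawlerDetector:
    """Hypothetical sketch: flag clients by behavior, not User-Agent alone.

    Signals used:
      1. Request rate within a sliding time window (rough DDoS heuristic).
      2. Hits on robots.txt-disallowed paths (honest crawlers skip these).
    """

    def __init__(self, max_requests=100, window_seconds=10,
                 disallowed=("/admin", "/private")):
        self.max_requests = max_requests      # requests allowed per window
        self.window = window_seconds          # sliding window length (s)
        self.disallowed = disallowed          # paths listed in robots.txt
        self.hits = defaultdict(deque)        # per-IP request timestamps

    def observe(self, ip, path, now=None):
        """Record one request; return True if the client looks abusive."""
        now = time.monotonic() if now is None else now
        q = self.hits[ip]
        q.append(now)
        # Drop timestamps that have fallen outside the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        # Fetching a disallowed path is an immediate red flag.
        if any(path.startswith(p) for p in self.disallowed):
            return True
        # Otherwise, flag only if the rate limit is exceeded.
        return len(q) > self.max_requests
```

In a real rule this logic would live in the WAF's own rule language rather than application code, but the signals (rate plus robots.txt honeypot paths) are the point.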
Alternatives
I've considered trying tools such as https://github.com/fabriziosalmi/patterns, which build on top of OWASP, but I think it'd be best if this project provided such rules itself.
Additional context
AI crawlers have been a huge problem, especially for smaller websites. Most of those rely on OWASP and services like Cloudflare for defense.
Cloudflare's bot protection, however, is quite limited, especially on the free tier, so it'd be great if OWASP offered some level of bot/AI-crawler protection.