
Block AI Crawlers

kramttocs

Regular Contributor
[Edit: When hosting an exposed, internet-facing site] Is there currently a way to flat-out block AI crawlers at the router level?

User Agent blocking?

Edit: I am handling it at the nginx level per host, but I'm asking whether the router (or Skynet) can handle it at that layer.
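
For reference, a minimal sketch of what per-host User-Agent blocking at the nginx level can look like. The file path and bot names below are illustrative assumptions, not an exhaustive or authoritative list:

    # /etc/nginx/conf.d/block-ai-bots.conf (example path; include from the http context)
    # Flag requests whose User-Agent matches a known AI crawler.
    map $http_user_agent $ai_bot {
        default        0;
        ~*GPTBot       1;
        ~*ClaudeBot    1;
        ~*CCBot        1;
        ~*Bytespider   1;
    }

    # Then, inside each server block that should refuse those crawlers:
    server {
        listen 80;
        server_name example.com;

        if ($ai_bot) {
            return 403;
        }
    }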
 
Sure. Don't have a web site behind the router, and don't open or forward any ports.
 
Well, yes but...
 
AI and web crawlers look for web sites. If you have no ports open or forwarded, and you have AiCloud etc. disabled, the crawlers will be stopped by the router firewall.
 
My fault. I didn't expect that anyone who isn't hosting a site would ask how to block them. I've updated the first post to clarify.
 
I do appreciate the replies, but you aren't understanding my question.

Maybe someone else on the forum knows a way though.
 
https://serverfault.com/questions/690870/iptables-block-user-agent
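
That link covers matching the User-Agent string at the packet level. A rough sketch of such a rule on the router, with the bot name as a placeholder; note this only works for plain HTTP, since headers are encrypted inside HTTPS:

    # Drop forwarded HTTP traffic whose payload contains the given
    # User-Agent substring. Cannot see inside TLS (HTTPS) traffic.
    iptables -I FORWARD -p tcp --dport 80 \
        -m string --string "GPTBot" --algo bm -j DROP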

Maybe there is something within Entware, but I personally run Fail2Ban on the Nginx server (offload everything from the router as much as possible) and add offender IPs from fail2ban.log to a blacklist that then gets added to an IPSET block on the router.
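
A rough sketch of the IPSET side of that workflow; the set name, timeout, and address are placeholders, and feeding the set from fail2ban.log is left to whatever script parses the offenders out:

    # Create a blacklist set whose entries expire after 24 hours.
    ipset create ai_blacklist hash:ip timeout 86400

    # Add an offender IP (e.g. one parsed from fail2ban.log).
    ipset add ai_blacklist 203.0.113.50

    # Drop forwarded traffic from any address in the set.
    iptables -I FORWARD -m set --match-set ai_blacklist src -j DROP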
 
Thanks. I am also using Fail2Ban; it's actually what catches the status codes nginx is spitting out and bans the offenders (temporarily), which does work well.
Fair point on keeping things off the router.
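
For anyone wanting to reproduce that temporary-ban setup, a minimal illustrative jail and filter that ban clients racking up 403s in the nginx access log. Names, paths, and thresholds here are assumptions, not a drop-in config:

    # /etc/fail2ban/jail.local (example)
    [nginx-ai-bots]
    enabled  = true
    port     = http,https
    filter   = nginx-ai-bots
    logpath  = /var/log/nginx/access.log
    maxretry = 3
    bantime  = 3600

    # /etc/fail2ban/filter.d/nginx-ai-bots.conf (example)
    [Definition]
    failregex = ^<HOST> .* "(GET|POST|HEAD) [^"]*" 403
    ignoreregex =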
 
