Anubis - Weigh the soul of incoming HTTP requests using proof-of-work to stop AI crawlers (https://anubis.techaro.lol)
I'll give that a try. Maybe it can reduce the AI crawler mess on my servers a little.
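For flavor, here's a minimal sketch of the proof-of-work idea behind tools like Anubis (not Anubis's actual code, and the challenge value is made up): the client has to burn CPU finding a nonce whose hash starts with a required prefix before it gets through. Cheap for one browser, expensive for a crawler hitting millions of pages.

```bash
#!/usr/bin/env bash
# Illustrative proof-of-work loop (NOT Anubis's real implementation):
# find a nonce so that sha256(challenge + nonce) starts with N zero hex
# digits. Real tools do this in client-side JavaScript; bash is slow,
# this is just to show the principle.
challenge="demo-challenge"   # hypothetical; a real server issues this per visitor
difficulty=3                 # required number of leading zero hex digits
prefix=$(printf '0%.0s' $(seq "$difficulty"))
nonce=0
while :; do
  hash=$(printf '%s%s' "$challenge" "$nonce" | sha256sum | awk '{print $1}')
  [[ $hash == "$prefix"* ]] && break
  nonce=$((nonce + 1))
done
echo "solved: nonce=$nonce hash=$hash"
```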
Took extra time today to install plugins dedicated to keeping various #ai crawlers away from my #wordpress #blog. If you don't want your WordPress (not .com) website to be annexed into AI overviews or search results, there are plug-ins like "Block AI Crawlers" you can install :)
#artificialintelligence #MetaAI #chatgpt #noai #aicrawler
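For reference, the core of what such blocker plugins typically emit is a robots.txt deny list for known AI crawler user agents (these four tokens are publicly documented; well-behaved bots honor the file, the rude ones need server-side blocking on top):

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: ClaudeBot
Disallow: /
```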
Does anybody have a User Agent string for a current #Opera #OperaBrowser for me?
Asking to fight an #AICrawler spam wave.
More logfile analysis for MacPorts Trac today to fight the #AICrawler spambot wave… some 500-600 requests from PowerPCs running Mac OS X 10.6, 10.7, 10.8, 10.9, 10.10, 10.11, and 10.12.
I must have missed the memo from Apple on the extended OS support for PPC chips!
So according to the request statistics, since the last rotation of the access log file for the #MacPorts Trac this morning, there were:
20.8k requests from IE 3
20.9k requests from IE 4
21.3k requests from IE 5
43 requests from IE 6 and
23 requests from IE 7
These requests came from these Windows versions (roughly 4k per version): CE, 95, 98 (9.5k), NT 4, 2000, XP, NT 5.01(?!), Server 2003, Vista, 7, and 8.0.
I'm sure none of those are AI crawler bots.
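A rough sketch of the kind of log analysis behind these numbers, assuming a combined-format access log (the path is illustrative):

```bash
# Count requests per User-Agent in a combined-format access log to spot
# implausible spikes like tens of thousands of "MSIE 4.0" hits.
awk -F'"' '{print $6}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | head -20

# Or just count the ancient-IE hits directly:
grep -cE 'MSIE [3-7]\.' /var/log/nginx/access.log
```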
Cool idea. Trap disrespectful crawlers in an infinite maze...
The Minotaur is waiting for them.
via @clive
#Development #Techniques
Good sabotage for bad robots · Ways to block AI crawlers and safeguard your content https://ilo.im/15ztix
_____
#AI #AiCrawler #Copyright #Content #Website #Sabotage #RobotsTxt #Development #Frontend #Backend
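A toy sketch of the maze idea (all names and paths made up): every request under a robots.txt-disallowed prefix gets a page of fresh random links back into the same prefix, so only crawlers that ignore robots.txt ever enter, and those wander forever.

```bash
#!/usr/bin/env bash
# Toy CGI script for an "infinite maze": every page links to eight more
# nonexistent pages under the same prefix. Mount it at a path that
# robots.txt disallows so polite crawlers never see it.
echo "Content-Type: text/html"
echo
echo "<html><body><p>You are in a maze of twisty little passages.</p>"
for _ in $(seq 8); do
  link=$(tr -dc 'a-z' </dev/urandom | head -c 12)
  echo "<a href=\"/maze/$link\">$link</a><br>"
done
echo "</body></html>"
```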
Har har. Today I toyed with the idea of teaching my #NGINX reverse proxy to throw out all those shabby #AI crawlers that just want to leech free content for their LLMs and so on. In the process I came across the work of @cory@social.lol and @robb@social.lol and figured I'd combine the two in a #Bash script. The result will land in my Git in the coming days; today I'm too tired.
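Something in the spirit of that script could look like this: a sketch under assumed file paths that turns a plain-text User-Agent blocklist into an nginx map include.

```bash
#!/usr/bin/env bash
# Sketch: turn a plain-text list of AI-crawler User-Agent substrings
# into an nginx "map" include. File paths are assumptions; entries are
# treated as case-insensitive regexes, so escape metacharacters in them.
set -euo pipefail
list=/etc/nginx/ai-bots.txt        # one User-Agent substring per line
out=/etc/nginx/conf.d/ai-bots.map  # included from nginx.conf (http block)

{
  echo 'map $http_user_agent $is_ai_bot {'
  echo '    default 0;'
  while IFS= read -r ua; do
    [[ -z $ua || $ua == \#* ]] && continue   # skip blanks and comments
    printf '    "~*%s" 1;\n' "$ua"
  done < "$list"
  echo '}'
} > "$out"

nginx -t && systemctl reload nginx
```

A server block would then turn matches away with something like `if ($is_ai_bot) { return 403; }`.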