I was reading the Reddit thread on Claude AI crawlers effectively DDoSing the Linux Mint forums (https://libreddit.lunar.icu/r/linux/comments/1ceco4f/claude_ai_name_and_shame/) and wanted to block all AI crawlers from my self-hosted stuff.
I don’t trust crawlers to respect robots.txt, but you can generate one here: https://darkvisitors.com/
Since I use Caddy as my server, I wrote a snippet that blocks them based on their User-Agent header. The regex contents essentially come from darkvisitors.
Side note: there is a Caddy module for blocking crawlers as well, but it seemed like overkill to me: https://github.com/Xumeiquer/nobots
For anybody who is interested, here is the block_ai_crawlers.conf I wrote.
(blockAiCrawlers) {
	@blockAiCrawlers {
		header_regexp User-Agent "(?i)(Bytespider|CCBot|Diffbot|FacebookBot|Google-Extended|GPTBot|omgili|anthropic-ai|Claude-Web|ClaudeBot|cohere-ai)"
	}
	handle @blockAiCrawlers {
		abort
	}
}
# Usage:
# 1. Place this file next to your Caddyfile
# 2. Edit your Caddyfile as in the example below
#
# ```
# import block_ai_crawlers.conf
#
# www.mywebsite.com {
#   import blockAiCrawlers
#   reverse_proxy * localhost:3000
# }
# ```
I got meaner with them :3c
I just want you to know that was an amazing read, was actually thinking “It gets worse? Oh it does. Oh, IT GETS EVEN WORSE?”
The nobots module I’ve linked bombs them
This is one of the best things I’ve ever read.
I’d love to see a robots.txt do a couple of safe listings, then a zip bomb, then a safe listing. It would be fun to see how many log entries from an IP look like: get a, get b, get zip bomb… no more requests.
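A minimal sketch of that robots.txt (all paths are hypothetical): well-behaved crawlers honor the `Disallow` line and skip it, so only the ones ignoring the file ever fetch the path you wire to the zip bomb.

```
User-agent: *
Allow: /articles/a
Allow: /articles/b
# Compliant crawlers never request this path; serve the zip bomb here.
Disallow: /trap/
Allow: /articles/c
```

Note that `Allow` is an extension to the original robots.txt convention, though it is supported by the major search engines.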
Suggestion at the end:
<a class="boom" href="https://boom.arielaw.ar">hehe</a>
Wouldn’t it also hit Googlebot (and other search engines’ crawlers), getting your site delisted from search?
In dark mode, the anchor tags are difficult to read. They’re dark blue on a dark background. Perhaps consider something with a much higher contrast?
Apart from that, nice idea - I’m going to deploy the zip bomb today!
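For anyone who wants to try it, a gzip bomb can be built with standard tools; a minimal sketch (file name and sizes are arbitrary, scaled down here for illustration):

```shell
# A gzip bomb is a tiny file that expands enormously on decompression.
# Runs of zeros compress at roughly 1000:1 under deflate, so ~100 MB of
# zeros becomes a file of around 100 KB; increase count for a real bomb.
dd if=/dev/zero bs=1M count=100 status=none | gzip -9 > bomb.gz
ls -lh bomb.gz
```

Serving it with a `Content-Encoding: gzip` header makes the client inflate the whole thing on receipt.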
I’m a fan of hellpotting them.
Ooh, didn’t know about that one… thanks
We should do more than block them; they need to be teergrubed.
That’s an easy modification: just redirect or reverse-proxy to the tarpit instead of `abort`.
I was even thinking about an infinitely linked, data-poisoned HTML document, but there seemed to be no ready-made project that can generate one at the moment. (No published data-poisoning techniques for plain text at all, AFAIK, but there is one for images.)
Ultimately I decided to just abort the connection, as I don’t want my servers to waste traffic or CPU cycles.
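For reference, the tarpit variant of the snippet could look roughly like this, assuming a HellPot-style tarpit listening on localhost:8080 (the address is hypothetical) and reusing the same user-agent matcher:

```
(tarpitAiCrawlers) {
	@tarpitAiCrawlers {
		header_regexp User-Agent "(?i)(Bytespider|CCBot|Diffbot|FacebookBot|Google-Extended|GPTBot|omgili|anthropic-ai|Claude-Web|ClaudeBot|cohere-ai)"
	}
	handle @tarpitAiCrawlers {
		reverse_proxy localhost:8080
	}
}
```

Unlike `abort`, this keeps the connection open and feeds the crawler garbage at the tarpit’s pace, at the cost of some of your own traffic and sockets.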
Such a cool person making the video available for download