author    mms <michal@sapka.me>    2023-12-11 21:33:18 +0100
committer mms <michal@sapka.me>    2023-12-11 21:33:18 +0100
commit    5fd3c1d796123976c1896618cd6d6aab268119af (patch)
tree      045599d07da03ad474e7c25548cd7474b90a0726 /content/bsd
parent    032844bfa56a4e045e689a96be4b7037186707db (diff)
feat: ban bad bots
Diffstat (limited to 'content/bsd')
-rw-r--r--  content/bsd/blocking-bad-bots-openbsd.md  59
-rw-r--r--  content/bsd/home.md                         1
2 files changed, 60 insertions, 0 deletions
diff --git a/content/bsd/blocking-bad-bots-openbsd.md b/content/bsd/blocking-bad-bots-openbsd.md
new file mode 100644
index 0000000..121f6d0
--- /dev/null
+++ b/content/bsd/blocking-bad-bots-openbsd.md
@@ -0,0 +1,59 @@
+---
+title: "Blocking bad bots using Relayd"
+category:
+- bsd
+- update
+- bsd-update
+abstract:
+date: 2023-12-10T12:27:54+02:00
+---
+The bane of existence for most small pages: web crawlers.
+They generate most of the traffic this site sees and make my [site stats](https://michal.sapka.me/site/info/#site-stats) overly optimistic.
+We can go with [robots.txt](https://en.wikipedia.org/wiki/Robots_Exclusion_Protocol), but what if it's not enough?
+I can tell a well-behaved bot not to index some part of my site, but:
+a) some bots ignore it
+b) what if I don't want some bots to even have the chance to ask?
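+
+For comparison, a robots.txt rule only *asks* a crawler to stay away - it only works if the bot honors the protocol. A minimal example (the bot name is just illustrative):
+
+{{<highlight txt>}}
+User-agent: SemrushBot
+Disallow: /
+{{</highlight>}}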
+
+Get that SEO scanning and LLM training out of here!
+
+## Blocking crawlers
+
+The rest of this guide assumes my webstack: Relayd and Httpd.
+Relayd is great, and since it works at a higher level than pf, it can read HTTP headers. Luckily, those crawlers send recognizable "User-Agent" strings which we can block on.
+
+First, let's see who uses my site the most. Assuming you use the "forwarded" log style, we can do:
+
+{{<highlight shell>}}
+awk -F '"' '{print $6}' <path to log file> | sort | uniq -c | sort
+{{</highlight>}}
+
+Then we need to manually pick the agents we want to block. It won't be easy, as the strings are long and contain a lot of unnecessary information - including plain lies. You need to find a part of the full User-Agent that is distinctive and can be used for blocking.
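+
+To sanity-check a candidate substring, you can count how many requests it actually matches (a sketch - the log path and bot name here are only examples):
+
+{{<highlight shell>}}
+# case-insensitive count of log lines mentioning the substring
+grep -ci 'bytespider' /var/www/logs/access.log
+{{</highlight>}}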
+
+Then we can create block rules in a Relayd protocol. Relayd doesn't support regular expressions; instead it matches case-sensitive Lua globs, where a star matches any sequence of characters.
+
+{{<highlight shell>}}
+block request method "GET" header "User-Agent" value "*<common part>*"
+{{</highlight>}}
+
+Remember that Relayd config is last-match-wins, so the block rules should be the last ones to match. I just put them at the end of my config. You can create a `block quick...` rule if you want - it will short-circuit the rest of the protocol.
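+
+A `quick` variant of such a rule might look like the sketch below - check relayd.conf(5) for the exact keyword order on your release:
+
+{{<highlight shell>}}
+block request quick method "GET" header "User-Agent" value "*semrush*"
+{{</highlight>}}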
+
+Therefore, my "https" protocol now has a series of blocks:
+
+{{<highlight shell "linenos=inline">}}
+http protocol "https" {
+# most of the protocol omitted
+ block request method "GET" header "User-Agent" value "*Bytespider*"
+ block request method "GET" header "User-Agent" value "*ahrefs*"
+ block request method "GET" header "User-Agent" value "*censys*"
+ block request method "GET" header "User-Agent" value "*commoncrawl*"
+ block request method "GET" header "User-Agent" value "*dataforseo*"
+ block request method "GET" header "User-Agent" value "*mj12*"
+ block request method "GET" header "User-Agent" value "*semrush*"
+ block request method "GET" header "User-Agent" value "*webmeup*"
+ block request method "GET" header "User-Agent" value "*zoominfo*"
+}
+{{</highlight>}}
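+
+After editing the protocol, the new rules still have to be loaded. A minimal sketch, assuming doas is configured:
+
+{{<highlight shell>}}
+doas relayd -n        # configtest mode: check the file for syntax errors
+doas relayctl reload  # reload the configuration without restarting relayd
+{{</highlight>}}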
+
+*(using globs was proposed to me on the [OpenBSD mailing list](https://marc.info/?l=openbsd-misc&m=170206886109953&w=2))*
+
+
diff --git a/content/bsd/home.md b/content/bsd/home.md
index 0f5d9d5..2366758 100644
--- a/content/bsd/home.md
+++ b/content/bsd/home.md
@@ -32,3 +32,4 @@ Since at least a year, I've been a BSD type of a guy. My personal laptop is runn
- OpenBSD server
- [OpenBSD Amsterdam](/bsd/moved-to-openbsd/)
- [Webstack - Httpd(8), Relayd(8)](/bsd/moved-to-openbsd/#httpd8-and-relayd8)
+ - [Blocking bad bots and crawlers](/bsd/blocking-bad-bots-openbsd/)