Comment by jprjr
Re: "Made myself a personal capsule"
I created a script on my site at /tarpit/. Each page ends with a new "next page" link, and it just goes on forever. Responses also get slower and slower as you go. I made sure to list it in my robots.txt, so I'm hoping it wastes bad crawlers' time.
At a certain point of following the never-ending links you start getting chapters of The Damned by Algernon Blackwood. The first chapter is at the 64th link, the second is at the 4,096th link, and so on.
I think in theory it will generate 64^990 links (though there are only 9 chapters, so once you hit link 18014398509481984 you'll have all the chapters).
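The scheme described above (an endless chain of "next page" links, chapter k of the book surfacing at link number 64^k, and delays that grow with depth) could be sketched roughly like this. This is a hypothetical illustration, not the actual /tarpit/ script; the function names `chapter_at`, `render_page`, and `delay_for`, the delay formula, and the 30-second cap are all made up for the example:

```python
from typing import Optional

CHAPTERS = 9   # The Damned has 9 chapters, per the comment
BRANCH = 64    # chapter k is served at link number 64**k

def chapter_at(depth: int) -> Optional[int]:
    """Return which chapter (if any) to serve at this link depth."""
    n = BRANCH
    for k in range(1, CHAPTERS + 1):
        if depth == n:
            return k
        n *= BRANCH
    return None

def render_page(depth: int) -> str:
    """Build one tarpit page: optional chapter text plus a 'next page' link."""
    lines = [f"# Tarpit page {depth}"]
    k = chapter_at(depth)
    if k is not None:
        lines.append(f"(chapter {k} of The Damned would be inlined here)")
    # Gemtext-style link line pointing at the next page, so a crawler
    # that ignores robots.txt just keeps walking forever.
    lines.append(f"=> /tarpit/{depth + 1} next page")
    return "\n".join(lines)

def delay_for(depth: int, cap: float = 30.0) -> float:
    """Responses get slower the deeper a crawler goes, up to a cap."""
    return min(cap, 0.05 * depth)
```

A server handler would call `delay_for(depth)` before responding (e.g. via `time.sleep`) and return `render_page(depth)`; the cap keeps a curious human from timing out entirely.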
Feb 06 · 3 months ago
5 Later Comments
In the old days, you had to break a sweat to get crawled by bots; now you have to put in effort to not get crawled...
gritty · Feb 07 at 02:25:
I love the client-server interaction example on your gemlog. "let's fucking go!"
bsj38381 · Feb 07 at 03:00:
I'm thinking of adding a "crawler trap" to the HTTPS mirror of my Gemini capsule that's running on Web 1.0 hosting. Oh yeah, I should share that in my next post.
pirkka · Feb 07 at 08:48:
Only one post in your gemlog so far - post more!
bsj38381 · Feb 09 at 00:13:
I'll make sure to post more for this month.
Original Post
Made myself a personal capsule โ Hoping to get back into doing some writing! I've also got a fun surprise for crawlers that ignore robots.txt.