Comment by ๐ŸŽฎ jprjr

Re: "Made myself a personal capsule"

In: u/jprjr

I created a script on my site at /tarpit/. Each page ends with a new "next page" link, and it just goes on forever. Responses also get slower and slower as you go. I made sure to list it in my robots.txt, hoping it wastes bad crawlers' time.

At a certain point of following the never-ending links, you start getting chapters of The Damned by Algernon Blackwood. The first chapter's at the 64th link, the second at the 4,096th, and so on.

I think in theory it will generate 64^990 links (though there are only 9 chapters, so once you hit link 18,014,398,509,481,984 you'll have all the chapters).
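The mechanics above (a chapter at every power-of-64 link, a next link on every page, and a growing delay) could be sketched roughly like this. This is a hypothetical Python reconstruction, not the actual script; the /tarpit/ path is from the post, but the function names, the delay schedule, and the HTML shape are all made up for illustration:

```python
import time

CHAPTERS = 9  # "The Damned" has nine chapters in the post's scheme

def chapter_for(n):
    """Return the chapter to serve at link n, or None.

    Per the post, chapter k appears at link 64**k:
    chapter 1 at link 64, chapter 2 at link 4096, ... chapter 9 at 64**9.
    """
    k, v = 1, 64
    while k <= CHAPTERS:
        if n == v:
            return k
        v *= 64
        k += 1
    return None

def delay_for(n):
    """Seconds to sleep before answering page n (assumed schedule).

    Grows with depth so crawlers get slower responses the deeper they go,
    capped so a single request never hangs outright.
    """
    return min(30.0, 0.1 * n ** 0.5)

def render_page(n):
    """Render tarpit page n: maybe a chapter, always a fresh next link."""
    body = []
    ch = chapter_for(n)
    if ch is not None:
        body.append(f"<h1>The Damned, chapter {ch}</h1>")
    body.append(f'<a href="/tarpit/{n + 1}">next page</a>')
    return "\n".join(body)

def serve(n):
    """Sleep, then return the page (what a request handler would call)."""
    time.sleep(delay_for(n))
    return render_page(n)
```

Pairing this with a robots.txt `Disallow: /tarpit/` line means only crawlers that ignore robots.txt ever wander in.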

๐ŸŽฎ jprjr [OP]

Feb 06 ยท 3 months ago

5 Later Comments โ†“

๐Ÿ›ž cbpn ยท Feb 06 at 22:00:

In the old days, you had to break a sweat to get crawled by bots, now you have to put in effort to not get crawled...

๐Ÿ€ gritty ยท Feb 07 at 02:25:

I love the client server interaction example on your gemlog. "let's fucking go!"

๐Ÿฆ” bsj38381 ยท Feb 07 at 03:00:

I'm thinking of adding a "crawler trap" for my https mirror of my Gemini capsule that's running on Web 1.0 hosting. Oh yeah, I should share that on my next post

๐Ÿ pirkka ยท Feb 07 at 08:48:

Only one post in your gemlog - post more!

๐Ÿฆ” bsj38381 ยท Feb 09 at 00:13:

I'll make sure to post more for this month.

Original Post

๐ŸŽฎ jprjr

โ€” buffering.party/

Made myself a personal capsule โ€” Hoping to get back into doing some writing! I've also got a fun surprise for crawlers that ignore robots.txt.

๐Ÿ’ฌ 7 comments ยท 2 likes ยท Feb 06 ยท 3 months ago