Is there a smolnet search engine/search aggregator for multiple protocols at once? (gemini, gopher, nex, etc)

I was wondering if there was a multi-protocol smolnet search engine I could set as the default for Lagrange.

Posted in: s/SmallWeb

🚀 Copenhagen_Bram

2025-06-20 · 11 months ago

5 Comments ↓

🚀 clseibold [🛂] · 2025-06-20 at 23:13:

AuraGem Search supports gemini, nex, scroll, and spartan. It does not yet support gopher, but I intend to add this at some point in the future.

However, please note that my search engine is being migrated from Firebird to PostgreSQL today, so searching will not work for now. I'll have it fixed/finished by tomorrow.

— AuraGem Search

🚀 Copenhagen_Bram [OP] · 2025-06-21 at 02:15:

Awesome! Hope you get it back up soon. Looking forward to seeing it search gopher too.

🚀 clseibold [🛂] · 2025-06-21 at 04:05:

@Copenhagen_Bram The search engine is back up now, but the crawler is currently running to rebuild the database. Also, search querying has changed so that search terms are ANDed together rather than ORed, because that's the default in Postgres, but I intend to fix this soon.
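For context: Postgres's `plainto_tsquery()` joins terms with the `&` (AND) operator, so every term must match. One way to restore OR semantics is to build the tsquery string yourself and pass it to `to_tsquery()`. A minimal sketch in Go, assuming simple single-word terms (the `orQuery` helper is hypothetical, not AuraGem's actual code):

```go
package main

import (
	"fmt"
	"strings"
)

// orQuery joins whitespace-separated search terms with tsquery's
// "|" (OR) operator, so the result can be passed to Postgres's
// to_tsquery(). Note: real input would need sanitizing, since
// tsquery syntax characters in a term would break the query.
func orQuery(input string) string {
	return strings.Join(strings.Fields(input), " | ")
}

func main() {
	fmt.Println(orQuery("gemini gopher search"))
	// The resulting string would be used as the parameter in e.g.:
	//   SELECT ... WHERE tsv @@ to_tsquery('english', $1)
}
```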

The stats page provides info on the crawler and feed aggregator crawler anytime they are running:

— AuraGem Search Stats

It also shows the current capsule being crawled by the capsule on-demand crawler. It goes one capsule at a time. You can submit your capsule to be crawled on-demand at this link:

— Crawl Capsule On-Demand

โ˜ฏ๏ธ dragfyre ยท 2025-06-22 at 11:45:

@clseibold are there significant changes that need to be done to get it searching on gopher, or is it more about flipping the right switches and running the crawler?

🚀 clseibold [🛂] · 2025-06-22 at 22:50:

@dragfyre Aside from writing a gopher client and gophermap parser in Golang, the only other thing is figuring out how I'm going to store the info in the database.
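(For anyone curious, the gophermap format described in RFC 1436 is simple: each menu line is a one-character itemtype, a display string, then tab-separated selector, host, and port. A minimal parsing sketch in Go, assuming well-formed input; this is hypothetical, not AuraGem's actual parser:)

```go
package main

import (
	"fmt"
	"strings"
)

// Item is one parsed gophermap menu line (RFC 1436).
type Item struct {
	Type     byte   // itemtype: '0' text file, '1' submenu, 'i' info, etc.
	Display  string // user-visible display string
	Selector string // selector sent to the server to fetch the item
	Host     string
	Port     string
}

// parseLine splits one gophermap line into its tab-separated fields.
func parseLine(line string) (Item, error) {
	line = strings.TrimRight(line, "\r\n")
	if line == "" {
		return Item{}, fmt.Errorf("empty line")
	}
	f := strings.Split(line[1:], "\t")
	if len(f) < 4 {
		return Item{}, fmt.Errorf("malformed gophermap line: %q", line)
	}
	return Item{Type: line[0], Display: f[0], Selector: f[1], Host: f[2], Port: f[3]}, nil
}

func main() {
	it, err := parseLine("1Floodgap Home\t/home\tgopher.floodgap.com\t70\r\n")
	if err != nil {
		panic(err)
	}
	fmt.Println(string(it.Type), it.Display, it.Selector, it.Host, it.Port)
}
```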

Some gopher search engines will store all of the itemtypes of a gophermap in the db, but that's a different model from how my db works for gemini, nex, scroll, and spartan.

Currently, I save every *url* and details of its content, one row for each url, whereas a gopher search engine might save multiple rows of gophermap items per path. So it's definitely a little different, particularly because a gophermap item doesn't necessarily correlate to a URL, let alone a gopher resource (there are other itemtypes, like informational text lines and WAIS, etc.).
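(To illustrate the mismatch: mapping a parsed gophermap item's fields into a gopher URL per RFC 4266 only works for itemtypes that name a fetchable resource; info (`i`) and error (`3`) lines have no target at all. A hypothetical sketch of such a mapping in Go, not AuraGem's actual code:)

```go
package main

import "fmt"

// itemURL builds a gopher URL (RFC 4266 style: the itemtype
// character precedes the selector in the path) from one gophermap
// item's fields. Info ('i') and error ('3') lines describe no
// fetchable resource, so they yield no URL and ok == false.
func itemURL(itemtype byte, selector, host, port string) (url string, ok bool) {
	switch itemtype {
	case 'i', '3':
		return "", false
	}
	return fmt.Sprintf("gopher://%s:%s/%c%s", host, port, itemtype, selector), true
}

func main() {
	if u, ok := itemURL('1', "/home", "gopher.floodgap.com", "70"); ok {
		fmt.Println(u)
	}
	_, ok := itemURL('i', "Welcome to my gopherhole!", "fake.host", "1")
	fmt.Println(ok) // info lines produce no URL
}
```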

But I think I can adapt the gopher model into the current model that I use. Idk, we'll see.