geminispace.info

gemini search engine
git clone https://git.clttr.info/geminispace.info.git

commit b99b541e900cb19ee1e2d295702a162df33115eb
parent f5ae878d3dbecb37a6f917e7ff736fa315aaa2d5
Author: René Wagner <rwa@clttr.info>
Date:   Mon, 18 Sep 2023 10:49:52 +0200

switch links to geminiprotocol.net

Diffstat:
M serve/templates/about.gmi                   | 3 +--
M serve/templates/documentation/indexing.gmi  | 8 +++-----
M serve/templates/documentation/searching.gmi | 2 +-
M serve/templates/news.gmi                    | 2 +-
4 files changed, 6 insertions(+), 9 deletions(-)

diff --git a/serve/templates/about.gmi b/serve/templates/about.gmi
@@ -37,8 +37,7 @@ I've you prefer direct personal contact feel free to use the link below instead:
 
 Gemini is an application-level internet protocol for the distribution of arbitrary files, with some special consideration for serving a lightweight hypertext format which facilitates linking between files. It's been described as both "the web, stripped right back to its essence" as well as "Gopher, souped up and modernised a little". If you want to learn more, you should head over to the Gemini Project site itself!
 
-
-=> gemini://gemini.circumlunar.space Gemini Project Homepage
+=> gemini://geminiprotocol.net Gemini Project Homepage
 
 ### Documentation
 
diff --git a/serve/templates/documentation/indexing.gmi b/serve/templates/documentation/indexing.gmi
@@ -27,13 +27,11 @@ geminispace.info checks for specific return codes like 31 PERMANENT REDIRECT and
 When your capsule served an permanent redirect for some sort of stuff, geminispace.info will not re-crawl this stuff for at least a week.
 
 ### Controlling what geminispace.info indexes with a robots.txt
-To control crawling of your site, you can use a robots.txt file. Place it in your capsule's root directory such that a request for "robots.txt" will fetch it. It should be returned with a mimetype of `text/plain`.
-=> gemini://gemini.circumlunar.space/docs/companion/robots.gmi See the robots.txt companion spec for more details.
+To control crawling of your site, you can use a "robots.txt" file. Place it in your capsule's root directory such that a request for "robots.txt" will fetch it. It should be returned with a mimetype of `text/plain`.:
+=> gemini://geminiprotocol.net/docs/companion/robots.gmi See the robots.txt companion spec for more details.
 
 When interpreting a robots.txt, geminispace.info will use the first line that matches the URI that should be visited.
-Be sure to sort your rules accordingly if you want use exhaustive rules with wildcards or the "Allow" rule that is not specificed in the companion spec.
-
-Your robots.txt should not include blank lines within a ruleset. Rules after a blank line will be ignored until the next ruleset is started with a "User-agent:" line.
+Keep your robots file as simple as possible, avoid empty lines, wildcards and similar stuff, just stick to the rules defined in the companion spec.
 
 geminispace.info obeys the following user-agents, listed in descending priority:
 * gus
diff --git a/serve/templates/documentation/searching.gmi b/serve/templates/documentation/searching.gmi
@@ -21,7 +21,7 @@ To filter by one of these, simply add it to your query followed by a colon, and
 
 => /search?content_type%3Aimage/jpeg image/jpeg
 => /search?content_type%3Ainput input
-=> /search?domain%3Acircumlunar domain:circumlunar
+=> /search?domain%3Ageminiprotocol domain:geminiprotocol
 => /search?contextual%20domain%3Agus contextual domain:gus
 => /search?computers%20content_type%3Agemini%20AND%20NOT%20charset%3AUS-ASCII computers content_type:gemini AND NOT charset:US-ASCII
 
diff --git a/serve/templates/news.gmi b/serve/templates/news.gmi
@@ -12,7 +12,7 @@ The provisions for this feature have been in the code for quite a long time, but
 
 ### 2023-07-29 robots.txt
 Dear capsule pilots (especially those who run mirrors or gateways of bloated web site): Please put a robots.txt in place.
-=> gemini://gemini.circumlunar.space/docs/companion/robots.gmi robots.txt companion spec at circumlunar
+=> gemini://geminiprotocol.net/docs/companion/robots.gmi robots.txt companion spec at circumlunar
 
 ### 2023-06-07 server switch
 Be welcome on our brand new production instance sporting Debian 12.
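The indexing documentation touched by this commit states that geminispace.info uses the first robots.txt rule that matches the URI to be visited. A minimal sketch of that first-match semantics, assuming a plain `User-agent:`/`Disallow:` ruleset as recommended there (the `is_allowed` helper and the sample rules are hypothetical, not geminispace.info's actual code):

```python
# Illustrative first-match interpretation of a simple robots.txt,
# per the companion-spec style ruleset the indexing notes recommend.
# This is a sketch, not geminispace.info's real crawler logic.

SAMPLE_ROBOTS = """User-agent: gus
Disallow: /mirrors/
Disallow: /cgi-bin/
"""

def is_allowed(robots_txt: str, path: str, agent: str = "gus") -> bool:
    """Return True if `path` may be crawled: the first Disallow rule
    whose prefix matches the path decides, later rules are ignored."""
    active = False  # inside a ruleset that applies to our user-agent?
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("user-agent:"):
            name = line.split(":", 1)[1].strip()
            active = name in (agent, "*")
        elif active and line.lower().startswith("disallow:"):
            prefix = line.split(":", 1)[1].strip()
            if prefix and path.startswith(prefix):
                return False  # first matching rule wins
    return True
```

For example, `is_allowed(SAMPLE_ROBOTS, "/mirrors/foo.gmi")` is False, while paths outside the disallowed prefixes, or requests from a user-agent the ruleset does not name, remain allowed.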