geminispace.info

gemini search engine
git clone https://git.clttr.info/geminispace.info.git

commit 3df47bfd42a8ea5142333e0cf5fa652752a04363
parent a38b003bd5b90529d98350f162e7403f25353b7e
Author: René Wagner <rwagner@rw-net.de>
Date:   Wed, 27 Jan 2021 13:35:54 +0100

add "/robots.txt" route to views.py

It's a hard-coded approach to serve a robots.txt to other crawlers.
No crawler may access /add-seed and /threads, and the relevant virtual agents
(researcher, indexer, archiver) may not access /search and /backlinks.

Diffstat:
M serve/views.py | 4 ++++
1 file changed, 4 insertions(+), 0 deletions(-)

diff --git a/serve/views.py b/serve/views.py
@@ -57,6 +57,10 @@ gus = GUS()
 def status(request):
     return Response(Status.SUCCESS, "text/plain", "ok")
 
+@app.route("/robots.txt", strict_trailing_slash=False)
+def status(request):
+    return Response(Status.SUCCESS, "text/plain",
+                    "User-agent: researcher\nUser-agent: indexer\nUser-agent: archiver\nDisallow: /search\nDisallow: /backlinks\n\nUser-agent: *\nDisallow: /add-seed\nDisallow: /threads")
 
 @app.route("/favicon.txt", strict_trailing_slash=False)
 def favicon(request):
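
As a minimal sketch (not part of the repository), the snippet below expands the
escaped body string from the diff above into its robots.txt form and feeds it to
Python's standard-library urllib.robotparser to confirm the intended access
rules. The agent name "somebot" is a hypothetical stand-in for any crawler that
is not listed explicitly.

    # Minimal sketch, not from the repository: parse the exact body string
    # returned by the new /robots.txt route and query the resulting rules.
    from urllib.robotparser import RobotFileParser

    body = (
        "User-agent: researcher\n"
        "User-agent: indexer\n"
        "User-agent: archiver\n"
        "Disallow: /search\n"
        "Disallow: /backlinks\n"
        "\n"
        "User-agent: *\n"
        "Disallow: /add-seed\n"
        "Disallow: /threads"
    )

    parser = RobotFileParser()
    parser.parse(body.splitlines())

    # The named virtual agents are kept away from the expensive dynamic pages:
    print(parser.can_fetch("indexer", "/search"))      # False
    print(parser.can_fetch("archiver", "/backlinks"))  # False

    # Any other crawler ("somebot" is a hypothetical name) is only barred
    # from /add-seed and /threads:
    print(parser.can_fetch("somebot", "/add-seed"))    # False
    print(parser.can_fetch("somebot", "/search"))      # True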