change request - make robots.txt work on fho

Ricky Elrod codeblock at elrod.me
Tue May 10 18:02:11 UTC 2011


Wanted to do this before the freeze but never had a chance -- let's get
robots.txt working on hosted1 to try to bring down CPU load and improve
page load times a bit.

Thoughts/+1's please. robots.txt has already been in place for a long
time, but nothing has told Apache to serve it, because requests currently go
straight to Trac.

robots.txt is set to block crawlers from accessing fh.o/*/browser/* (the
Trac source code browser), as per
https://fedorahosted.org/fedora-infrastructure/ticket/1848
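
For reference, the rules would be something along these lines (illustrative
only -- the exact contents of the file on hosted1 may differ, and the
wildcard in the path relies on the extended pattern matching that the big
crawlers like Googlebot support):

    # Keep crawlers out of every project's Trac source browser.
    # (hypothetical contents -- see the actual robots.txt on hosted1)
    User-agent: *
    Disallow: /*/browser/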



diff --git a/configs/web/fedorahosted.org.conf b/configs/web/fedorahosted.org.conf
index d0f7139..8f1f2e1 100644
--- a/configs/web/fedorahosted.org.conf
+++ b/configs/web/fedorahosted.org.conf
@@ -2,6 +2,9 @@
     ServerName fedorahosted.org
     ServerAlias www.fedorahosted.org

+    # Serve robots.txt directly from Apache instead of handing the request to Trac.
+    Alias /robots.txt /srv/web/fedorahosted.org/robots.txt
+
     Redirect 301 / https://fedorahosted.org/
 </VirtualHost>
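
Once it's pushed out, a quick sanity check from anywhere would be something
like this (just a sketch -- the point is that we get the Disallow rules back
instead of a Trac-rendered page or a 404):

    # -s: quiet output, -L: follow the 301 to https
    $ curl -sL https://fedorahosted.org/robots.txt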