[Koha-devel] Web crawlers hammered our Koha multi server

Rick Welykochy rick at praxis.com.au
Wed Jan 13 12:23:42 CET 2010


Owen Leonard wrote:

>> Addendum: also install a robots.txt file at the following location
>> in the Koha source tree:
>>
>>     opac/htdocs/robots.txt
>
> Isn't this already a part of a standard Koha installation, and even if
> not, isn't this all that is required to ward off search engine
> spiders?

No, IMHO: this is not part of the Koha 3 distro, AFAIK.

And yes: all you need to stop the spiders is a robots.txt
file at opac/htdocs/robots.txt.
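For example, a blanket robots.txt that tells all well-behaved
crawlers to stay out of the OPAC entirely would look like this
(just a sketch; note that robots.txt is advisory and only honoured
by compliant bots):

```
# opac/htdocs/robots.txt
# Disallow all crawling of the OPAC by compliant robots
User-agent: *
Disallow: /
```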


> Killing by default the ability to deep-link to Koha records would be a
> terrible thing to do.

Then don't use a robots.txt restriction.
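A middle ground is also possible: rather than an all-or-nothing
robots.txt, you can disallow only the expensive search scripts while
leaving record pages crawlable, so deep links to records still work.
The paths below assume the usual Koha OPAC URL layout (e.g.
/cgi-bin/koha/opac-search.pl); adjust them to match your install:

```
# opac/htdocs/robots.txt
# Block crawl-heavy search queries, but allow everything else
# (including deep links to individual records)
User-agent: *
Disallow: /cgi-bin/koha/opac-search.pl
```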


cheers
rickw


-- 
_________________________________
Rick Welykochy || Praxis Services

A computer is a state machine. Threads are for people who can't program state machines.
       -- Alan Cox


