Thomas Krichel wrote:
<snip>
I am sure they will heed your advice. When have libraries ever
taken part in a rat race? ;-) And in the slow march towards engine
visibility, they have not even started to get moving. If a search
engine can find that I live in Jackson Heights, NY, should it not
point my query for "Moby Dick", say, to the copy in a close-by
public library? It won't be able to. Library catalogs are not
visible to search engine crawlers unless libraries prepare a
complete browsable index to their holdings on the public web. I am
sure it's not difficult to set up such pages. It would probably take
me a couple of hours, with Koha, the system I am familiar with, to
set up an ugly and primitive one. Please correct me if I am wrong,
but it does not appear to be a standard feature of ILS software. It
ought to be.
</snip>
You are absolutely right! I have closed my (Koha) catalog to Googlebot crawls because it has crashed my server several times. But when I did allow it, the number of hits in my catalog went up dramatically. Since I catalog a lot of openly accessible websites, I thought those records would be useful to outside searchers, but I always assumed that knowing there is a copy of a physical book on the shelves of a small academic library at the top of the Janiculum Hill in Rome, Italy, was not all that useful to most people. Put the way you mention it, though, with GPS and location-aware search, maybe it would be useful. Or maybe not. I'll think about it.
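For reference, closing the catalog to Googlebot amounts to a couple of lines of robots.txt at the OPAC root; the /cgi-bin/koha/ path below is the usual Koha layout and only an assumption here, not copied from my installation:

# robots.txt -- block Google's crawler from the OPAC scripts
User-agent: Googlebot
Disallow: /cgi-bin/koha/

A gentler alternative would be to leave the OPAC open and simply turn the crawl rate down in Google's Webmaster Tools rather than blocking the crawler outright.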
In any case, I have looked into creating a Google sitemap, since it *seems* like not such a difficult thing to do, but I haven't had the time. Actually, it just occurred to me that this might be a perfect add-on for MarcEdit, or at any rate something to do with an XSLT (which is what I had been considering).
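As a first rough sketch of what such a generator might look like (Python, assuming the usual Koha OPAC detail URL of the form opac-detail.pl?biblionumber=N and a list of biblionumbers exported from the catalog, e.g. via a SQL report; nothing here is taken from an actual MarcEdit feature):

# minimal_sitemap.py -- a minimal, hypothetical sketch, not a finished tool.
# OPAC_BASE assumes the usual Koha OPAC URL pattern; adjust for a real site.
OPAC_BASE = "http://opac.example.edu/cgi-bin/koha/opac-detail.pl?biblionumber="

def write_sitemap(biblionumbers, path="sitemap.xml"):
    """Write one <url> entry per bibliographic record, in sitemap 0.9 format."""
    with open(path, "w", encoding="utf-8") as out:
        out.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        out.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for number in biblionumbers:
            out.write("  <url><loc>%s%d</loc></url>\n" % (OPAC_BASE, number))
        out.write("</urlset>\n")

if __name__ == "__main__":
    # Hypothetical record numbers; in practice read them from the catalog.
    write_sitemap([1, 2, 3])

One caveat worth remembering: the sitemap protocol caps a single file at 50,000 URLs, so a larger catalog would need a sitemap index pointing at several such files.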
James L. Weinheimer j.weinheimer_at_aur.edu
Director of Library and Information Services
The American University of Rome
Rome, Italy
First Thus: http://catalogingmatters.blogspot.com/
Received on Sun Feb 13 2011 - 13:04:44 EST