Re: Using controlled vocabularies to enhance search/browse

From: K.G. Schneider <kgs_at_nyob>
Date: Fri, 9 Jun 2006 10:56:30 -0700
To: NGC4LIB_at_listserv.nd.edu
> How much normalization can we afford to do in terms of staff
> time?  Subject terms?  Names?  Locations?  Genre?  And one of the
> most challenging, Dates?  (Although the CDL date normalization tool
> is a huge step forward in that area)  If people think creating
> metadata is expensive, try normalizing it.

I think it's essential that we assume we can't normalize metadata, and that
sometimes, for some very valuable content sets, metadata will need to be
teased out from Fibber McGee's closet through automated extraction.
Furthermore, we need to look at how much we spend on generating metadata and,
while recognizing that metadata is valuable, decide what the limits are on
how much we invest in human-generated metadata and how much we turn toward
machine-generated metadata and/or simpler metadata methods to
supplement/enhance/replace what we now generate by hand.
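To make the "machine-generated" end of that trade-off concrete, here is a
minimal sketch of the sort of cheap, automated date normalization the quoted
message alludes to (the tool CDL built is far more capable; the format list
and function name below are my own illustrative assumptions, not its API):

```python
# Hypothetical sketch of automated date normalization: try a handful of
# common bibliographic date formats and emit ISO 8601, or None on failure.
from datetime import datetime

# Illustrative assumption: the formats a given collection actually uses
# would be discovered by sampling the records, not hard-coded like this.
FORMATS = ["%Y-%m-%d", "%d %b %Y", "%B %d, %Y", "%m/%d/%Y"]

def normalize_date(raw):
    """Return an ISO 8601 date string, or None if no format matches."""
    raw = raw.strip()
    for fmt in FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    return None
```

The point of the sketch is the economics, not the code: a pass like this is
imperfect, but it costs seconds per record instead of staff minutes, and the
unparseable leftovers show you exactly where human effort is still needed.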

This morning I woke up with a very Dilbert image in my head: a pie chart.
The pie chart described line-item allocations for a mythical library, though
maybe I was thinking of LibraryLand. In terms of time, training, tools, and
materials, I wondered how big the wedges are for various library services...
for the OPAC versus the other databases, for findability versus controlled
vocabulary, etc.

Karen G. Schneider
kgs_at_bluehighways.com
Received on Fri Jun 09 2006 - 14:00:03 EDT