Re: ONIX data

From: Cory Rockliff <rockliff_at_nyob>
Date: Thu, 23 Dec 2010 13:40:26 -0500
To: NGC4LIB_at_LISTSERV.ND.EDU

On 12/23/10 1:08 PM, Weinheimer Jim wrote:
> Cory Rockliff wrote:
> <snip>
> Isn't this, to some extent, a false either-or? In the card era, a "Do it
> once, and do it right" mentality made perfect sense, since any change
> meant pulling cards and revving up the electric eraser, and what today
> is a simple global find-and-replace could mean months of labor.
> Nowadays, a more iterative approach to cataloging is possible, so
> perhaps the priority should be building better systems for collaborative
> editing and enhancement of bibliographic metadata, rather than trying to
> enforce standards.
> </snip>
>
> This is why I added the possible option 3, which, transferred to the food industry, would amount to doing a recall when quality control finds problems. While this is an option, the number of "recalls" must of course be kept to a minimum. What is that minimum: 90% compliance, or 50%? In library terms, this is equivalent to the amount of recataloging. While some changes may be fixable through global updates, this assumes the errors are consistent. But if it's a matter of lousy titles because the title in the ONIX record is actually the title of an earlier version that was later changed, or there are typos (which I know from my own experience), then the only way of dealing with it is to retrieve the materials. If we are dealing with XML documents, however, the title and much of the descriptive information could quite literally come from the item itself, and this could mean major savings. That is a ways off in the future, though.
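
(An aside to make that last point concrete: below is a minimal sketch of what pulling a title "from the item itself" might look like against an ONIX feed. It assumes an ONIX 2.1-style file using reference tags and no XML namespace; the file name onix_feed.xml is hypothetical, and real feeds -- short tags, ONIX 3.0, namespaced records -- would need adjusting for. Illustrative only, not a working ingest routine.)

    import xml.etree.ElementTree as ET

    def titles_from_onix(path):
        """Yield (identifier, title) pairs from an ONIX 2.1 reference-tag file."""
        # Parses the whole file at once; a large production feed would
        # more likely be streamed with ET.iterparse().
        tree = ET.parse(path)
        for product in tree.getroot().iter("Product"):
            # First <ProductIdentifier>'s <IDValue>, e.g. an ISBN.
            ident = product.findtext(".//ProductIdentifier/IDValue", default="?")
            # ONIX 2.1 reference tags: <Title><TitleText>...</TitleText></Title>
            title = product.findtext("Title/TitleText", default="?")
            yield ident, title

    if __name__ == "__main__":
        # "onix_feed.xml" is a stand-in name, not a real file.
        for ident, title in titles_from_onix("onix_feed.xml"):
            print(ident, title)
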
OK, but the key word in my statement was "iterative." To clarify, I'm 
not talking exclusively or even primarily about correcting systemic 
errors with global changes. I'm questioning the "do it once, and do it 
right" premise. To follow your analogy, yes--in our current ecosystem 
(OCLC, essentially), if one wanted to make a change to a record or 
record set that would then propagate to all participating libraries, it 
would be very much like doing a product recall (but possibly more 
painful). I don't think it needs to be this way, though. Standards 
aside, as Karen observed, bad data is bad data. But if the data's open 
and there are enough eyeballs on it, errors stand a better chance of 
being caught, and substandard data of being upgraded. Unfortunately, our 
current systems aren't designed for this approach.

Received on Thu Dec 23 2010 - 13:41:00 EST