>
> To bring this back to Dan Mullineux's earlier statement about RDF,
> this is an example of *exactly* what RDF is designed to do. It has
> the potential to reduce the granularity of bibliographic information,
> and all sorts of other metadata, to the statement level. It could
> allow making other assertions about those statements, such as who
> contributed them. The framework for a system that allows
> algorithmically selecting which bits of MARC data to use is already
> coming together. I'd argue this should be even more atomic than Tim
> suggests: down to the specificity of data currently found in a
> subfield or a leader byte.
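>
> A minimal sketch of what this could look like, assuming purely
> hypothetical identifiers (the ex: names below are stand-ins, not any
> real OCLC or MARC vocabulary) and using a named graph, one common
> way in RDF to make assertions about a statement:
>
>     PREFIX ex:  <http://example.org/>
>     PREFIX dct: <http://purl.org/dc/terms/>
>
>     INSERT DATA {
>       # One subfield's worth of data (a title string) as a single
>       # statement, kept in its own named graph so the statement
>       # itself can be the subject of further assertions.
>       GRAPH ex:stmt42 {
>         ex:record123 dct:title "An example title" .
>       }
>       # An assertion about that statement: who contributed it.
>       ex:stmt42 ex:contributedBy ex:doug .
>     }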
>
> Additionally, it could be used to more effectively integrate data from
> outside our walls into the library data corpus, and vice-versa.
>
> Imagine if OCLC's database were built around this principle, and a
> nightly SPARQL query could retrieve any statement Doug made about a
> resource when he added a field or subfield, and add it to the
> data-pool of a local catalog. Imagine then that another query could
> pull in the tags that Bob added to LibraryThing and the reviews
> Sarah posted on Amazon.
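>
> As a hedged sketch, that nightly pull could be a single CONSTRUCT
> query, again using the hypothetical ex: names from above (no real
> OCLC endpoint or schema is implied):
>
>     PREFIX ex: <http://example.org/>
>
>     # Build the set of triples to load into the local catalog:
>     # every statement carried in a graph that Doug contributed.
>     CONSTRUCT { ?s ?p ?o }
>     WHERE {
>       GRAPH ?g { ?s ?p ?o }
>       ?g ex:contributedBy ex:doug .
>     }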
>
> -Corey

I think this is all excellent, and I can see that this would go a long
way to solving the problems with keeping local catalogues accurate,
relevant, and useful in new ways. Perhaps the one concern is making sure
that the process of "integrated change control" (I have an instructor in
a project management course who relentlessly hammers away at this point)
at the local level is fully accounted for. As much as many view the
world through the glasses needed for e-books on a computer screen, most
cataloguers stand between the online world of organized metadata and the
awkwardness of the physically real.

I have occasionally had to split or change records when I found that
name headings or subject headings had split or changed. There is the
potential for greater fallout from this -- from having to change call
number labels on books to having to reissue printed bibliographies and
pathfinders. Needless to say, I have been frustrated when I've found
recently loaded MARC records for older material that have the obsolete
headings attached (particularly annoying for changed subject
subdivisions, since this breaks the linking mechanisms for our indexes
in our ILS; it also means I can't extol the virtues of messiness).
There has also been a general cataloguing convention of not going back
and fixing things: leaving obsolete call numbers on books is a concrete
example, but it also applies to not adding subject headings to our
older MARC records, or not spending much time overlaying our records
with newer ones. I despise that convention, and that is a big reason
for my interest in the next-gen catalogue.

So there have to be better integrated change control processes as part
of the next-gen catalogue. While local cataloguers may love to give up
the need to endlessly update and fix their local records by tapping
into a vibrant, link-rich world of centralized wiki-catalogues, there
is also a need to be notified about new data and to examine its
ramifications for access to, and the organization of, local holdings.
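
To make that notification idea concrete: a minimal sketch, assuming
hypothetical ex: predicates rather than any real authority-file
vocabulary, of a query a local system could run to flag holdings still
attached to a superseded heading:

    PREFIX ex: <http://example.org/>

    # Flag local items whose subject heading the shared data has
    # since marked as superseded, so a cataloguer can review the
    # fallout (labels, bibliographies, pathfinders) before anything
    # changes automatically.
    SELECT ?item ?oldHeading ?newHeading
    WHERE {
      ?item ex:subjectHeading ?oldHeading .
      ?oldHeading ex:supersededBy ?newHeading .
    }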
As a cataloguer, provided there are good integrated change control
mechanisms in place, I would actually see this as a better system
overall and a better use of my time. I would love to have the best and
most current data for my users, and I would like both my users'
contributions and my own judgements to be used as input in a truly
shared cataloguing environment.

Thomas Brenndorfer, B.A., M.L.I.S.
Guelph Public Library
100 Norfolk St.
Guelph, ON
N1H 4J6
(519) 824-6220 ext. 276
tbrenndorfer_at_library.guelph.on.ca