You can collect in two ways: Tim's way or my way, or a combination of both.
My way, or what I would call the quality way, means that you carefully look at
every title. Who likes "white noise"? I use the same criteria I use for
paper books.
But I have to confess that I am not a fundamentalist in that respect.
The Swedish equivalent of Project Gutenberg is Project Runeberg:
http://runeberg.org/
I have collected them all and, by creating new records from old ones taken
from our national catalogue Libris or from WorldCat, made the e-books part of
my library's catalogue and of the national one.
Try searching for "fri e-bok Runeberg" ("free e-book Runeberg") at:
http://webbgunda.ub.gu.se/cgi-bin/chameleon
Here is an example, the first part of a classic work:
Författare (Author)          Linnström, Hjalmar
Titel (Title)                Svenskt boklexikon [Elektronisk resurs] : åren 1830-1865. A-L
Deltitel (Part title)        1, A-L
Ort/förlag/år (Place/publisher/year)  Uppsala : Bokgillet ; [Stockholm] : [Seelig], 1961 ;
Omfång (Extent)              995 s.
SAB-klassning (SAB classification)    Aa-c/DR
Internet                     http://runeberg.org/linnstrom/1/
Bibliotek/hylla (Library/shelf)       Digitala biblioteket:
Se alla delar (See all parts)         997001272X
And here is the MARC record taken from the national catalogue:
001 10238157
003 LIBRIS
005 20061218153422.0
007 cr aa ---aaaaa
008 061016r196118nnsw sb 000 0 swe c
040 a Gdig
084 a Aa-c.01/DR 2 kssb/6
100 1 a Linnström, Hjalmar
245 0 0 a Svenskt boklexikon h [Elektronisk resurs] : b åren 1830-1865 / c utarb. av Hjalmar Linnström
250 a 2. uppl.
260 a Uppsala : b Bokgillet ; a [Stockholm] : b [Seelig], c 1961 ; e (Stockholm : f Temotryck [fotolito] ; Nike-tryck [offsettryck])
300 a 2 vol.
500 a Faks. av 2. uppl.
533 a Fritt tillgängliga via Projekt Runeberg
599 a Huvudpost (flerbandsverk)
856 4 0 u http://runeberg.org/linnstrom/
976 2 a Aa-c.01 b Bibliografi Sverige Nationalbibliografier
841 5 Gdig a x a b 061016|| |||||001||||||000000 e 1
852 5 Gdig b Gdig x Fri e-bok
It takes less than five minutes to create an e-record by reusing a
p-record and adding the fields necessary to transform it into an
e-record; a rough sketch of the idea follows.
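To make that concrete, here is a minimal sketch of the transformation,
assuming the pymarc library (version 5.x); the file names are my own
illustration, not part of our actual workflow, and the field values are
copied from the Linnström record above:

    # A minimal sketch, assuming pymarc 5.x; file names are illustrative,
    # and the field values are copied from the Linnström record above.
    from pymarc import MARCReader, Field, Subfield

    with open("print_record.mrc", "rb") as fh:   # the reused p-record
        record = next(MARCReader(fh))

    # 007: "cr" = electronic resource, remote access (cf. the 007 above)
    record.add_field(Field(tag="007", data="cr aa ---aaaaa"))

    # 245 $h: the general material designation for an electronic resource
    # (appended at the end here for brevity; cataloguing rules place it
    # directly after $a)
    record["245"].add_subfield("h", "[Elektronisk resurs]")

    # 533: note that the e-book is freely available via Project Runeberg
    record.add_field(Field(
        tag="533",
        indicators=[" ", " "],
        subfields=[Subfield(code="a", value="Fritt tillgängliga via Projekt Runeberg")],
    ))

    # 856 40: the link to the e-book itself
    record.add_field(Field(
        tag="856",
        indicators=["4", "0"],
        subfields=[Subfield(code="u", value="http://runeberg.org/linnstrom/")],
    ))

    with open("e_record.mrc", "wb") as out:      # the finished e-record
        out.write(record.as_marc())

In practice the 008 (form of item) and the local holdings fields (841/852
above) would also need adjusting, but those are routine once the pattern
is set.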
Up to today I have by myself collected more than 17,000 e-books. E-books
that suit my clients and my library, and without any compromise on quality.
I would expect that from one million Google Books I would collect less
than ten percent; the rest is just of bad quality or bad relevance and
should stay hidden in American stacks.
But on the other hand, much should be imported automatically, like
official documents, not just Swedish but also from the other Scandinavian
countries, and from international organizations like the UN, the EU,
Human Rights Watch, eScholarship Editions, and the National Academies
Press.
The problem is how to do that; a rough sketch of one possible approach
follows this paragraph. The second problem is how to raise interest in
free quality e-books, because my colleagues at other libraries are not
interested. I can do about 10,000 per year, and if ten of us were working
on this, we could collect 100,000 titles per year. That seems OK, because
there is more crap than quality out there in cyberspace. So what is the
point of mechanically harvesting GBS URLs if most of it is of no value?
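On the "how to do that": many document providers expose their metadata
over OAI-PMH, so a first pass at automatic import could harvest records
like this. A minimal sketch, using only the Python standard library; the
endpoint URL is a placeholder, and whether a given organization actually
offers OAI-PMH is an assumption on my part:

    # A minimal sketch; the endpoint is a placeholder, and I am assuming
    # the provider supports OAI-PMH at all.
    import urllib.request
    import xml.etree.ElementTree as ET

    BASE_URL = "http://example.org/oai"  # hypothetical OAI-PMH endpoint
    OAI = "{http://www.openarchives.org/OAI/2.0/}"
    DC = "{http://purl.org/dc/elements/1.1/}"

    url = BASE_URL + "?verb=ListRecords&metadataPrefix=oai_dc"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)

    # Print title and identifier for each harvested record; a real
    # harvester would also follow resumptionToken elements to page
    # through the full set.
    for rec in tree.iter(OAI + "record"):
        title = rec.find(".//" + DC + "title")
        ident = rec.find(".//" + DC + "identifier")
        print(
            title.text if title is not None else "(no title)",
            "-",
            ident.text if ident is not None else "(no identifier)",
        )

Each harvested record could then be turned into a MARC e-record with the
same pattern as the pymarc sketch above.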
Please correct me if I am wrong.
Jan
Tim Spalding wrote:
>> Tim Spalding at LibraryThing developed a bookmarklet to harvest GBS
>> URLs.[5] The bookmarklet collected over 253,000 URLs in about a day
>> before Tim suspended the effort.[6]
>>
>
> We've had some conversations with Google that suggest a better
> solution is on the way, and one that others could use, not just LT. If
> it's not, we'll go back to harvesting them and wait for, er, someone
> else to suspend us.
>
> Tim
>
--
Jan Szczepanski
Förste bibliotekarie (Senior Librarian)
Goteborgs universitetsbibliotek
Box 222
SE 405 30 Goteborg, SWEDEN
Tel: +46 31 773 1164 Fax: +46 31 163797
E-mail: Jan.Szczepanski_at_ub.gu.se