[Koha-devel] The future of Koha (some ideas and thoughts)

Thomas D koha at alinto.com
Mon Sep 5 21:57:03 CEST 2005


The call-number-related columns in the biblioitems table, such as
biblioitems.dewey, would not be needed in Koha provided that the templates
are changed to use items.itemcallnumber instead.
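As a minimal sketch of that template change (the column names follow the Koha schema, but the helper function and the row data are invented for illustration), a display routine that reads items.itemcallnumber needs nothing from biblioitems:

```python
# Sketch: render a call number from the items table only, so the
# biblioitems call-number columns (e.g. biblioitems.dewey) could be
# dropped. display_callnumber and the row data are hypothetical.

def display_callnumber(item_row):
    """Return the shelving call number for an item, falling back
    to a placeholder when none has been assigned."""
    return item_row.get("itemcallnumber") or "[no call number]"

item = {"itemnumber": 1, "itemcallnumber": "025.04 KOH"}
print(display_callnumber(item))               # 025.04 KOH
print(display_callnumber({"itemnumber": 2}))  # [no call number]
```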

Biblioitems.itemtype is a flexible field with multiple uses in non-MARC
Koha.  Unfortunately, the one-to-one relationship between the biblio,
biblioitems, and items tables required for MARC Koha has undermined that
flexibility.  Circulation rules are usually tied to an itemtype based on
media format and binding.  Non-MARC Koha allows multiple biblioitems for
each item, so an item can switch itemtypes as needed.  MARC Koha provides
no interface for changing the itemtype of a single item to change the
circulation rules for that item alone, without also changing the
circulation rules for every other item of the same itemtype.

Both MARC and non-MARC Koha need a more flexible approach to these issues. 
Circulation rules need to be set per item, in the items table.  Separate
columns are needed to specify format, binding, audience, and whatever else
biblioitems.itemtype has been used for in Koha.  A means of grouping this
information, by assigning defaults based on other items for the same
biblio, could preserve the advantage of grouping by biblioitems.itemtype.
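The default-grouping idea could be sketched like this (the descriptive column names "format", "binding", and "audience" are hypothetical, not the Koha schema; only biblionumber mirrors an existing column): when a new item is added, any descriptive field left unset is copied from an existing item on the same biblio.

```python
# Sketch of default inheritance: unset descriptive columns on a new item
# are copied from a sibling item of the same biblio.  Column names
# "format", "binding", "audience" are hypothetical.

DESCRIPTIVE_COLUMNS = ("format", "binding", "audience")

def apply_sibling_defaults(new_item, existing_items):
    """Fill unset descriptive fields from the first sibling item
    that shares the new item's biblionumber."""
    siblings = [i for i in existing_items
                if i["biblionumber"] == new_item["biblionumber"]]
    for col in DESCRIPTIVE_COLUMNS:
        if new_item.get(col) is None and siblings:
            new_item[col] = siblings[0].get(col)
    return new_item

items = [{"biblionumber": 42, "format": "book", "binding": "hardcover",
          "audience": "adult"}]
new = apply_sibling_defaults({"biblionumber": 42,
                              "binding": "paperback"}, items)
print(new)  # format and audience inherited; binding kept as given
```

Explicitly set values always win, so an individual item can still diverge from its siblings.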

There should be no need to tie a circulation rule to a media format or
binding.  Tying circulation rules in that way is a deceptive convenience
that leads to an unnecessarily inflexible data model.
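Decoupling circulation rules from itemtype could be sketched as a two-level lookup (the per-item "circrule" column and the rule names are invented for illustration; only itemtype mirrors an existing Koha column): a per-item rule, when present, overrides the itemtype default.

```python
# Sketch: resolve a circulation rule with a per-item override.
# "circrule" and the rule names are hypothetical.

ITEMTYPE_RULES = {"BOOK": "21-day loan", "DVD": "7-day loan"}

def circulation_rule(item):
    """Per-item rule wins; otherwise fall back to the itemtype default."""
    return item.get("circrule") or ITEMTYPE_RULES.get(item["itemtype"],
                                                      "default loan")

print(circulation_rule({"itemtype": "BOOK"}))                         # 21-day loan
print(circulation_rule({"itemtype": "BOOK", "circrule": "no loan"}))  # no loan
```

With a lookup like this, one item's rules can be changed without touching any other item of the same itemtype.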


Thomas D

Quoting Stephen Hedges <shedges at skemotah.com> :
> ---------------- Beginning of the original message ------------------
> 
> Thomas D said:
> > What holdings data does NPL map to the biblioitems table?
> >
> >
> > Thomas D
> 
> 942 $c (item type) and 942 $k (dewey) -- here's the complete
> biblioitems
> mapping:
> 
> biblioitemnumber --> 090 $d  (Koha biblioitemnumber)
> biblionumber
> volume --> 440 $v  (Volume number/sequential designation)
> number --> 440 $n  (Number of part/section of a work)
> classification --> 942 $k  (dewey)
> itemtype --> 942 $c  (item type)
> isbn --> 020 $a  (International Standard Book Number)
> issn --> 022 $a  (International Standard Serial Number)
> dewey
> subclass
> publicationyear
> publishercode --> 260 $b  (Name of publisher, distributor,
> etc)
> volumedate
> volumeddesc
> timestamp
> illus --> 300 $b  (Other physical details)
> pages --> 300 $a  (Extent)
> notes
> size --> 300 $c  (Dimensions)
> place --> 260 $a  (Place of publication, distribution, etc)
> url --> 856 $u  (Uniform Resource Identifier)
> lccn --> 010 $a  (LC control number)
> marc
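[Editor's note: the quoted mapping above could be captured as a plain lookup table; this is a sketch, with columns that have no MARC tag in the list mapped to None.]

```python
# The quoted NPL biblioitems-to-MARC mapping as a lookup table.
# Columns with no MARC tag in the list map to None.
BIBLIOITEMS_TO_MARC = {
    "biblioitemnumber": ("090", "d"),
    "biblionumber": None,
    "volume": ("440", "v"),
    "number": ("440", "n"),
    "classification": ("942", "k"),
    "itemtype": ("942", "c"),
    "isbn": ("020", "a"),
    "issn": ("022", "a"),
    "dewey": None,
    "subclass": None,
    "publicationyear": None,
    "publishercode": ("260", "b"),
    "volumedate": None,
    "volumeddesc": None,
    "timestamp": None,
    "illus": ("300", "b"),
    "pages": ("300", "a"),
    "notes": None,
    "size": ("300", "c"),
    "place": ("260", "a"),
    "url": ("856", "u"),
    "lccn": ("010", "a"),
    "marc": None,
}

def marc_source(column):
    """Return 'TAG$subfield' for a mapped column, or None."""
    pair = BIBLIOITEMS_TO_MARC.get(column)
    return f"{pair[0]}${pair[1]}" if pair else None

print(marc_source("itemtype"))  # 942$c
```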
> 
> >
> > Quoting Stephen Hedges <shedges at skemotah.com> :
> >> ---------------- Beginning of the original message
> ------------------
> >>
> >> Thomas D said:
> >> > What alterations were done for the customised version of
> >> bulkmarcimport.pl
> >> > at NPL?
> >>
> >> Some code was added to the beginning to re-write the MARC
> >> records (using
> >> MARC::Record) to split holdings information into two MARC
> tags
> >> instead of
> >> the normal single tag, so that some of the holdings data
> could
> >> be mapped
> >> to biblioitems and the rest to items.
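[Editor's note: the pre-import split described above might look like the following sketch. The message does not name the actual tags NPL used, so every tag number and the subfield split here are hypothetical.]

```python
# Sketch of the pre-import rewrite described above: holdings subfields in
# a single tag are split into two tags, so one can map to biblioitems and
# the other to items.  All tag numbers and the subfield split are
# hypothetical; a record is modeled as {tag: {subfield_code: value}}.

BIBLIOITEMS_SUBFIELDS = {"c", "k"}  # hypothetical: itemtype, dewey

def split_holdings(record, source_tag="952",
                   biblio_tag="942", item_tag="952"):
    """Move biblioitems-level subfields out of the holdings tag
    into a separate tag."""
    holdings = record.pop(source_tag, {})
    record[biblio_tag] = {c: v for c, v in holdings.items()
                          if c in BIBLIOITEMS_SUBFIELDS}
    record[item_tag] = {c: v for c, v in holdings.items()
                        if c not in BIBLIOITEMS_SUBFIELDS}
    return record

rec = split_holdings({"952": {"c": "BOOK", "k": "025.04",
                              "p": "barcode1"}})
print(rec)  # {'942': {'c': 'BOOK', 'k': '025.04'}, '952': {'p': 'barcode1'}}
```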
> >>
> >> Stephen
> >>
> >> >
> >> >
> >> > Thomas D
> >> >
> >> > Quoting Stephen Hedges <shedges at skemotah.com> :
> >> >> ---------------- Beginning of the original message
> >> ------------------
> >> >>
> >> >> I'd like to reinforce Thomas' point about
> "disintegrating
> >> >> integrated
> >> >> library systems" (and also correct a small error).
> >> >>
> >> >> NPL now uses BookWhere instead of ITS MARC for
> cataloging
> >> >> materials --
> >> >> which actually reinforces Thomas' point, because NPL was
> >> able
> >> >> to switch
> >> >> from one cataloging application to another without
> making
> >> >> changes to Koha.
> >> >>  This was possible because both cataloging applications
> >> >> generate MARC
> >> >> records in a standard format (iso2709), and Koha can
> import
> >> >> those
> >> >> standardized records.  However, Koha is only able to do
> the
> >> >> import after
> >> >> the records have been altered by a customized version of
> >> >> bulkmarcimport.pl.
> >> >>
> >> >> I believe I remember that Paul has already discussed the
> >> >> notion of
> >> >> rewriting the Koha cataloging code so that it generates
> >> files
> >> >> of records
> >> >> for import in batches, instead of adding each record as
> it
> >> is
> >> >> created
> >> >> (which is very slow).  I think it would be wise to aim
> for
> >> >> doing this
> >> >> rewrite in such a way that it also makes it easy for a
> >> library
> >> >> to use
> >> >> their own cataloging utility.  In other words, I suggest
> >> that
> >> >> we start the
> >> >> disintegration process with the cataloging module,
> making
> >> it a
> >> >> "stand-alone" application that _can_ be used with Koha,
> but
> >> >> may also be
> >> >> replaced with another cataloging application of the
> >> library's
> >> >> choosing.
> >> >>
> >> >> Stephen
> >
> > [snip]
> >
> >>
> >>
> >> --
> >> Stephen Hedges
> >> Skemotah Solutions, USA
> >> www.skemotah.com  --  shedges at skemotah.com
> >>
> >>
> >> ------------------- End of the original message
> ---------------------
> >
> >
> >
> >
> > ---------------------------------------------
> > Protect your mails from viruses thanks to Alinto Premium
> services
> > http://www.alinto.com
> >
> 
> 
> -- 
> Stephen Hedges
> Skemotah Solutions, USA
> www.skemotah.com  --  shedges at skemotah.com
> 
> 
> ------------------- End of the original message ---------------------



